Financial Analyst MCP Server
Enables interactive stock market analysis using a local Deepseek-R1 model served through Ollama. Supports querying stock data, generating visualizations, and performing financial analysis via natural language commands in Cursor IDE or the terminal.
📈 MCP-Powered Financial Analyst with Local Language Model Support
This project implements a financial analysis agent powered by a locally hosted language model (LLM) such as Deepseek-R1. It allows you to interactively query and analyze stock market data, with results displayed either:
- Inside Cursor IDE via MCP, or
- Directly in your terminal using a standalone Python MCP client.
Features:
- Multi‑agent orchestration
- Local LLM inference with Ollama and Deepseek‑R1
- MCP integration for conversational analysis
- Data fetching and visualization using yfinance, pandas, and matplotlib (see the sketch after the technology list below)
Technologies Used:
- Ollama – serves the Deepseek-R1 model locally
- yfinance – stock market data
- pandas – data processing
- matplotlib – chart rendering
- MCP – conversational commands (Cursor IDE or terminal)
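To give a rough idea of how this stack fits together, here is a minimal sketch that fetches prices with yfinance, computes a simple pandas summary, and renders a chart with matplotlib. The function name, ticker, and timeframe are illustrative choices, not code taken from server.py.

# Illustrative sketch of the data-fetching/plotting stack (not the actual server.py code)
import yfinance as yf
import matplotlib.pyplot as plt

def plot_stock(symbol: str = "TSLA", period: str = "3mo") -> None:
    # Download historical prices as a pandas DataFrame
    data = yf.Ticker(symbol).history(period=period)

    # Simple pandas-based summary of the closing prices
    change = (data["Close"].iloc[-1] / data["Close"].iloc[0] - 1) * 100
    print(f"{symbol}: {change:+.2f}% over {period}")

    # Render the closing-price chart with matplotlib
    data["Close"].plot(title=f"{symbol} closing price ({period})")
    plt.xlabel("Date")
    plt.ylabel("Price (USD)")
    plt.show()

if __name__ == "__main__":
    plot_stock()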
Setup and Installation
Install Ollama
# Setting up Ollama on Linux
curl -fsSL https://ollama.com/install.sh | sh
# Pull the Deepseek-R1 model
ollama pull deepseek-r1
Install Dependencies
Ensure you have Python 3.12 or later installed.
You can use uv to directly install the required dependencies (recommended).
uv sync
Alternatively, use pip to install the following dependencies into your local environment.
pip install ollama mcp pydantic yfinance pandas matplotlib
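As an optional sanity check that your Python environment can reach the local Ollama daemon and that the Deepseek-R1 model has been pulled, you can run a short script like the one below (the exact model tag may differ depending on what you pulled; this is an assumption, not part of the project code):

# Optional check: confirm Ollama is reachable and deepseek-r1 responds
import ollama

response = ollama.chat(
    model="deepseek-r1",
    messages=[{"role": "user", "content": "Reply with OK if you can read this."}],
)
print(response["message"]["content"])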
Running the project
You have two options for running the MCP server.
Option A: Using Cursor IDE MCP
Set up your MCP server as follows:
- Go to Cursor settings
- Select MCP
- Add a new global MCP server.
In the JSON file, add this:
{
  "mcpServers": {
    "financial-analyst": {
      "command": "uv",
      "args": [
        "--directory",
        "absolute/path/to/project_root",
        "run",
        "server.py"
      ]
    }
  }
}
You should now be able to see the MCP server listed in the MCP settings.
In the Cursor MCP settings, make sure to toggle the button that connects the server to the host. Done! Your server is now up and running.
You can now chat with Cursor and analyze stock market data. Simply provide the stock symbol and timeframe you want to analyze, and watch the magic unfold.
Option B: Using Python MCP (standalone terminal)
If you prefer not to use Cursor, simply run the server directly in your terminal:
python server.py
You will see a prompt asking for:
- Stock symbol
- Timeframe (e.g., 3mo, 1y)
Charts will be displayed in a window, and summaries printed in the terminal.
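Conceptually, the terminal summaries come from the local model: the fetched price data is condensed and handed to Deepseek-R1 through the ollama client. The sketch below shows one possible shape of that flow; the function name and prompt wording are illustrative assumptions, not the exact code in server.py.

# Illustrative sketch of how a terminal summary could be produced
import ollama
import yfinance as yf

def summarize(symbol: str, period: str) -> str:
    data = yf.Ticker(symbol).history(period=period)
    stats = data["Close"].describe().to_string()  # condensed view of the price series

    prompt = (
        f"You are a financial analyst. Here are closing-price statistics for "
        f"{symbol} over {period}:\n{stats}\n"
        "Give a brief, plain-English summary of the trend."
    )
    response = ollama.chat(model="deepseek-r1", messages=[{"role": "user", "content": prompt}])
    return response["message"]["content"]

print(summarize("TSLA", "3mo"))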
Example queries:
- "Show me Tesla's stock performance over the last 3 months"
- "Compare Apple and Microsoft stocks for the past year"
- "Analyze the trading volume of Amazon stock for the last month"
The agent will fetch data, run analysis, and render tables or charts inline.
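Under the Cursor option, server.py exposes this kind of analysis as MCP tools the IDE can call. The sketch below shows one way such a tool could be registered with the MCP Python SDK's FastMCP helper; the tool name and body are assumptions for illustration, not the project's actual implementation.

# Hypothetical example of exposing an analysis tool over MCP with FastMCP
import yfinance as yf
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("financial-analyst")

@mcp.tool()
def analyze_stock(symbol: str, period: str = "3mo") -> str:
    """Return a short performance summary for a stock symbol over a period."""
    data = yf.Ticker(symbol).history(period=period)
    change = (data["Close"].iloc[-1] / data["Close"].iloc[0] - 1) * 100
    return f"{symbol} moved {change:+.2f}% over {period} (last close: {data['Close'].iloc[-1]:.2f})"

if __name__ == "__main__":
    mcp.run(transport="stdio")  # Cursor launches this via the uv command in the config above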
Project Structure Overview
project_root/
├── server.py # Main entry point for the agent
├── requirements.txt # Optional, or use uv's pyproject.toml
└── README.md # You are reading this!
Tips
- Troubleshooting Ollama: Make sure the Ollama daemon is running (ollama serve) before you start the server.
- Customizing the prompt: You can easily edit server.py to adjust the prompt format or add more data processing.
Contribution
Contributions are welcome! Please fork the repository and submit a pull request with your improvements.