# 🔧 Advanced MCP Server Setup with uv, llama-index, ollama, and Cursor IDE
## ✅ Prerequisites

- [x] Python 3.10+ installed
- [x] uv (by Astral) installed globally (`pip install uv`)
- [x] Ollama installed and running locally
- [x] Cursor IDE installed
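If you want to verify the interpreter version from code rather than by eye, here is a minimal sketch (the `meets_requirement` helper is illustrative, not part of any SDK):

```python
# Quick sanity check for the Python 3.10+ requirement
import sys

def meets_requirement(version_info=sys.version_info):
    """Return True if the interpreter is Python 3.10 or newer."""
    # Tuple comparison: (3, 11, 0) >= (3, 10) is True
    return version_info >= (3, 10)
```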
## 📁 Step 1: Project Setup

### 1.1 Create a New Project Directory

```bash
uv init mcp-server
cd mcp-server
```

### 1.2 Create and Activate the Virtual Environment

```bash
uv venv
.venv\Scripts\activate       # On Windows
# OR
source .venv/bin/activate    # On Linux/Mac
```
## 🔐 Step 2: Environment Configuration

Create a `.env` file in the root of your project and add your API key:

```
LINKUP_API_KEY=your_api_key_here
```
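To confirm the key is visible to your server process, you can load it with the `python-dotenv` package, or with a minimal stdlib-only sketch like the one below (`parse_env` and `load_linkup_key` are illustrative helpers, and the parser ignores quoting rules):

```python
# load_env.py - minimal .env reader (a sketch; python-dotenv is the usual choice)
import os

def parse_env(text):
    """Parse KEY=value lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

def load_linkup_key(path=".env"):
    """Prefer the process environment, then fall back to the .env file."""
    if "LINKUP_API_KEY" in os.environ:
        return os.environ["LINKUP_API_KEY"]
    try:
        with open(path) as f:
            return parse_env(f.read()).get("LINKUP_API_KEY")
    except FileNotFoundError:
        return None
```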
## 📦 Step 3: Install Required Dependencies

Run these commands one by one inside your virtual environment (quote `mcp[cli]` so the brackets are not expanded by your shell):

```bash
# Core MCP CLI and HTTP utilities
uv add "mcp[cli]" httpx

# Linkup SDK for orchestrating agents
uv add linkup-sdk

# LlamaIndex integrations
uv add llama-index
uv add llama-index-embeddings-huggingface
uv add llama-index-llms-ollama

# Optional: for using notebooks
uv add ipykernel
```
## 🧪 Step 4: Confirm Installation

After installation, check your `pyproject.toml`: `uv add` records dependencies under the standard `[project]` table. It should contain something like this (exact version pins will vary):

```toml
[project]
dependencies = [
    "mcp[cli]",
    "httpx",
    "linkup-sdk",
    "llama-index",
    "llama-index-embeddings-huggingface",
    "llama-index-llms-ollama",
    "ipykernel",
]
```
## ⚙️ Step 5: Create a Minimal Server Entry Point

Create a `server.py` file inside the project root, using the `FastMCP` class from the MCP Python SDK:

```python
# server.py
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

if __name__ == "__main__":
    mcp.run()
```

You can later extend this with your own `FastMCP` tools or an agent-orchestrator script.
## 🧠 Step 6: Run Ollama Locally

Make sure Ollama is installed and running:

```bash
ollama run llama3.2   # or any model you want
```

This starts the LLM backend at `http://localhost:11434`.
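To check that the backend is actually reachable before wiring up Cursor, you can query Ollama's `/api/tags` endpoint, which lists the locally pulled models. A stdlib-only sketch (`ollama_models` is an illustrative helper, and the default port is assumed):

```python
# ollama_probe.py - list locally pulled models via Ollama's /api/tags endpoint
import json
import urllib.error
import urllib.request

def ollama_models(base_url="http://localhost:11434"):
    """Return locally available model names, or [] if Ollama isn't reachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError, ValueError):
        # Unreachable server or malformed response: treat as "no models"
        return []
```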
## 🖥️ Step 7: Configure MCP Server in Cursor IDE

### 7.1 Open Cursor Settings

1. Open **Settings** → go to the **MCP** section.
2. Click **"Add New Global MCP Server"**.

### 7.2 Fill Out the Configuration

Replace the paths below with your actual machine paths. You can get the full path to uv by running:

```bash
where uv    # Windows
which uv    # Linux/Mac
```

Now add this to your Cursor IDE settings (set `command` to your actual uv path):
```json
{
  "mcpServers": {
    "weather": {
      "command": "C:\\Users\\SIDHYA\\AppData\\Roaming\\Python\\Python311\\Scripts\\uv.exe",
      "args": [
        "--directory",
        "C:\\Users\\SIDHYA\\Development\\Ai\\mcp-server",
        "run",
        "server.py"
      ]
    }
  }
}
```
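Cursor expects strict JSON, so `//` comments and trailing commas will break the configuration. It can help to sanity-check the snippet before pasting it in; a small sketch with placeholder paths:

```python
# validate_config.py - catch JSON mistakes before Cursor does
import json

config_text = """
{
  "mcpServers": {
    "weather": {
      "command": "uv",
      "args": ["--directory", "/path/to/mcp-server", "run", "server.py"]
    }
  }
}
"""

# json.loads rejects // comments and trailing commas outright
config = json.loads(config_text)

for name, server in config["mcpServers"].items():
    assert "command" in server, f"{name}: missing command"
    assert isinstance(server.get("args"), list), f"{name}: args must be a list"
```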
## 🧪 Step 8: Test the Integration

1. Open any `.py` file in Cursor.
2. Use the MCP tools (usually accessible via `⌘K` or `Ctrl+K`) to run the "weather" MCP server.
3. You should see the server spin up using your `server.py`.
## 📂 Suggested Directory Structure

```
mcp-server/
├── .env
├── pyproject.toml
├── server.py
└── rag.py
```
## 🔄 Keep Things Updated

To upgrade dependencies in the active environment:

```bash
uv pip install --upgrade llama-index
uv pip install --upgrade linkup-sdk
```

Note that `uv pip install --upgrade` only updates the virtual environment; to record the upgrade in `uv.lock` as well, use `uv lock --upgrade-package <name>` followed by `uv sync`.
## ✍️ Author

👋 Hey, I'm Asutosh Sidhya.

### 🔗 Connect with Me

- 📧 Email: sidhyaasutosh@gmail.com
- 🧑‍💻 GitHub: @asutosh7
- 💼 LinkedIn: linkedin.com/in/asutosh-sidhya

If you're building something around AI agents, local LLMs, or automated RAG pipelines, I'd love to connect or collaborate!