# 🧠 Advanced MCP Server Setup with uv, llama-index, ollama, and Cursor IDE
## ✅ Prerequisites

- [x] Python 3.10+ installed
- [x] uv (by Astral) installed globally (`pip install uv`)
- [x] Ollama installed and running locally
- [x] Cursor IDE installed
## 🛠 Step 1: Project Setup

### 1.1 Create a New Project Directory

```bash
uv init mcp-server
cd mcp-server
```

### 1.2 Create and Activate Virtual Environment

```bash
uv venv
.venv\Scripts\activate     # On Windows
# OR
source .venv/bin/activate  # On Linux/Mac
```
## 🔐 Step 2: Environment Configuration

Create a `.env` file in the root of your project and add your API key:

```env
LINKUP_API_KEY=your_api_key_here
```
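Your code can then read this key at runtime. `python-dotenv` is the usual choice (`uv add python-dotenv`), but as a sketch, a minimal stdlib loader for simple `KEY=value` files looks like this:

```python
from pathlib import Path

def load_env(path: str = ".env") -> dict[str, str]:
    """Parse simple KEY=value lines from a .env file (sketch, not a full parser)."""
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

# Usage: make the key visible to libraries that read os.environ, e.g.
#   for key, value in load_env().items():
#       os.environ.setdefault(key, value)
```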
## 📦 Step 3: Install Required Dependencies

Run these commands one by one inside your virtual environment (the quotes around `mcp[cli]` keep shells like zsh from expanding the brackets):

```bash
# Core MCP CLI and HTTP utilities
uv add "mcp[cli]" httpx

# Linkup SDK for orchestrating agents
uv add linkup-sdk

# LlamaIndex integrations
uv add llama-index
uv add llama-index-embeddings-huggingface
uv add llama-index-llms-ollama

# Optional: for using notebooks
uv add ipykernel
```
## 🧪 Step 4: Confirm Installation

After installation, `uv add` records the dependencies in your `pyproject.toml` under the standard `[project]` table, something like:

```toml
[project]
dependencies = [
    "mcp[cli]",
    "httpx",
    "linkup-sdk",
    "llama-index",
    "llama-index-embeddings-huggingface",
    "llama-index-llms-ollama",
    "ipykernel",
]
```

(Your entries will typically carry the version bounds uv resolved, e.g. `"httpx>=0.27"`.)
## ⚙️ Step 5: Create a Minimal Server Entry Point

Create a `server.py` file inside the project root. A minimal entry point using `FastMCP` from the `mcp` SDK:

```python
# server.py
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

if __name__ == "__main__":
    mcp.run()
```

You can later extend this with your own tools or an agent orchestrator script.
## 🧠 Step 6: Run Ollama Locally

Make sure Ollama is installed and running:

```bash
ollama run llama3.2   # or any model you want
```

This starts the LLM backend at `http://localhost:11434`.
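To confirm the backend is reachable from Python, a quick smoke test through the `llama-index-llms-ollama` integration installed above might look like this (a sketch; it assumes the Ollama daemon is running locally and the `llama3.2` model has been pulled, so it will fail with a connection error otherwise):

```python
# smoke_test.py -- requires a running Ollama daemon on localhost:11434
from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama3.2", request_timeout=120.0)
response = llm.complete("In one sentence, what is an MCP server?")
print(response)
```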
## 🖥️ Step 7: Configure MCP Server in Cursor IDE

### 7.1 Open Cursor Settings

- Open **Settings** → go to the MCP section.
- Click "Add New Global MCP Server".

### 7.2 Fill Out the Configuration

Replace the paths below with your actual machine paths. You can get the full path to `uv` by running:

```bash
where uv    # Windows
which uv    # Linux/Mac
```

Now add this to your Cursor IDE settings:
```json
{
  "mcpServers": {
    "weather": {
      "command": "C:\\Users\\SIDHYA\\AppData\\Roaming\\Python\\Python311\\Scripts\\uv.exe",
      "args": [
        "--directory",
        "C:\\Users\\SIDHYA\\Development\\Ai\\mcp-server",
        "run",
        "server.py"
      ]
    }
  }
}
```

Note: `command` must point to your own `uv` executable, and JSON does not allow inline `//` comments, so keep the file comment-free.
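Since a stray comment or missing key in this file fails silently in some setups, it can be worth sanity-checking the JSON before pasting it in. Below is a hypothetical stdlib-only helper (the `sample` paths are placeholders, not real ones):

```python
import json

def check_mcp_config(raw: str) -> list[str]:
    """Return a list of problems found in an mcpServers JSON string."""
    problems = []
    config = json.loads(raw)  # raises ValueError on invalid JSON, e.g. // comments
    servers = config.get("mcpServers", {})
    if not servers:
        problems.append("no entries under 'mcpServers'")
    for name, spec in servers.items():
        if "command" not in spec:
            problems.append(f"server '{name}' is missing 'command'")
        if not isinstance(spec.get("args", []), list):
            problems.append(f"server '{name}': 'args' must be a list")
    return problems

sample = '{"mcpServers": {"weather": {"command": "uv", "args": ["run", "server.py"]}}}'
print(check_mcp_config(sample))  # -> []
```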
## 🧪 Step 8: Test the Integration

- Open any `.py` file in Cursor.
- Use the MCP tools (usually accessible via `⌘K` or `Ctrl+K`) to run the "weather" MCP server.
- You should see the server spin up using your `server.py`.
## 📘 Suggested Directory Structure

```
mcp-server/
├── .env
├── pyproject.toml
├── server.py
└── rag.py
```
## 🔁 Keep Things Updated

To update dependencies in the active environment:

```bash
uv pip install --upgrade llama-index
uv pip install --upgrade linkup-sdk
```

Since the project's dependencies live in `pyproject.toml` (via `uv add` above), you can instead run `uv lock --upgrade-package llama-index` followed by `uv sync` to keep the lockfile in step.
## ✍️ Author

👋 Hey, I'm Asutosh Sidhya

### 🌐 Connect with Me

- 📧 Email: sidhyaasutosh@gmail.com
- 🧑‍💻 GitHub: @asutosh7
- 💼 LinkedIn: linkedin.com/in/asutosh-sidhya

If you're building something around AI agents, local LLMs, or automated RAG pipelines, I'd love to connect or collaborate!