Contextual MCP Server
A Model Context Protocol (MCP) server that provides RAG (Retrieval-Augmented Generation) capabilities using Contextual AI. This server integrates with a variety of MCP clients; this README shows integration with both Cursor IDE and Claude Desktop.
Overview
This MCP server acts as a bridge between AI interfaces (Cursor IDE or Claude Desktop) and a specialized Contextual AI agent. It enables:
- Query Processing: Direct your domain specific questions to a dedicated Contextual AI agent
- Intelligent Retrieval: Searches through comprehensive information in your knowledge base
- Context-Aware Responses: Generates answers that:
- Are grounded in source documentation
- Include citations and attributions
- Maintain conversation context
Integration Flow
Cursor/Claude Desktop → MCP Server → Contextual AI RAG Agent
        ↑                                        │
        └────────── Response with citations ─────┘
Prerequisites
- Python 3.10 or higher
- Cursor IDE and/or Claude Desktop
- Contextual AI API key
- MCP-compatible environment
Installation
- Clone the repository:
git clone https://github.com/ContextualAI/contextual-mcp-server.git
cd contextual-mcp-server
- Create and activate a virtual environment:
python -m venv .venv
source .venv/bin/activate # On Windows, use `.venv\Scripts\activate`
- Install dependencies:
pip install -e .
Configuration
Environment Variables
The server requires the following environment variables:
- API_KEY: Your Contextual AI API key
- AGENT_ID: Your Contextual AI agent ID
If you'd like to store these values in a .env file, you can specify them like so:
cat > .env << EOF
API_KEY=key...
AGENT_ID=...
EOF
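Both variables must be present before the server starts. A minimal sketch of how they could be validated at startup (load_config is a hypothetical helper, not part of this repository):

```python
import os

def load_config() -> dict:
    """Read the required Contextual AI settings from the environment.

    Raises a RuntimeError naming any missing variable, so a misconfigured
    server fails fast at startup rather than mid-query.
    """
    config = {}
    for name in ("API_KEY", "AGENT_ID"):
        value = os.getenv(name)
        if not value:
            raise RuntimeError(f"Missing required environment variable: {name}")
        config[name] = value
    return config
```
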
AI Interface Integration
This MCP server can be integrated with either Cursor IDE or Claude Desktop using the same configuration approach. Create or modify the MCP configuration file in the appropriate location:
- First, find the path to your uv installation:
UV_PATH=$(which uv)
echo $UV_PATH
# Example output: /Users/username/miniconda3/bin/uv
- Create the configuration file using the full path from step 1:
cat > mcp.json << EOF
{
  "mcpServers": {
    "ContextualAI-TechDocs": {
      "command": "$UV_PATH",
      "args": [
        "--directory",
        "\${workspaceFolder}",
        "run",
        "multi-agent/server.py"
      ]
    }
  }
}
EOF
Note: JSON does not allow comments, so the heredoc must contain none. Make sure "command" expands to the full uv path from step 1; \${workspaceFolder} will be replaced with your project path by the client.
- Move the file to the appropriate location (see below for options):
mkdir -p .cursor/
mv mcp.json .cursor/
Configuration locations:
- For Cursor:
  - Project-specific: .cursor/mcp.json in your project directory
  - Global: ~/.cursor/mcp.json for system-wide access
- For Claude Desktop:
  - Use the same configuration file format in the appropriate Claude Desktop configuration directory
Environment Setup
This project uses uv for dependency management, which provides faster and more reliable Python package installation.
Usage
The server provides Contextual AI RAG capabilities using the Python SDK and exposes commands accessible from MCP clients such as Cursor IDE and Claude Desktop. The current server focuses on the query command from the Contextual AI Python SDK, but you could extend it to support other features such as listing agents, updating retrieval settings, updating prompts, extracting retrievals, or downloading metrics.
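As a sketch of what the query tool does under the hood — assuming the Contextual AI Python SDK follows a client.agents.query.create(...) shape, with the client injected here so the helper can be exercised with a stub:

```python
from typing import Any

def query_agent(client: Any, agent_id: str, question: str) -> str:
    """Send a single user question to a Contextual AI agent and return
    the answer text.

    `client` is assumed to follow the SDK's shape
    (client.agents.query.create(...)); it is passed in rather than
    constructed so this helper stays easy to test.
    """
    response = client.agents.query.create(
        agent_id=agent_id,
        messages=[{"role": "user", "content": question}],
    )
    return response.message.content
```

In the real server this helper would be wrapped in a function decorated with @mcp.tool() so MCP clients can invoke it.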
Example Usage
# In Cursor, you might ask:
"Show me the code for initiating the RF345 microchip?"
# The MCP server will:
1. Route the query to the Contextual AI agent
2. Retrieve relevant documentation
3. Generate a response with specific citations
4. Return the formatted answer to Cursor
Key Benefits
- Accurate Responses: All answers are grounded in your documentation
- Source Attribution: Every response includes references to source documents
- Context Awareness: The system maintains conversation context for follow-up questions
- Real-time Updates: Responses reflect the latest documentation in your datastore
Development
Project Structure
contextual-mcp-server/
├── server.py # Main MCP server implementation
├── pyproject.toml # Project dependencies and metadata
└── README.md # Documentation
Modifying the Server
To add new capabilities:
- Add new tools by creating additional functions decorated with
@mcp.tool()
- Define the tool's parameters using Python type hints
- Provide a clear docstring describing the tool's functionality
Example:
@mcp.tool()
def new_tool(param: str) -> str:
    """Description of what the tool does"""
    result = f"Processed {param}"  # Implementation goes here
    return result
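The registration pattern can be illustrated without the SDK: a decorator collects tool functions into a registry keyed by name. This is a simplified stand-in for what @mcp.tool() does internally (the real decorator also derives the tool's input schema from the type hints and docstring):

```python
from typing import Callable, Dict

# Registry of tool name -> callable, a stand-in for the server's tool table.
TOOLS: Dict[str, Callable] = {}

def tool(func: Callable) -> Callable:
    """Register `func` as a callable tool, keyed by its function name."""
    TOOLS[func.__name__] = func
    return func

@tool
def summarize(text: str) -> str:
    """Return a crude one-line summary: the first line, truncated to 80 chars."""
    first_line = text.strip().splitlines()[0]
    return first_line[:80]
```
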
Technical Details
- Transport: stdio (local execution)
- Protocol: Model Context Protocol (MCP)
Limitations
- The server runs locally and may not work in remote development environments
- Tool responses are subject to Contextual AI API limits and quotas
- Currently only supports stdio transport mode
For all the capabilities of Contextual AI, please check the official documentation.