# RAG MCP Server
Enables Claude to perform retrieval-augmented generation using LangChain, ChromaDB, and HuggingFace models for domain-aware reasoning with PDF embedding, smart retrieval, reranking, and citation-based responses.
<a id = "top"></a>
TL;DR:
- This project implements a Retrieval-Augmented Generation (RAG) MCP Server using LangChain wrappers for ChromaDB and Hugging Face models.
- Designed for seamless integration with Claude Desktop and Cursor IDE as the MCP client.
- Uses a single persistent Chroma vector database with multiple collections (domains).
- Automatically retrieves and ranks the most relevant context for Claude, enabling domain-aware reasoning and citation-based responses.
- <a href = "#project-overview">Project Overview</a>
  - Workflow
- <a href = "#features">Features</a>
- <a href = "#getting-started">Getting Started</a>
  - Prerequisites
  - Installation
- <a href = "#integrations">Integrations</a>
  - Claude Desktop Integration
  - Cursor IDE Integration
- <a href = "#mcp-inspector">MCP Inspector</a>
- <a href = "#tools">Available Tools</a>
- <a href = "#project-structure">Project Structure</a>
- <a href = "#references">References</a>
- <a href = "#license">License</a>
<a id = "project-overview"></a>
## Project Overview
This project implements a LangChain-powered Retrieval-Augmented Generation (RAG) pipeline hosted as a FastMCP server for integration with Claude Desktop and Cursor IDE.
It uses:
- `langchain_chroma.Chroma` for persistent, domain-based vector stores.
- `langchain_huggingface.HuggingFaceEmbeddings` for local or HuggingFace embedding models.
- `langchain_community.cross_encoders.HuggingFaceCrossEncoder` for local or HuggingFace reranking models for better relevance ranking.
- `FastMCP` — a lightweight Python interface (built on FastAPI) that exposes LangChain-based retrieval tools to any MCP client such as Claude Desktop or Cursor IDE.
Each Chroma collection represents a distinct knowledge domain or document. Claude queries are routed to the appropriate collection, which retrieves top-k results and returns relevant context and citations.
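The routing-and-retrieval step can be pictured with a minimal pure-Python stand-in for the vector store (the real server uses `langchain_chroma.Chroma`; the collection names and vectors below are made up for illustration):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "collections": one vector store per knowledge domain.
collections = {
    "physics": {"doc1": [0.9, 0.1], "doc2": [0.8, 0.3]},
    "history": {"doc3": [0.1, 0.9]},
}

def retrieve(collection, query_vec, k=2):
    # Rank documents in one collection by similarity to the query
    # and keep the top-k, mirroring the per-collection routing above.
    store = collections[collection]
    ranked = sorted(store, key=lambda d: cosine(store[d], query_vec), reverse=True)
    return ranked[:k]

print(retrieve("physics", [1.0, 0.0], k=1))  # → ['doc1']
```

The real store also carries per-chunk metadata (page number, source path), which is what the citation step consumes.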
### ⚡ Workflow

```mermaid
flowchart TD
    Claude[Claude Desktop]
    MCP[MCP Server: FastMCP + LangChain]
    LangChain[LangChain Wrappers → ChromaDB + HuggingFace]
    Claude --> MCP --> LangChain --> Claude
```
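Before any retrieval happens, documents are split into overlapping chunks and embedded. A toy splitter sketching the idea (the chunk size and overlap values are illustrative; the actual pipeline uses LangChain's loaders and splitters):

```python
def split_text(text, chunk_size=20, overlap=5):
    # Slide a fixed-size window over the text with some overlap,
    # so context is not lost at chunk boundaries.
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks

chunks = split_text("The quick brown fox jumps over the lazy dog",
                    chunk_size=20, overlap=5)
print(len(chunks))  # → 3
```

Each chunk would then be embedded and stored in the target collection along with its metadata.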
<div align="right"> <a href="#top"><kbd> <br> 🡅 Back to Top <br> </kbd></a> </div>
<a id = "features"></a>
## Features
- PDF Embedding: Add PDFs locally or via URL directly into a chosen collection.
- Smart Retrieval: Retrieve context chunks per collection or across multiple collections.
- Reranking Support: Uses a HuggingFace cross-encoder reranker for better document relevance.
- Document Management: List, rename, and inspect metadata for locally stored documents.
- Collection Management: Create, list, and delete ChromaDB collections dynamically.
- Citation Provider: Citations are generated from document metadata (e.g., page numbers, source document and path, etc.).
- Self-Describing Tools: `describeTools()` lists all available MCP tools dynamically for introspection.
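The reranking step re-scores each (query, document) pair and keeps only the strongest matches. A sketch with a stand-in scoring function (`toy_score` is a hypothetical placeholder; the real server uses `langchain_community.cross_encoders.HuggingFaceCrossEncoder` for this):

```python
def toy_score(query, doc):
    # Hypothetical stand-in for a cross-encoder: score by word overlap.
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / max(len(q), 1)

def rerank(query, docs, top_n=2):
    # Re-score every (query, doc) pair and keep the best top_n.
    scored = sorted(docs, key=lambda d: toy_score(query, d), reverse=True)
    return scored[:top_n]

docs = ["vector databases store embeddings",
        "the weather is nice today",
        "embeddings power vector search"]
print(rerank("vector embeddings", docs, top_n=1))
```

A real cross-encoder reads the query and document jointly, which is why it ranks relevance better than the embedding similarity used for the initial retrieval.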
<div align="right"> <a href="#top"><kbd> <br> 🡅 Back to Top <br> </kbd></a> </div>
<a id = "tools"></a>
## Available Tools
This MCP server exposes a set of tools that can be invoked by an MCP client to perform document and collection operations — including embedding, retrieval, metadata management, and citation generation.
For a full list of available tools, their arguments, and example usage, see the dedicated documentation:
[View All Tools → TOOLS.md](TOOLS.md)
<div align="right"> <a href="#top"><kbd> <br> 🡅 Back to Top <br> </kbd></a> </div>
<a id = "getting-started"></a>
## Getting Started

### 🔧 Prerequisites
> [!IMPORTANT]
### ⚙️ Installation
- Create and Activate a Conda Environment

  ```shell
  conda create -n MCP python=3.11.13 -y
  conda activate MCP
  ```

- Clone the Repository

  ```shell
  git clone https://github.com/NSANTRA/RAG-MCP-Server.git
  cd RAG-MCP-Server
  ```

- Install Dependencies

  ```shell
  pip install -r requirements.txt
  ```
- Configure `.env`

  ```shell
  # If your system has the Nvidia CUDA Toolkit set up, set the device to "cuda"; otherwise set it to "cpu"
  DEVICE = "cuda"
  DOCUMENT_DIR = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Documents"
  CHROMA_DB_PERSIST_DIR = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Databases"
  EMBEDDING_MODEL = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Models/MiniLM"
  RERANKER_MODEL = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Models/MiniLM-Reranker"
  ```
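A sketch of how `Config.py` might read these variables (assuming plain `os.environ` lookups with illustrative fallbacks; the real module may load `.env` via `python-dotenv` instead):

```python
import os

def load_config():
    # Read settings from the environment, falling back to safe defaults.
    return {
        "device": os.environ.get("DEVICE", "cpu"),
        "document_dir": os.environ.get("DOCUMENT_DIR", "./Documents"),
        "persist_dir": os.environ.get("CHROMA_DB_PERSIST_DIR", "./Databases"),
        "embedding_model": os.environ.get("EMBEDDING_MODEL", ""),
        "reranker_model": os.environ.get("RERANKER_MODEL", ""),
    }

os.environ["DEVICE"] = "cuda"
print(load_config()["device"])  # → cuda
```

Defaulting to `"cpu"` keeps the server runnable on machines without a CUDA setup.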
> [!CAUTION]
> Use absolute paths wherever a path is required.
> [!TIP]
> - The configuration above uses locally downloaded models. You can download them using the `Download Model.py` script; change the models if needed.
> - You can swap the embedding or reranker paths for any HuggingFace model.
<div align="right"> <a href="#top"><kbd> <br> 🡅 Back to Top <br> </kbd></a> </div>
<a id = "integrations"></a>
## Integrations
> [!IMPORTANT]
> You need to install the Claude Desktop app or Cursor IDE to run the MCP server, since it requires an MCP client.

These MCP clients automatically launch the RAG MCP Server once it is registered in the MCP configuration file; you do not need to run the Python script manually.
### Claude Desktop Integration

#### 🛠️ Setup Instructions
- Add the following entry to your Claude MCP configuration file (typically located in your Claude Desktop settings folder). You can open it via Settings → Developer → Edit Config.
- Then add the following JSON config:
```json
{
    "mcpServers": {
        "RAG": {
            "command": "C:/Users/<yourusername>/anaconda3/envs/MCP/python.exe",
            "args": ["<absolute path to Main.py>"],
            "options": {
                "cwd": "<absolute path to the project root directory>"
            }
        }
    }
}
```
⚠️ Common Issue: If Claude fails to start the MCP server, ensure that:
- The Python path points to your Conda environment's executable.
- `Main.py` has no syntax errors and all dependencies are installed.
- The `cwd` option matches your project root directory.
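A quick way to sanity-check the configuration file before restarting Claude is to parse it and confirm the required keys exist. This is a hypothetical helper using Python's `json` module; the `command` and `args` values below are placeholders:

```python
import json

CONFIG = """
{
  "mcpServers": {
    "RAG": {
      "command": "python",
      "args": ["Main.py"],
      "options": {"cwd": "."}
    }
  }
}
"""

def check_mcp_config(text):
    # Parse the config and confirm the required keys exist.
    cfg = json.loads(text)  # raises ValueError on malformed JSON
    server = cfg["mcpServers"]["RAG"]
    assert "command" in server and "args" in server, "missing command/args"
    return server["command"], server["args"]

print(check_mcp_config(CONFIG))  # → ('python', ['Main.py'])
```

A stray trailing comma or unescaped backslash in a Windows path is the most common reason the client silently fails to launch the server.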
### Cursor IDE Integration

#### 🛠️ Setup Instructions
- Open your project in Cursor IDE and go to File → Preferences → Cursor Settings → Tools & MCP → New MCP Server to open your MCP configuration file.
- Add the following JSON entry under the "mcpServers" section (adjusting paths as needed):
```json
{
    "mcpServers": {
        "RAG": {
            "command": "C:/Users/<yourusername>/anaconda3/envs/MCP/python.exe",
            "args": ["<absolute path to Main.py>"],
            "options": {
                "cwd": "<absolute path to the project root directory>"
            }
        }
    }
}
```
<div align="right"> <a href="#top"><kbd> <br> 🡅 Back to Top <br> </kbd></a> </div>
<a id = "mcp-inspector"></a>
## MCP Inspector
> [!TIP]
> MCP Inspector is an official developer tool from Anthropic that lets you test, debug, and inspect any Model Context Protocol (MCP) server — including custom RAG MCP servers — without requiring Claude Desktop or Cursor IDE.
> [!IMPORTANT]
> - To use MCP Inspector, you must have Node.js installed.
> - During installation, enable "Add to PATH."
> - Verify your installation with `node -v`, `npm -v`, and `npx -v`.
### What It Does
- Lets you call tools interactively and see raw JSON input/output.
- Displays system logs, server metadata, and protocol messages.
- Ideal for testing new tool definitions or debugging retrieval workflows.
### Installation
You can install MCP Inspector globally using npm:

```shell
npm install -g @modelcontextprotocol/inspector
```

Or run it directly with npx (no install needed):

```shell
npx @modelcontextprotocol/inspector
```
### Usage
- Navigate to your project root directory where `Main.py` is located.
- Launch your MCP server via the Inspector:

  ```shell
  npx @modelcontextprotocol/inspector python Main.py
  ```

  > [!TIP]
  > If you are using a Conda environment, replace `python` with its full path, or activate the environment first and run the command as-is.

- The Inspector will open a local web interface (usually at http://localhost:6274) showing:
  - Input/output schemas
  - Real-time logs and response traces
<div align="right"> <a href="#top"><kbd> <br> 🡅 Back to Top <br> </kbd></a> </div>
<a id = "project-structure"></a>
## Project Structure
```text
├── Main.py               # Entry point - starts the FastMCP server
│
├── Modules/
│   ├── Config.py         # Loads env vars, sets up embeddings & reranker
│   ├── Core.py           # Document-level utilities (metadata, citation, rename)
│   ├── Database.py       # ChromaDB logic for embedding/retrieval
│   ├── Utils.py          # Helper functions (file ops, reranking)
│   └── ToolDefinition.py # MCP tool manifests and argument schemas
│
├── .env                  # Environment configuration
├── requirements.txt      # Dependencies
└── README.md
```
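The self-describing behavior of `describeTools()` can be sketched as a registry that introspects docstrings. This is a hypothetical simplification; the actual manifests and argument schemas live in `ToolDefinition.py`, and the tool names below are illustrative:

```python
TOOLS = {}

def tool(fn):
    # Register a function as an MCP tool under its own name.
    TOOLS[fn.__name__] = fn
    return fn

@tool
def add_pdf(path: str):
    """Embed a PDF into the active collection."""

@tool
def retrieve(query: str, k: int = 5):
    """Return the top-k context chunks for a query."""

def describe_tools():
    # Build a name → description listing from the registered tools,
    # so a client can discover capabilities at runtime.
    return {name: fn.__doc__ for name, fn in TOOLS.items()}

print(describe_tools())
```

Keeping descriptions next to the tool code means the listing can never drift out of sync with what the server actually exposes.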
<div align="right"> <a href="#top"><kbd> <br> 🡅 Back to Top <br> </kbd></a> </div>
<a id = "references"></a>
## References
- LangChain RAG Workflow <br> LangChain Documentation — RAG
- Chroma Vector Database <br> Chroma Docs
- HuggingFace Embeddings and Cross-Encoders <br> Sentence Transformers <br> Cross-Encoder Models
- Anthropic MCP & Claude Desktop <br> Model Context Protocol Official Site <br> Claude Desktop Overview
<div align="right"> <a href="#top"><kbd> <br> 🡅 Back to Top <br> </kbd></a> </div>
<a id = "license"></a>
## License
MIT License
Copyright (c) 2025 Neelotpal Santra
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
<div align="right"> <a href="#top"><kbd> <br> 🡅 Back to Top <br> </kbd></a> </div>