
# MCP-RAG: Model Context Protocol with RAG 🚀
A powerful and efficient RAG (Retrieval-Augmented Generation) implementation using GroundX and OpenAI, built on the Model Context Protocol (MCP).
## 🌟 Features
- Advanced RAG Implementation: Utilizes GroundX for high-accuracy document retrieval
- Model Context Protocol: Seamless integration with MCP for enhanced context handling
- Type-Safe: Built with Pydantic for robust type checking and validation
- Flexible Configuration: Easy-to-customize settings through environment variables
- Document Ingestion: Support for PDF document ingestion and processing
- Intelligent Search: Semantic search capabilities with scoring
## 🛠️ Prerequisites
- Python 3.12 or higher
- OpenAI API key
- GroundX API key
- MCP CLI tools
## 📦 Installation
- Clone the repository:

```bash
git clone <repository-url>
cd mcp-rag
```

- Create and activate a virtual environment:

```bash
uv sync
source .venv/bin/activate  # On Windows, use `.venv\Scripts\activate`
```
## ⚙️ Configuration
- Copy the example environment file:

```bash
cp .env.example .env
```

- Configure your environment variables in `.env`:

```env
GROUNDX_API_KEY="your-groundx-api-key"
OPENAI_API_KEY="your-openai-api-key"
BUCKET_ID="your-bucket-id"
```
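At runtime the server needs these values from the environment. A minimal sketch of reading and sanity-checking them with the standard library (an assumption for illustration — the project may instead load them via `python-dotenv` or a Pydantic settings class):

```python
import os

# Fetch the three variables configured above; empty string if unset.
GROUNDX_API_KEY = os.environ.get("GROUNDX_API_KEY", "")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", "")
BUCKET_ID = os.environ.get("BUCKET_ID", "")

# Warn early about anything missing instead of failing mid-request.
missing = [name for name, value in {
    "GROUNDX_API_KEY": GROUNDX_API_KEY,
    "OPENAI_API_KEY": OPENAI_API_KEY,
    "BUCKET_ID": BUCKET_ID,
}.items() if not value]
if missing:
    print(f"Warning: missing environment variables: {', '.join(missing)}")
```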
## 🚀 Usage

### Starting the Server

Run the server under the MCP Inspector with:

```bash
mcp dev server.py
```
### Document Ingestion

To ingest new documents:

```python
from server import ingest_documents

result = ingest_documents("path/to/your/document.pdf")
print(result)
```
### Performing Searches

Basic search query:

```python
from server import process_search_query

response = process_search_query("your search query here")
print(f"Query: {response.query}")
print(f"Score: {response.score}")
print(f"Result: {response.result}")
```

With a custom configuration:

```python
from server import process_search_query, SearchConfig

config = SearchConfig(
    completion_model="gpt-4",
    bucket_id="custom-bucket-id",
)
response = process_search_query("your query", config)
```
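The `score` field makes it easy to gate low-confidence retrievals before showing an answer. A small stdlib-only pattern (the threshold value and fallback text are illustrative assumptions, not part of the project):

```python
from types import SimpleNamespace


def answer_or_fallback(response, min_score: float = 0.5) -> str:
    """Return the RAG answer only when the retrieval score clears a threshold."""
    if response.score < min_score:
        return "No sufficiently relevant documents found."
    return response.result


# Illustrative response mirroring the query/score/result fields shown above.
demo = SimpleNamespace(query="q", score=0.92, result="GroundX answer")
print(answer_or_fallback(demo))  # prints "GroundX answer"
```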
## 📚 Dependencies

- `groundx` (≥2.3.0): Core RAG functionality
- `openai` (≥1.75.0): OpenAI API integration
- `mcp[cli]` (≥1.6.0): Model Context Protocol tools
- `ipykernel` (≥6.29.5): Jupyter notebook support
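With `uv`, these constraints correspond to a `pyproject.toml` roughly like the following (a sketch assembled from the versions above; the actual file may declare additional metadata):

```toml
[project]
name = "mcp-rag"
requires-python = ">=3.12"
dependencies = [
    "groundx>=2.3.0",
    "openai>=1.75.0",
    "mcp[cli]>=1.6.0",
    "ipykernel>=6.29.5",
]
```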
## 🔒 Security

- Never commit your `.env` file containing API keys
- Use environment variables for all sensitive information
- Regularly rotate your API keys
- Monitor API usage for any unauthorized access
## 🤝 Contributing

- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request