MCP-RAGNAR - a local RAG MCP Server
A local MCP server that implements RAG (Retrieval-Augmented Generation) with sentence window retrieval.
Features
- Document indexing with support for multiple file types (txt, md, pdf, doc, docx)
- Sentence window retrieval for better context understanding (illustrated in the sketch after this list)
- Configurable embedding models (OpenAI or a local Hugging Face model, e.g. BAAI/bge-large-en-v1.5)
- MCP server integration for easy querying
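To make the sentence window idea concrete, here is an illustrative Python sketch (not mcp-ragnar's actual implementation): each sentence is matched individually, but the retriever returns it together with a window of neighboring sentences so the LLM sees surrounding context.

# Illustrative sketch of sentence window retrieval.
def sentence_windows(sentences, window=1):
    """Pair each sentence with itself plus `window` neighbors on each side."""
    for i, sentence in enumerate(sentences):
        lo = max(0, i - window)
        hi = min(len(sentences), i + window + 1)
        yield sentence, " ".join(sentences[lo:hi])

sentences = [
    "RAG retrieves supporting text.",
    "Each sentence is embedded separately.",
    "Neighboring sentences are returned as context.",
]
for match, context in sentence_windows(sentences):
    print(f"match: {match!r} -> context window: {context!r}")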
Requirements
- Python 3.10+
- uv package manager
Installation
- Clone the repository:
git clone <repository-url>
cd mcp-ragnar
- Install dependencies using uv:
uv pip install -e .
Usage
Indexing Documents
You can index documents either via the command line or from Python (a sketch of the latter follows the commands below).
Command line
python -m indexer.index /path/to/documents /path/to/index
# To change the default local embedding model and chunk size:
python -m indexer.index /path/to/documents /path/to/index --chunk-size=512 --embed-model BAAI/bge-small-en-v1.5
# With an OpenAI embedding endpoint (set OPENAI_API_KEY in your environment):
python -m indexer.index /path/to/documents /path/to/index --embed-endpoint https://api.openai.com/v1 --embed-model text-embedding-3-small --tokenizer-model o200k_base
# Get help
python -m indexer.index --help
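For programmatic use, one grounded option is to drive the same CLI from Python, as below; whether the package also exposes a direct Python API is not documented here.

# Build an index from Python by invoking the documented CLI module.
# Paths and options mirror the command-line examples above.
import subprocess
import sys

subprocess.run(
    [
        sys.executable, "-m", "indexer.index",
        "/path/to/documents", "/path/to/index",
        "--chunk-size=512",
        "--embed-model", "BAAI/bge-small-en-v1.5",
    ],
    check=True,  # raise if indexing fails
)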
Running the MCP Server
Configuration
Settings can be supplied as environment variables or in a .env file:
- EMBED_ENDPOINT: (optional) URL of an OpenAI-compatible embedding endpoint (must end with /v1). If not set, a local Hugging Face model is used by default.
- EMBED_MODEL: (optional) name of the embedding model to use. Defaults to BAAI/bge-large-en-v1.5.
- INDEX_ROOT: the root directory of the index, used by the retriever. Mandatory for MCP querying.
- MCP_DESCRIPTION: the exposed name and description for the MCP server, used for MCP querying only. Mandatory for MCP querying. For example: "RAG to my local personal documents"
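For example, a .env using OpenAI embeddings (values taken from the examples in this README; sk-... stands in for a real key):

OPENAI_API_KEY=sk-...
EMBED_ENDPOINT=https://api.openai.com/v1
EMBED_MODEL=text-embedding-3-small
INDEX_ROOT=/tmp/index
MCP_DESCRIPTION="RAG to my local personal documents"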
In SSE mode, the server listens at http://localhost:8001/ragnar:
python server/sse.py
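Once the server is up, any MCP client can connect over SSE. Below is a minimal sketch using the official MCP Python SDK (pip install mcp); the name of the tool ragnar exposes is not documented here, so the sketch lists the tools to discover it.

# Connect to the running SSE server and list its tools (MCP Python SDK).
import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    async with sse_client("http://localhost:8001/ragnar") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])  # discover the query tool's name

asyncio.run(main())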
In stdio mode, install the package locally as a uv tool:
uv tool install .
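The server then speaks MCP over stdio and waits for a client to connect; uvx can also run it without a prior install, which is how the Claude Desktop config below invokes it:
uvx mcp-ragnar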
Claude Desktop:
Update the configuration file:
On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
Example:
{
  "mcpServers": {
    "mcp-ragnar": {
      "command": "uvx",
      "args": [
        "mcp-ragnar"
      ],
      "env": {
        "OPENAI_API_KEY": "",
        "EMBED_ENDPOINT": "https://api.openai.com/v1",
        "EMBED_MODEL": "text-embedding-3-small",
        "MCP_DESCRIPTION": "My local Rust documentation",
        "INDEX_ROOT": "/tmp/index"
      }
    }
  }
}
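To use the default local Hugging Face embeddings instead, omit the OpenAI-related variables; a minimal variant follows (note that the index must be queried with the same embedding model it was built with):

{
  "mcpServers": {
    "mcp-ragnar": {
      "command": "uvx",
      "args": [
        "mcp-ragnar"
      ],
      "env": {
        "MCP_DESCRIPTION": "RAG to my local personal documents",
        "INDEX_ROOT": "/tmp/index"
      }
    }
  }
}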
License
GNU General Public License v3.0