RagDocs MCP Server
A Model Context Protocol (MCP) server that provides RAG (Retrieval-Augmented Generation) capabilities using Qdrant vector database and Ollama/OpenAI embeddings. This server enables semantic search and management of documentation through vector similarity.
Features
- Add documentation with metadata
- Semantic search through documents
- List and organize documentation
- Delete documents
- Support for both Ollama (free) and OpenAI (paid) embeddings
- Automatic text chunking and embedding generation
- Vector storage with Qdrant
Prerequisites
- Node.js 16 or higher
- One of the following Qdrant setups:
  - Local instance using Docker (free)
  - Qdrant Cloud account with API key (managed service)
- One of the following for embeddings:
  - Ollama running locally (default, free)
  - OpenAI API key (optional, paid)
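If you stay with the default Ollama provider, make sure the embedding model is available locally before adding documents. A minimal setup sketch, assuming the default "nomic-embed-text" model described under Environment Variables below:

ollama pull nomic-embed-text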
Available Tools
1. add_document
Add a document to the RAG system.
Parameters:
- url (required): Document URL/identifier
- content (required): Document content
- metadata (optional): Document metadata
  - title: Document title
  - contentType: Content type (e.g., "text/markdown")
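For illustration, a call to this tool might pass arguments shaped like the following (the values are hypothetical; only the field names come from the parameter list above):

{
  "url": "https://example.com/docs/getting-started",
  "content": "# Getting Started\n\nInstall the package with npm...",
  "metadata": {
    "title": "Getting Started",
    "contentType": "text/markdown"
  }
}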
2. search_documents
Search through stored documents using semantic similarity.
Parameters:
- query (required): Natural language search query
- options (optional):
  - limit: Maximum number of results (1-20, default: 5)
  - scoreThreshold: Minimum similarity score (0-1, default: 0.7)
  - filters:
    - domain: Filter by domain
    - hasCode: Filter for documents containing code
    - after: Filter for documents after date (ISO format)
    - before: Filter for documents before date (ISO format)
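An illustrative set of arguments, again with hypothetical values and only the option names taken from the list above:

{
  "query": "how do I configure authentication?",
  "options": {
    "limit": 5,
    "scoreThreshold": 0.7,
    "filters": {
      "domain": "example.com",
      "hasCode": true,
      "after": "2024-01-01"
    }
  }
}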
3. list_documents
List all stored documents with pagination and grouping options.
Parameters (all optional):
- page: Page number (default: 1)
- pageSize: Number of documents per page (1-100, default: 20)
- groupByDomain: Group documents by domain (default: false)
- sortBy: Sort field ("timestamp", "title", or "domain")
- sortOrder: Sort order ("asc" or "desc")
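For example, a request for the most recent documents grouped by domain might look like this (illustrative values):

{
  "page": 1,
  "pageSize": 20,
  "groupByDomain": true,
  "sortBy": "timestamp",
  "sortOrder": "desc"
}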
4. delete_document
Delete a document from the RAG system.
Parameters:
- url (required): URL of the document to delete
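A call simply names the document to remove, for example (hypothetical URL):

{
  "url": "https://example.com/docs/getting-started"
}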
Installation
npm install -g @mcpservers/ragdocs
MCP Server Configuration
{
"mcpServers": {
"ragdocs": {
"command": "node",
"args": ["@mcpservers/ragdocs"],
"env": {
"QDRANT_URL": "http://127.0.0.1:6333",
"EMBEDDING_PROVIDER": "ollama"
}
}
}
}
Using Qdrant Cloud:
{
"mcpServers": {
"ragdocs": {
"command": "node",
"args": ["@mcpservers/ragdocs"],
"env": {
"QDRANT_URL": "https://your-cluster-url.qdrant.tech",
"QDRANT_API_KEY": "your-qdrant-api-key",
"EMBEDDING_PROVIDER": "ollama"
}
}
}
}
Using OpenAI:
{
"mcpServers": {
"ragdocs": {
"command": "node",
"args": ["@mcpservers/ragdocs"],
"env": {
"QDRANT_URL": "http://127.0.0.1:6333",
"EMBEDDING_PROVIDER": "openai",
"OPENAI_API_KEY": "your-api-key"
}
}
}
}
Local Qdrant with Docker
docker run -d --name qdrant -p 6333:6333 -p 6334:6334 qdrant/qdrant
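Once the container is up, you can confirm the instance is reachable with a quick request to Qdrant's REST API (this lists collections and should return JSON):

curl http://127.0.0.1:6333/collections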
Environment Variables
- QDRANT_URL: URL of your Qdrant instance
  - For local: "http://127.0.0.1:6333" (default)
  - For cloud: "https://your-cluster-url.qdrant.tech"
- QDRANT_API_KEY: API key for Qdrant Cloud (required when using a cloud instance)
- EMBEDDING_PROVIDER: Choice of embedding provider ("ollama" or "openai", default: "ollama")
- OPENAI_API_KEY: OpenAI API key (required if using OpenAI)
- EMBEDDING_MODEL: Model to use for embeddings
  - For Ollama: defaults to "nomic-embed-text"
  - For OpenAI: defaults to "text-embedding-3-small"
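These variables are supplied through the env block of the MCP server configuration shown above. As a sketch, explicitly pinning the embedding model when using OpenAI might look like this (the model name is the documented default, shown here only to illustrate the override):

"env": {
  "QDRANT_URL": "http://127.0.0.1:6333",
  "EMBEDDING_PROVIDER": "openai",
  "OPENAI_API_KEY": "your-api-key",
  "EMBEDDING_MODEL": "text-embedding-3-small"
}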
License
Apache License 2.0