RagDocs MCP Server

A Model Context Protocol (MCP) server that provides RAG (Retrieval-Augmented Generation) capabilities using Qdrant vector database and Ollama/OpenAI embeddings. This server enables semantic search and management of documentation through vector similarity.

Features

  • Add documentation with metadata
  • Semantic search through documents
  • List and organize documentation
  • Delete documents
  • Support for both Ollama (free) and OpenAI (paid) embeddings
  • Automatic text chunking and embedding generation
  • Vector storage with Qdrant

Prerequisites

  • Node.js 16 or higher
  • One of the following Qdrant setups:
    • Local instance using Docker (free)
    • Qdrant Cloud account with API key (managed service)
  • One of the following for embeddings:
    • Ollama running locally (default, free)
    • OpenAI API key (optional, paid)

Available Tools

1. add_document

Add a document to the RAG system.

Parameters:

  • url (required): Document URL/identifier
  • content (required): Document content
  • metadata (optional): Document metadata
    • title: Document title
    • contentType: Content type (e.g., "text/markdown")
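
For example, the arguments for an add_document call might look like this (values are illustrative, not taken from the project's docs):

{
  "url": "https://example.com/docs/getting-started",
  "content": "# Getting Started\n\nInstall the package and configure your environment...",
  "metadata": {
    "title": "Getting Started",
    "contentType": "text/markdown"
  }
}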

2. search_documents

Search through stored documents using semantic similarity.

Parameters:

  • query (required): Natural language search query
  • options (optional):
    • limit: Maximum number of results (1-20, default: 5)
    • scoreThreshold: Minimum similarity score (0-1, default: 0.7)
    • filters:
      • domain: Filter by domain
      • hasCode: Filter for documents containing code
      • after: Filter for documents after date (ISO format)
      • before: Filter for documents before date (ISO format)
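
An illustrative search_documents call combining a query with optional filters (the values shown are examples, not defaults):

{
  "query": "how do I configure the embedding provider?",
  "options": {
    "limit": 5,
    "scoreThreshold": 0.7,
    "filters": {
      "domain": "example.com",
      "hasCode": true,
      "after": "2024-01-01"
    }
  }
}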

3. list_documents

List all stored documents with pagination and grouping options.

Parameters (all optional):

  • page: Page number (default: 1)
  • pageSize: Number of documents per page (1-100, default: 20)
  • groupByDomain: Group documents by domain (default: false)
  • sortBy: Sort field ("timestamp", "title", or "domain")
  • sortOrder: Sort order ("asc" or "desc")
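
For example, to fetch the second page of documents sorted by title (illustrative values):

{
  "page": 2,
  "pageSize": 20,
  "groupByDomain": false,
  "sortBy": "title",
  "sortOrder": "asc"
}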

4. delete_document

Delete a document from the RAG system.

Parameters:

  • url (required): URL of the document to delete
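
Illustrative arguments:

{
  "url": "https://example.com/docs/getting-started"
}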

Installation

npm install -g @mcpservers/ragdocs

MCP Server Configuration

{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["@mcpservers/ragdocs"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama"
      }
    }
  }
}

Using Qdrant Cloud:

{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["@mcpservers/ragdocs"],
      "env": {
        "QDRANT_URL": "https://your-cluster-url.qdrant.tech",
        "QDRANT_API_KEY": "your-qdrant-api-key",
        "EMBEDDING_PROVIDER": "ollama"
      }
    }
  }
}

Using OpenAI:

{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["@mcpservers/ragdocs"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "openai",
        "OPENAI_API_KEY": "your-api-key"
      }
    }
  }
}

Local Qdrant with Docker

docker run -d --name qdrant -p 6333:6333 -p 6334:6334 qdrant/qdrant
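
To verify that the local instance is reachable, you can query Qdrant's REST API on the mapped port (a quick sanity check, assuming the default setup above):

curl http://127.0.0.1:6333/collections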

Environment Variables

  • QDRANT_URL: URL of your Qdrant instance
    • For local: "http://127.0.0.1:6333" (default)
    • For cloud: "https://your-cluster-url.qdrant.tech"
  • QDRANT_API_KEY: API key for Qdrant Cloud (required when using cloud instance)
  • EMBEDDING_PROVIDER: Choice of embedding provider ("ollama" or "openai", default: "ollama")
  • OPENAI_API_KEY: OpenAI API key (required if using OpenAI)
  • EMBEDDING_MODEL: Model to use for embeddings
    • For Ollama: defaults to "nomic-embed-text"
    • For OpenAI: defaults to "text-embedding-3-small"
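
If you use the default Ollama provider, make sure the embedding model is available locally before adding documents (assuming a standard Ollama installation):

ollama pull nomic-embed-text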

License

Apache License 2.0
