qdrant-mcp

MCP server for document ingestion and semantic search on Qdrant.

Overview

qdrant-mcp provides tools to:

  • ingest local documents into a Qdrant collection
  • generate embeddings with OpenAI
  • run vector search with optional metadata filters

Features

  • ingest_documents
    • converts files such as docx, pptx, and pdf to Markdown via MarkItDown
    • splits content into chunks using chunk_size and overlap_ratio
    • embeds chunks with OpenAI Embeddings (text-embedding-3-small by default)
    • upserts chunk text and metadata into Qdrant
  • search_documents
    • embeds query text with the same embeddings API
    • retrieves the top_k closest matches from Qdrant
    • supports filtering by category and path

Requirements

  • Python 3.11+
  • uv
  • Qdrant (for example, http://localhost:6333)
  • OPENAI_API_KEY
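
If you do not already have a Qdrant instance, one common way to start one locally (not specific to this project) is the official Docker image:

docker run -p 6333:6333 qdrant/qdrant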

Setup

uv sync

Run inside the Codex CLI

[mcp_servers.qdrant-mcp]
command = "uv"
args = ["run", "qdrant-mcp"]
cwd = "/sandbox/qdrant-mcp"

[mcp_servers.qdrant-mcp.env]
OPENAI_API_KEY = "sk-..."
QDRANT_URL = "http://127.0.0.1:6333"
QDRANT_API_KEY = "QDRANT_API_KEY"
QDRANT_COLLECTION = "codex_collection"
CHUNK_HEADER_MODEL = "gpt-5.4-mini"
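
In the Codex CLI this block typically goes into ~/.codex/config.toml (the exact location may vary by version); adjust cwd to wherever this repository is checked out.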

Testing

Set OPENAI_API_KEY, QDRANT_URL, and QDRANT_API_KEY in .env, then run:

uv run python -m unittest tests/integration/test_qdrant_integration.py
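
A minimal .env sketch (values are placeholders; the file is assumed to sit in the repository root):

OPENAI_API_KEY=sk-...
QDRANT_URL=http://127.0.0.1:6333
QDRANT_API_KEY=your-qdrant-api-key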

MCP Tools

ingest_documents

Parameters:

  • paths: list[str]
  • category: str
  • chunk_size: int = 1200
  • overlap_ratio: float = 0.15
  • embedding_model: str = "text-embedding-3-small"
  • chunk_header_mode: Literal["enabled", "disabled"] = "enabled"

Returns:

  • collection
  • embedding_model
  • ingested_files
  • ingested_points
  • failed_files
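
To make the chunking parameters concrete: with the defaults, consecutive chunks overlap by chunk_size * overlap_ratio = 180 characters. The sketch below shows one plausible reading of these parameters; the actual splitter in qdrant-mcp may work on tokens or respect Markdown structure rather than cutting at fixed character offsets:

def split_text(text: str, chunk_size: int = 1200, overlap_ratio: float = 0.15) -> list[str]:
    # With the defaults, chunks overlap by 180 characters, so each chunk
    # starts 1020 characters after the previous one and neighbouring
    # chunks share some context.
    overlap = int(chunk_size * overlap_ratio)
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]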

search_documents

Parameters:

  • query: str
  • top_k: int = 5
  • category: str | None = None
  • path: str | None = None
  • embedding_model: str = "text-embedding-3-small"

Returns:

  • collection
  • embedding_model
  • query
  • count
  • results (score, path, category, chunk_index, text)
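
As an illustration of how a client drives ingestion and search together, here is a minimal sketch using the MCP Python SDK over stdio. The file path, category, and query are made up; the script is assumed to be run from the qdrant-mcp checkout with OPENAI_API_KEY and the QDRANT_* variables already set:

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server the same way the Codex CLI config above does.
# Pass the caller's environment through so OPENAI_API_KEY and the
# QDRANT_* settings reach the server process.
server = StdioServerParameters(
    command="uv",
    args=["run", "qdrant-mcp"],
    env=dict(os.environ),
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Ingest a (hypothetical) document into the "manuals" category.
            await session.call_tool(
                "ingest_documents",
                arguments={"paths": ["./docs/user-guide.pdf"], "category": "manuals"},
            )

            # Search within that category; the other parameters keep their defaults.
            hits = await session.call_tool(
                "search_documents",
                arguments={"query": "how do I reset my password", "category": "manuals"},
            )
            print(hits.content)

asyncio.run(main())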

delete_documents_by_path

Parameters:

  • path: str
  • category: str | None = None

Returns:

  • collection
  • path
  • category
  • status
  • operation_id

list_category

Parameters:

  • limit: int = 100

Returns:

  • collection
  • count
  • categories

list_path

Parameters:

  • category: str
  • limit: int = 1000

Returns:

  • collection
  • category
  • count
  • paths
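
Together with delete_documents_by_path, the two listing tools support basic index maintenance: enumerate categories, enumerate the paths in one category, then remove a stale document. A sketch of the argument payloads an MCP client would send (the category and path values are made up):

# Tool names and argument keys are from the parameter lists above;
# the concrete values are illustrative only.
list_category_args = {"limit": 100}
list_path_args = {"category": "manuals", "limit": 1000}
delete_args = {"path": "./docs/user-guide.pdf", "category": "manuals"}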

Notes

  • If the target collection does not exist, it is created automatically on first ingestion.
  • If payload indexes for category and path do not exist, they are created during ingestion.
  • By default, ingestion prepends a generated Chunk-Header (max 64 chars), derived from the first 4096 bytes, to every chunk.
  • The Chunk-Header model is read from CHUNK_HEADER_MODEL when chunk_header_mode is enabled (default: gpt-5.4-mini).
  • The collection name is configured only through QDRANT_COLLECTION (not via MCP tool parameters).
  • With text-embedding-3-small, the vector size is 1536.
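
For reference, the automatic collection and index setup described in the first two notes corresponds roughly to the following qdrant-client calls. This is only a sketch of the equivalent manual steps, not the server's code; the cosine distance metric is an assumption:

from qdrant_client import QdrantClient, models

client = QdrantClient(url="http://127.0.0.1:6333")

# Vector size 1536 matches text-embedding-3-small; cosine distance is assumed here.
client.create_collection(
    collection_name="codex_collection",
    vectors_config=models.VectorParams(size=1536, distance=models.Distance.COSINE),
)

# Keyword payload indexes so category/path filters stay fast as the collection grows.
for field in ("category", "path"):
    client.create_payload_index(
        collection_name="codex_collection",
        field_name=field,
        field_schema=models.PayloadSchemaType.KEYWORD,
    )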

License

See LICENSE.
