# ocds-mcp

MCP server for German public procurement data (OCDS). Connects your AI assistant (Claude, GPT, etc.) to the Vergabe Dashboard API for semantic search, tender matching, and company profile management.
Your company profiles never leave your machine — only embedding vectors are sent to the API. GDPR-compliant by design.
## Quick Start
### 1. Get an API key
Sign up at vergabe-dashboard.qune.de and create an API key (MCP or Enterprise plan required).
### 2. Install

Download a pre-built binary from GitHub Releases:
| Platform | Download |
|---|---|
| Linux x86_64 | ocds-mcp-linux-x86_64.tar.gz |
| macOS Apple Silicon | ocds-mcp-macos-arm64.tar.gz |
| Windows x86_64 | ocds-mcp-windows-x86_64.zip |
Linux / macOS:

```sh
# Example for Linux x86_64 — adjust the filename for your platform
tar xzf ocds-mcp-linux-x86_64.tar.gz
sudo mv ocds-mcp-linux-x86_64 /usr/local/bin/ocds-mcp
```

Windows: Extract the zip and move `ocds-mcp-windows-x86_64.exe` somewhere on your PATH (e.g. `C:\Users\YOU\.local\bin\ocds-mcp.exe`).
Or build from source:

```sh
git clone https://github.com/qune-tech/ocds-mcp.git
cd ocds-mcp
cargo build --release
# Binary at target/release/ocds-mcp
```
### 3. Configure your AI client
Claude Desktop — edit `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "ocds": {
      "command": "ocds-mcp",
      "args": ["--api-key", "sk_live_YOUR_KEY_HERE"]
    }
  }
}
```
Claude Code — add `.mcp.json` to your project root:

```json
{
  "mcpServers": {
    "ocds": {
      "command": "ocds-mcp",
      "args": ["--api-key", "sk_live_YOUR_KEY_HERE"]
    }
  }
}
```
Cursor — Settings → MCP Servers → Add:

- Command: `ocds-mcp`
- Args: `--api-key sk_live_YOUR_KEY_HERE`
LM Studio — Settings → MCP → Add Server:

- Click + Add Server and choose STDIO
- Fill in:
  - Name: `ocds`
  - Command: full path to the binary, e.g. `/usr/local/bin/ocds-mcp`
  - Arguments: `--api-key sk_live_YOUR_KEY_HERE`
- Click Save
- In the chat, select a model that supports tool use and enable the `ocds` server
LM Studio requires models with tool-calling support (e.g. Qwen 2.5, Mistral, Llama 3.1+). Smaller models may not use all 10 tools reliably — 7B+ recommended.
Replace `sk_live_YOUR_KEY_HERE` with your actual API key.
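If you'd rather keep the key out of the `args` array, the server also reads it from the `OCDS_API_KEY` environment variable (see CLI Options below). A sketch for clients that support a per-server `env` block (Claude Desktop and Claude Code do):

```json
{
  "mcpServers": {
    "ocds": {
      "command": "ocds-mcp",
      "env": { "OCDS_API_KEY": "sk_live_YOUR_KEY_HERE" }
    }
  }
}
```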
## Available Tools

| Tool | Description |
|---|---|
| `search_text` | Semantic search across all tenders |
| `list_releases` | Filter and browse tenders by month, CPV code, category, value range |
| `get_release` | Full tender details by OCID |
| `get_index_info` | Database statistics and connectivity check |
| `create_company_profile` | Create a matching profile for your company |
| `update_company_profile` | Update an existing profile |
| `get_company_profile` | View profile details |
| `list_company_profiles` | List all your profiles |
| `delete_company_profile` | Delete a profile |
| `match_tenders` | Match a profile against all tenders with semantic similarity |
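Under the hood, your AI client invokes these tools over MCP's JSON-RPC `tools/call` method. A call to `match_tenders` could look roughly like this (the `profile_id` argument name is an illustrative assumption, not the server's documented schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "match_tenders",
    "arguments": { "profile_id": "YOUR_PROFILE_ID" }
  }
}
```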
## CLI Options

```
Usage: ocds-mcp [OPTIONS]

Options:
      --db <DB>          Local profiles database [default: profiles.db]
      --data-dir <DIR>   Data directory [default: data]
      --api-url <URL>    Vergabe Dashboard API [default: https://vergabe-dashboard.qune.de]
      --api-key <KEY>    API key [env: OCDS_API_KEY]
  -h, --help             Print help
```
## How It Works

```
LLM ←stdio→ ocds-mcp (local)
                │ Local: company profiles + sentence embedder
                │ Remote: searches, release queries
                └──HTTPS──→ Vergabe Dashboard API
```
The MCP server runs locally on your machine:
- Company profiles are stored in a local SQLite database — they never leave your network.
- Text embeddings are computed locally using a multilingual ONNX model (multilingual-e5-small, ~118 MB, auto-downloaded on first use).
- Only embedding vectors (arrays of 384 floats) are sent to the API for search and matching — your profile text stays local.
- Tender data is fetched from the API on demand.
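The matching step boils down to comparing embedding vectors. A minimal sketch of cosine similarity over such vectors, assuming toy 3-dimensional vectors in place of the real 384-dimensional e5 embeddings (an illustration, not the server's actual scoring code):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for 384-float e5 embeddings.
profile_vec = [0.2, 0.7, 0.1]
tender_vec = [0.25, 0.65, 0.05]
score = cosine_similarity(profile_vec, tender_vec)  # near 1.0 for similar texts
print(round(score, 3))
```

Because only these float arrays cross the network, the API can rank tenders against your profile without ever seeing the profile text itself.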
## Requirements
- An API key from vergabe-dashboard.qune.de (MCP or Enterprise plan)
- ~200 MB disk space for the ONNX model (downloaded automatically on first run)
- Internet connection to reach the API
## License
MIT — see LICENSE.