
<div align="center"> <picture> <source media="(prefers-color-scheme: dark)" srcset="./logo-dark.jpg" /> <img alt="Biel.ai" src="./logo.jpg" /> </picture> <h1>Biel.ai MCP Server</h1> <h3>Connect your IDE to your product docs</h3> </div>
Give AI tools like Cursor, VS Code, and Claude Desktop access to your company's product knowledge through the Biel.ai platform.
Biel.ai provides a hosted Retrieval-Augmented Generation (RAG) layer that makes your documentation searchable and useful to AI tools. This enables smarter completions, accurate technical answers, and context-aware suggestions—directly in your IDE or chat environment.
When AI tools can read your product documentation, they become significantly more helpful—generating more accurate code completions, answering technical questions with context, and guiding developers with real-time product knowledge.
Note: Requires a Biel.ai account and project setup. Start your free 15-day trial.
<h3><a href="https://docs.biel.ai/integrations/mcp-server?utm_source=github&utm_medium=referral&utm_campaign=readme">See quickstart instructions →</a></h3>
Getting started
1. Get your MCP configuration
```json
{
  "mcpServers": {
    "biel-ai": {
      "description": "Query your product's documentation, APIs, and knowledge base.",
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.biel.ai/sse?project_slug=YOUR_PROJECT_SLUG&domain=https://your-docs-domain.com"
      ]
    }
  }
}
```
- **Required**: `project_slug` and `domain`
- **Optional**: `api_key` (only needed for private projects)
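For a private project, the `api_key` would presumably be passed alongside the other parameters. A minimal sketch, assuming it is accepted as an additional query parameter on the SSE URL (the parameter name and placement are assumptions, not confirmed here — check the Biel.ai docs for the exact format):

```json
{
  "mcpServers": {
    "biel-ai": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.biel.ai/sse?project_slug=YOUR_PROJECT_SLUG&domain=https://your-docs-domain.com&api_key=YOUR_API_KEY"
      ]
    }
  }
}
```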
2. Add to your AI tool
- **Cursor**: Settings → Tools & Integrations → New MCP server.
- **Claude Desktop**: Edit `claude_desktop_config.json`.
- **VS Code**: Install an MCP extension.
3. Start asking questions
> Can you check in biel_ai what the auth headers are for the /users endpoint?
Self-hosting (Optional)
For advanced users who prefer to run their own MCP server instance:
Local development
```bash
# Clone and run locally
git clone https://github.com/techdocsStudio/biel-mcp
cd biel-mcp
pip install -r requirements.txt
python biel_mcp_server.py
```
Docker deployment
```bash
# Docker Compose (recommended)
docker-compose up -d --build

# Or Docker directly
docker build -t biel-mcp .
docker run -d -p 7832:7832 biel-mcp
```
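Once a self-hosted instance is running, AI tools can be pointed at it instead of the hosted endpoint. A minimal sketch of the client configuration, assuming the local server exposes the same SSE endpoint and query parameters as the hosted URL above, on the port mapped in the Docker example (the path and parameters are assumptions, not confirmed by this README):

```json
{
  "mcpServers": {
    "biel-ai-local": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:7832/sse?project_slug=YOUR_PROJECT_SLUG&domain=https://your-docs-domain.com"
      ]
    }
  }
}
```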
Support
- Issues: GitHub Issues
- Contact: support@biel.ai
- Custom Demo: Book a demo