# Brain Server - MCP Knowledge Embedding Service
A powerful MCP (Model Context Protocol) server for managing knowledge embeddings and vector search.
## Features
- Vector Embeddings: Generate high-quality embeddings for knowledge content
- Semantic Search: Find knowledge based on meaning, not just keywords
- MCP Compliance: Follows Model Context Protocol for AI integration
- Brain Management: Organize knowledge into domain-specific brains
- Context-Aware Retrieval: Includes surrounding context for better understanding
- Progress Tracking: Real-time monitoring of long-running operations
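Semantic search rests on comparing embedding vectors rather than keywords. As an illustration only (not the server's actual code), the standard cosine-similarity measure between two vectors can be sketched as:

```typescript
// Cosine similarity between two embedding vectors.
// 1.0 = same direction (semantically close), 0 = orthogonal (unrelated).
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Identical vectors score 1; orthogonal vectors score 0.
console.log(cosineSimilarity([1, 0], [1, 0])); // 1
console.log(cosineSimilarity([1, 0], [0, 1])); // 0
```

Queries and stored knowledge are embedded into the same vector space, so entries whose vectors score highest against the query vector are the most semantically relevant.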
## Embedding Models
The server uses embedding models to convert text into vector representations:

- On first run, the server automatically downloads the embedding models
- By default, it uses `Xenova/all-MiniLM-L6-v2` from HuggingFace
- Models are cached locally after the first download
- For testing, a `MockEmbeddingProvider` is available that generates random vectors

You can configure which model to use in the `.env` file:

```bash
EMBEDDING_MODEL=Xenova/all-MiniLM-L6-v2
```
Supported models include:

- `Xenova/all-MiniLM-L6-v2` (default, 384 dimensions)
- `Xenova/bge-small-en-v1.5` (384 dimensions)
- `Xenova/bge-base-en-v1.5` (768 dimensions)
- `Xenova/e5-small-v2` (384 dimensions)
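Each model maps text to a fixed-length vector (384 or 768 numbers). Sentence-embedding pipelines typically normalize these vectors to unit length so that similarity scores are comparable across entries. A hypothetical 4-dimensional example (real vectors have 384+ components) of that normalization step:

```typescript
// Scale a vector to unit length (L2 norm = 1), as sentence-embedding
// models typically do before vectors are stored or compared.
function normalize(v: number[]): number[] {
  const norm = Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
  return v.map((x) => x / norm);
}

// [3, 4, 0, 0] has length 5, so the unit vector is [0.6, 0.8, 0, 0].
const unit = normalize([3, 4, 0, 0]);
console.log(unit); // [0.6, 0.8, 0, 0]
```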
## Quick Start with Docker
The easiest way to run Brain Server is using Docker and Docker Compose:
```bash
# Clone the repository
git clone https://github.com/patrickdeluca/mcp-brain-server.git
cd mcp-brain-server

# Start the server with Docker Compose
docker-compose up -d

# View logs
docker-compose logs -f
```
The server will be available at `http://localhost:3000`, with MongoDB running inside the same container.
### Using Docker with Claude Desktop
To use the dockerized Brain Server with Claude Desktop, add an entry under the `mcpServers` key of your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "brain-server": {
      "command": "docker",
      "args": ["run", "--rm", "-p", "3000:3000", "patrickdeluca/mcp-brain-server:latest"],
      "env": {}
    }
  }
}
```
## Prerequisites for Local Installation

### MongoDB Installation
If you're not using Docker, the Brain Server requires MongoDB (version 6.0 or later recommended for vector search):
**Modern Installation (Recommended)**

#### Ubuntu/Debian
```bash
# Import the MongoDB public key
wget -qO - https://www.mongodb.org/static/pgp/server-6.0.asc | sudo tee /etc/apt/trusted.gpg.d/mongodb-6.0.asc

# Add the MongoDB repository
echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/6.0 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-6.0.list

# Update the package database and install MongoDB
sudo apt-get update
sudo apt-get install -y mongodb-org

# Start the MongoDB service
sudo systemctl start mongod
```
#### macOS

```bash
# Using Homebrew
brew tap mongodb/brew
brew install mongodb-community@6.0
brew services start mongodb-community@6.0
```
#### Windows

1. Download the MongoDB 6.0 installer from the MongoDB Download Center
2. Run the installer and follow the setup wizard
3. Start MongoDB from the Windows Services console
#### Verify MongoDB Installation

To verify that MongoDB is running properly:

```bash
mongosh --eval "db.version()"
```
## Manual Installation

```bash
# Clone the repository
git clone https://github.com/patrickdeluca/mcp-brain-server.git
cd mcp-brain-server

# Install dependencies
npm install

# Configure environment
cp .env.example .env
# Edit .env with your settings

# Build the project
npm run build
```
## Configuration

Configure the server using environment variables in the `.env` file:

```bash
# Server Configuration
PORT=3000

# MongoDB Configuration
MONGODB_URI=mongodb://localhost:27017/brain_db

# Model Configuration
EMBEDDING_MODEL=Xenova/all-MiniLM-L6-v2
MAX_CHUNK_SIZE=1024
```
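`MAX_CHUNK_SIZE` caps how much text is embedded in a single pass; longer content must be split into chunks first. A minimal sketch of character-based chunking, assuming the limit is applied per character (the server's actual chunking strategy may differ, e.g. splitting on sentence or token boundaries):

```typescript
// Hypothetical sketch: split content into chunks of at most MAX_CHUNK_SIZE
// characters before each chunk is embedded separately.
const MAX_CHUNK_SIZE = 1024;

function chunkText(text: string, maxSize: number = MAX_CHUNK_SIZE): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += maxSize) {
    chunks.push(text.slice(i, i + maxSize));
  }
  return chunks;
}

// 2500 characters -> chunks of 1024, 1024, and 452 characters.
const chunks = chunkText("a".repeat(2500));
console.log(chunks.length); // 3
```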
## Usage

### Starting the Server

```bash
# Development mode
npm run dev

# Production mode
npm start
```
### Using with Claude Desktop

Add the brain server to Claude Desktop by adding this entry under the `mcpServers` key of your `claude_desktop_config.json` file:

```json
{
  "mcpServers": {
    "brain-server": {
      "command": "node",
      "args": ["path/to/mcp-brain-server/dist/index.js"],
      "env": {
        "MONGODB_URI": "mongodb://localhost:27017/brain_db"
      }
    }
  }
}
```
### Using with MCP Inspector

To debug or test the server, you can use the MCP Inspector:

```bash
npx @modelcontextprotocol/inspector node dist/index.js
```
## Building Your Own Docker Image

If you want to build and run your own Docker image:

```bash
# Build the Docker image
docker build -t mcp-brain-server .

# Run the container
docker run -p 3000:3000 -d --name brain-server mcp-brain-server
```
The Docker image includes both the Brain Server and MongoDB for a self-contained deployment.
## MCP Resources

The server exposes the following MCP resources:

- `embedding_config`: Current embedding configuration
- `embedding_models`: Available embedding models and their configurations
- `service_status`: Current status of the embedding service
## MCP Tool Usage

The server exposes the following MCP tools:

- `addKnowledge`: Add new knowledge to the vector database
- `searchSimilar`: Find semantically similar content
- `updateKnowledge`: Update existing knowledge entries
- `deleteKnowledge`: Remove knowledge entries
- `batchAddKnowledge`: Add multiple knowledge entries in a batch
- `getEmbedding`: Generate embeddings for text content
## Development

### Project Structure

```
src/
├── config/          # Configuration settings
├── controllers/     # Route controllers
├── errors/          # Error definitions
├── middleware/      # Express middleware
├── models/          # Data models and types
├── services/        # Business logic
│   ├── embeddings/  # Embedding providers
│   ├── ingestion/   # Knowledge ingestion
│   ├── processing/  # Knowledge processing
│   └── storage/     # Storage services
├── tools/           # MCP tool definitions
├── types/           # TypeScript type definitions
├── utils/           # Utility functions
├── server.ts        # MCP server setup
└── index.ts         # Application entry point
```
### Tool Schema Examples

Here's an example of calling the `addKnowledge` tool:

```json
{
  "content": "The Model Context Protocol (MCP) is a standardized interface for AI models to interact with external systems.",
  "metadata": {
    "brainId": "tech-knowledge",
    "userId": "user123",
    "source": "documentation",
    "type": "definition"
  }
}
```
And the `searchSimilar` tool:

```json
{
  "query": "What is MCP?",
  "options": {
    "limit": 5,
    "minConfidence": 0.7,
    "filters": {
      "metadata.brainId": "tech-knowledge"
    }
  }
}
```
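The `limit` and `minConfidence` options shape the result set: entries scoring below the confidence threshold are dropped, and at most `limit` of the remainder are returned, best first. A hypothetical sketch of that filtering logic (the server's actual implementation may differ):

```typescript
// Illustrative shape for a scored search result.
interface ScoredEntry {
  content: string;
  score: number; // similarity score in [0, 1]
}

// Drop entries below minConfidence, rank the rest by score, keep the top `limit`.
function applySearchOptions(
  results: ScoredEntry[],
  limit: number,
  minConfidence: number,
): ScoredEntry[] {
  return results
    .filter((r) => r.score >= minConfidence)
    .sort((a, b) => b.score - a.score)
    .slice(0, limit);
}

const hits = applySearchOptions(
  [
    { content: "MCP definition", score: 0.91 },
    { content: "unrelated note", score: 0.42 },
  ],
  5,
  0.7,
);
console.log(hits.length); // 1 — only the entry above minConfidence survives
```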
## Troubleshooting

### Docker Issues

- Check container logs: `docker logs brain-server`
- Ensure ports are correctly mapped: `docker ps`
- Verify MongoDB is running in the container: `docker exec brain-server ps aux | grep mongod`
### MongoDB Connection Issues

- Verify MongoDB is running: `ps aux | grep mongod`
- Check MongoDB logs: `sudo cat /var/log/mongodb/mongod.log`
- Ensure your firewall allows connections to MongoDB (default port 27017)
- Verify your connection string in `.env`: `MONGODB_URI=mongodb://localhost:27017/brain_db`
### Missing Vector Index Capabilities
If you encounter errors related to vector index capabilities:
- Ensure you're using MongoDB 6.0+ for optimal vector search support
- For older MongoDB versions, the server will fall back to approximate nearest neighbors search
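When no vector index is available, a nearest-neighbour lookup can always be done by brute force: score every stored vector against the query and keep the best match. A hedged sketch of that idea, using dot product as the score (equivalent to cosine similarity when embeddings are unit-normalized); this is illustrative and not the server's actual fallback code:

```typescript
// Linear-scan nearest neighbour: return the index of the stored vector
// with the highest dot-product score against the query vector.
function nearest(query: number[], stored: number[][]): number {
  let bestIndex = -1;
  let bestScore = -Infinity;
  stored.forEach((vec, i) => {
    const score = vec.reduce((sum, x, j) => sum + x * query[j], 0);
    if (score > bestScore) {
      bestScore = score;
      bestIndex = i;
    }
  });
  return bestIndex;
}

// [0.8, 0.6] scores 0.8 against [1, 0]; [0, 1] scores 0 — index 1 wins.
console.log(nearest([1, 0], [[0, 1], [0.8, 0.6]])); // 1
```

A linear scan is exact but O(n) per query, which is why an index is preferred for large collections.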
## Available Scripts

- `npm run build`: Build the TypeScript project
- `npm start`: Run the built application
- `npm run dev`: Run in development mode with hot reloading
- `npm test`: Run tests
- `npm run lint`: Run the linter
## License
This project is licensed under the ISC License - see the LICENSE file for details.
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.