# MCP Server with Self-hosted LLM and Supabase Integration

A comprehensive Model Context Protocol (MCP) server that integrates with self-hosted LLM models via Ollama and uses a Supabase database for data persistence and retrieval.
## Features

- **MCP Protocol Support**: Full implementation of the Model Context Protocol specification
- **Self-hosted LLM Integration**: Support for Ollama-based LLM models (Llama 2, CodeLlama, etc.)
- **Supabase Database Integration**: Complete CRUD operations with Supabase
- **Docker Support**: Containerized deployment with Docker Compose
- **Comprehensive Testing**: Unit tests with ≥90% coverage, integration tests, and E2E tests
- **TypeScript**: Fully typed implementation for a better development experience
- **Logging**: Structured logging with configurable levels and formats
## Architecture

```
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   MCP Client    │◄──►│   MCP Server    │◄──►│   Supabase DB   │
└─────────────────┘    └─────────────────┘    └─────────────────┘
                                │
                                ▼
                       ┌─────────────────┐
                       │   Ollama LLM    │
                       │  (Self-hosted)  │
                       └─────────────────┘
```
## Quick Start

### Prerequisites

- Docker and Docker Compose
- Node.js 18+ (for local development)
- A Supabase account and project

### 1. Clone and Set Up

```bash
git clone <repository-url>
cd mcp-server-selfhosted
cp env.example .env
```
### 2. Configure Environment

Edit the .env file with your configuration:

```bash
# Supabase Configuration
SUPABASE_URL=your_supabase_url_here
SUPABASE_ANON_KEY=your_supabase_anon_key_here
SUPABASE_SERVICE_ROLE_KEY=your_supabase_service_role_key_here

# Self-hosted LLM Configuration
LLM_BASE_URL=http://localhost:11434
LLM_MODEL=llama2
LLM_TIMEOUT=30000

# MCP Server Configuration
MCP_SERVER_PORT=3000
MCP_SERVER_HOST=localhost

# Logging
LOG_LEVEL=info
LOG_FORMAT=json
```
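If you are wiring this up locally, these variables can be validated once at startup. The sketch below is illustrative rather than the repository's actual loader; the `config.ts` module name is an assumption, and it uses the same Zod validation mentioned under Security Considerations:

```typescript
// config.ts — hypothetical startup validation for the variables above.
import { z } from "zod";

const EnvSchema = z.object({
  SUPABASE_URL: z.string().url(),
  SUPABASE_ANON_KEY: z.string().min(1),
  SUPABASE_SERVICE_ROLE_KEY: z.string().min(1),
  LLM_BASE_URL: z.string().url().default("http://localhost:11434"),
  LLM_MODEL: z.string().default("llama2"),
  LLM_TIMEOUT: z.coerce.number().int().positive().default(30000),
  MCP_SERVER_PORT: z.coerce.number().int().default(3000),
  MCP_SERVER_HOST: z.string().default("localhost"),
  LOG_LEVEL: z.enum(["debug", "info", "warn", "error"]).default("info"),
  LOG_FORMAT: z.enum(["text", "json"]).default("json"),
});

// Throws with a readable error if a required variable is missing or malformed.
export const config = EnvSchema.parse(process.env);
```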
### 3. Start with Docker Compose

```bash
docker-compose up -d
```

This starts:

- the Ollama service (self-hosted LLM)
- the MCP server
- health checks and monitoring

### 4. Verify Installation

```bash
# Check that the services are running
docker-compose ps

# Test the MCP server's health endpoint
curl http://localhost:3000/health

# Test the Ollama connection
curl http://localhost:11434/api/tags
```
### 5. Test Build Locally (Optional)

```bash
# Test the TypeScript compilation
npm run build

# Test the HTTP server
npm run start:http

# Test the health endpoint
curl http://localhost:3000/health
```
## Available Tools

The MCP server provides the following tools:

### 1. query_database

Executes SQL queries on the Supabase database.

Parameters:

- `query` (string, required): SQL query to execute
- `table` (string, optional): Table name for context

Example:

```json
{
  "name": "query_database",
  "arguments": {
    "query": "SELECT * FROM users WHERE active = true",
    "table": "users"
  }
}
```
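For context on what a handler behind this tool can look like: supabase-js does not execute raw SQL strings directly, so a common pattern is to route the query through a Postgres function exposed via RPC. The sketch below assumes a hypothetical `execute_sql(sql text)` function created in your project; it is not taken from this repository:

```typescript
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
);

// Hypothetical query_database handler. It assumes you have created a
// Postgres function `execute_sql(sql text)` in your project, because
// supabase-js cannot run raw SQL strings on its own.
async function queryDatabase(query: string) {
  const { data, error } = await supabase.rpc("execute_sql", { sql: query });
  if (error) throw new Error(`query_database failed: ${error.message}`);
  return data;
}
```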
### 2. generate_text

Generates text using the self-hosted LLM.

Parameters:

- `prompt` (string, required): Text prompt for the LLM
- `maxTokens` (number, optional): Maximum number of tokens to generate
- `temperature` (number, optional): Temperature for generation (0.0-1.0)

Example:

```json
{
  "name": "generate_text",
  "arguments": {
    "prompt": "Explain quantum computing in simple terms",
    "maxTokens": 500,
    "temperature": 0.7
  }
}
```
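Internally, a tool like this maps onto a single call to Ollama's /api/generate endpoint. Here is a minimal sketch of that mapping (illustrative, not the repository's actual handler); `num_predict` is Ollama's documented option for capping the number of generated tokens:

```typescript
// Hypothetical generate_text handler: forwards the prompt to Ollama.
async function generateText(
  prompt: string,
  maxTokens = 256,
  temperature = 0.7
): Promise<string> {
  const base = process.env.LLM_BASE_URL ?? "http://localhost:11434";
  const res = await fetch(`${base}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: process.env.LLM_MODEL ?? "llama2",
      prompt,
      stream: false, // return one JSON object instead of a token stream
      options: { temperature, num_predict: maxTokens },
    }),
    signal: AbortSignal.timeout(Number(process.env.LLM_TIMEOUT ?? 30000)),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const body = (await res.json()) as { response: string };
  return body.response;
}
```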
### 3. store_data

Stores data in the Supabase database.

Parameters:

- `table` (string, required): Table name to store the data in
- `data` (object, required): Data to store

Example:

```json
{
  "name": "store_data",
  "arguments": {
    "table": "documents",
    "data": {
      "title": "My Document",
      "content": "Document content here",
      "author": "John Doe"
    }
  }
}
```
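The corresponding handler reduces to a single supabase-js insert. A minimal sketch, reusing the `supabase` client from the query_database example above:

```typescript
// Hypothetical store_data handler: inserts one row and returns it.
async function storeData(table: string, data: Record<string, unknown>) {
  const { data: rows, error } = await supabase
    .from(table)
    .insert(data)
    .select(); // .select() makes the insert return the stored row(s)
  if (error) throw new Error(`store_data failed: ${error.message}`);
  return rows;
}
```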
### 4. retrieve_data

Retrieves data from the Supabase database.

Parameters:

- `table` (string, required): Table name to retrieve data from
- `filters` (object, optional): Filters to apply
- `limit` (number, optional): Maximum number of records to retrieve

Example:

```json
{
  "name": "retrieve_data",
  "arguments": {
    "table": "documents",
    "filters": {
      "author": "John Doe"
    },
    "limit": 10
  }
}
```
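If `filters` is interpreted as simple equality matches (an assumption, but the natural reading), a handler can map each key/value pair onto a supabase-js `.eq()` call:

```typescript
// Hypothetical retrieve_data handler: equality filters plus a row limit.
async function retrieveData(
  table: string,
  filters: Record<string, unknown> = {},
  limit = 100
) {
  let query = supabase.from(table).select("*");
  // Each filter key/value pair becomes a WHERE column = value condition.
  for (const [column, value] of Object.entries(filters)) {
    query = query.eq(column, value);
  }
  const { data, error } = await query.limit(limit);
  if (error) throw new Error(`retrieve_data failed: ${error.message}`);
  return data;
}
```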
## Development

### Local Development Setup

1. Install dependencies:

   ```bash
   npm install
   ```

2. Start Ollama (if not using Docker):

   ```bash
   # Install Ollama
   curl -fsSL https://ollama.ai/install.sh | sh

   # Pull a model
   ollama pull llama2

   # Start Ollama
   ollama serve
   ```

3. Start Supabase (if using a local instance):

   ```bash
   # Install the Supabase CLI
   npm install -g supabase

   # Start local Supabase
   supabase start
   ```

4. Run the development server:

   ```bash
   npm run dev
   ```
### Testing

The project includes comprehensive testing:

```bash
# Run unit tests
npm test

# Run tests with coverage
npm run test:coverage

# Run E2E tests
npm run test:e2e

# Run all tests
npm run test && npm run test:e2e
```
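As a flavor of what such a unit test can look like, here is a sketch in Jest style with the Ollama HTTP call mocked out. It reuses the hypothetical `generateText` helper from the generate_text section (the `./llm` module path is made up), and the repository's real tests may be organized differently:

```typescript
import { describe, expect, it, jest } from "@jest/globals";
import { generateText } from "./llm"; // hypothetical module from the sketch above

describe("generate_text", () => {
  it("returns the LLM response body", async () => {
    // Stub the global fetch so no live Ollama instance is needed.
    global.fetch = jest.fn(async () =>
      new Response(JSON.stringify({ response: "mocked completion" }))
    ) as unknown as typeof fetch;

    await expect(generateText("say hi", 10, 0.1)).resolves.toBe(
      "mocked completion"
    );
    expect(global.fetch).toHaveBeenCalledTimes(1);
  });
});
```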
### Code Quality

```bash
# Lint the code
npm run lint

# Fix linting issues
npm run lint:fix
```
## Docker Configuration

### Dockerfile

The Dockerfile creates an optimized production image:

- Node.js 18 Alpine base
- Non-root user for security
- Health checks
- Multi-stage build for a smaller image size

### Docker Compose

The docker-compose.yml orchestrates:

- the Ollama service for the LLM
- the MCP server
- health checks and service dependencies
- volume persistence for Ollama models
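A compose file matching this description might look roughly like the sketch below. Service names, ports, and health-check details are assumptions based on this README, not a copy of the repository's docker-compose.yml:

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama-models:/root/.ollama   # persist pulled models across restarts

  mcp-server:
    build: .
    ports:
      - "3000:3000"
    env_file: .env
    depends_on:
      - ollama
    healthcheck:
      # busybox wget is available in Alpine-based images
      test: ["CMD", "wget", "-qO-", "http://localhost:3000/health"]
      interval: 30s
      timeout: 5s
      retries: 3

volumes:
  ollama-models:
```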
## Security Considerations

- **SQL Injection Protection**: Basic sanitization of SQL queries
- **Environment Variables**: Sensitive data is stored in environment variables
- **Non-root Container**: Docker containers run as a non-root user
- **Input Validation**: Zod schemas for input validation (sketched below)
- **Error Handling**: Comprehensive error handling without information leakage
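To make the Zod point concrete, here is an illustrative input schema for the query_database tool; the actual schemas in the repository may differ:

```typescript
import { z } from "zod";

// Hypothetical input schema for query_database. safeParse never throws,
// so a malformed call can be turned into a clean MCP error response.
const QueryDatabaseInput = z.object({
  query: z.string().min(1, "query must not be empty"),
  table: z.string().optional(),
});

export function parseQueryArgs(args: unknown) {
  const result = QueryDatabaseInput.safeParse(args);
  if (!result.success) {
    // Report which field failed without echoing the raw input back.
    throw new Error(`Invalid arguments: ${result.error.issues[0].message}`);
  }
  return result.data;
}
```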
## Monitoring and Logging

### Log Levels

- `DEBUG`: Detailed debugging information
- `INFO`: General informational messages
- `WARN`: Warning messages
- `ERROR`: Error messages

### Log Formats

- `text`: Human-readable format
- `json`: Structured JSON format for log aggregation

### Health Checks

- HTTP endpoint: `GET /health`
- Docker health checks
- Service dependency checks
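The health endpoint itself can be very small. The sketch below uses Node's built-in `http` module and a made-up `checkOllama` helper, so treat it as an assumption about shape rather than the repository's implementation:

```typescript
import { createServer } from "node:http";

// Hypothetical readiness probe: report ok only if Ollama answers.
async function checkOllama(): Promise<boolean> {
  try {
    const base = process.env.LLM_BASE_URL ?? "http://localhost:11434";
    const res = await fetch(`${base}/api/tags`);
    return res.ok;
  } catch {
    return false;
  }
}

createServer(async (req, res) => {
  if (req.url === "/health") {
    const up = await checkOllama();
    res.writeHead(up ? 200 : 503, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ status: up ? "ok" : "degraded" }));
    return;
  }
  res.writeHead(404);
  res.end();
}).listen(Number(process.env.MCP_SERVER_PORT ?? 3000));
```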
## Troubleshooting

### Common Issues

- **Ollama Connection Failed**

  ```bash
  # Check if Ollama is running
  curl http://localhost:11434/api/tags

  # Restart the Ollama service
  docker-compose restart ollama
  ```

- **Supabase Connection Failed**

  ```bash
  # Verify the environment variables
  echo $SUPABASE_URL
  echo $SUPABASE_ANON_KEY

  # Test the connection
  curl -H "Authorization: Bearer $SUPABASE_ANON_KEY" $SUPABASE_URL/rest/v1/
  ```

- **MCP Server Not Starting**

  ```bash
  # Check the logs
  docker-compose logs mcp-server

  # Check health
  curl http://localhost:3000/health
  ```

- **Docker Build Fails with "tsc: not found"**

  This is fixed in the current Dockerfile. The issue was that NODE_ENV=production prevented dev dependencies from being installed; the solution is to set NODE_ENV=development during the build phase. If you still encounter issues, try:

  ```bash
  docker-compose build --no-cache
  ```

- **TypeScript Compilation Errors**

  ```bash
  # Test the build locally first
  npm run build

  # Check for missing dependencies
  npm install

  # Clear node_modules and reinstall
  rm -rf node_modules package-lock.json
  npm install
  ```
## Performance Optimization

- **LLM Performance**
  - Use GPU-enabled Ollama for better performance
  - Adjust model parameters (temperature, max_tokens)
  - Consider model size vs. quality trade-offs
- **Database Performance**
  - Use connection pooling
  - Optimize SQL queries
  - Consider indexing strategies
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Ensure all tests pass
6. Submit a pull request

## License

MIT License - see the LICENSE file for details.

## Support

For issues and questions:

- Create an issue in the repository
- Check the troubleshooting section
- Review the test cases for usage examples