# MCP Server with LLM Integration

A Model Context Protocol (MCP) server implementation with LLM integration and chat memory capabilities. It enables chat with multiple LLM providers (OpenAI and Anthropic) while maintaining persistent conversation memory, and provides an extensible tool framework for operations such as echoing and conversation storage/retrieval.
## Features

- **MCP Server**: Full Model Context Protocol server implementation
- **LLM Integration**: Support for OpenAI and Anthropic models
- **Chat Memory**: Persistent conversation storage and retrieval
- **Tool System**: Extensible tool framework for various operations
## Installation

1. Clone this repository:

   ```bash
   git clone <repository-url>
   cd MCP
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

   Or, for a development environment:

   ```bash
   pip install -e ".[dev]"
   ```
## Configuration

Set your API keys as environment variables:

```bash
export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"
```

Or create a `.env` file:

```
OPENAI_API_KEY=your-openai-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key
```
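At startup the server can check which providers are usable from these variables. A minimal sketch, assuming only the variable names above (the `resolve_api_key` helper is hypothetical, not part of the codebase):

```python
import os


def resolve_api_key(provider: str):
    """Look up an API key from the environment.

    Hypothetical helper: variable names follow the convention above
    (OPENAI_API_KEY, ANTHROPIC_API_KEY). A missing key returns None so
    the server can disable that provider instead of crashing.
    """
    return os.environ.get(f"{provider.upper()}_API_KEY")


# Example: only providers with a configured key are enabled.
enabled = [p for p in ("openai", "anthropic") if resolve_api_key(p)]
```

If you use a `.env` file, load it (e.g. with `python-dotenv`) before this check runs so the variables are visible in `os.environ`.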
## Usage

### Running the MCP Server

Start the server from the command line:

```bash
python -m mcp
```

Or run the module directly:

```bash
python mcp.py
```
## Available Tools

The server provides the following tools.

### Echo Tool

Simple echo functionality for testing:

```json
{
  "name": "echo",
  "arguments": {
    "text": "Hello, world!"
  }
}
```
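On the server side, a handler for this request could look like the following sketch (illustrative only; the actual handler signature and response shape in `mcp.py` may differ):

```python
def handle_echo(arguments: dict) -> dict:
    """Echo the 'text' argument back in an MCP-style text content block.

    Hypothetical handler for illustration; the real registration and
    return type depend on the MCP SDK version in use.
    """
    text = arguments.get("text", "")
    return {"content": [{"type": "text", "text": text}]}


# Dispatching the example request from above:
request = {"name": "echo", "arguments": {"text": "Hello, world!"}}
response = handle_echo(request["arguments"])
```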
### Chat Memory Tools

Store a memory:

```json
{
  "name": "store_memory",
  "arguments": {
    "conversation_id": "conv_123",
    "content": "User preferences: dark mode enabled",
    "metadata": {"type": "preference"}
  }
}
```

Retrieve the memories for a conversation:

```json
{
  "name": "get_memory",
  "arguments": {
    "conversation_id": "conv_123"
  }
}
```
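The store/retrieve round-trip behaves like the following in-memory sketch (a stand-in for the SQLite-backed store in `chatmemorysystem.py`, whose actual API may differ):

```python
class MemoryStore:
    """Illustrative in-memory stand-in for the SQLite-backed memory store."""

    def __init__(self):
        # conversation_id -> list of memory records, in insertion order
        self._memories = {}

    def store_memory(self, conversation_id, content, metadata=None):
        """Append one memory record to a conversation."""
        self._memories.setdefault(conversation_id, []).append(
            {"content": content, "metadata": metadata or {}}
        )

    def get_memory(self, conversation_id):
        """Return all memories for a conversation (empty list if none)."""
        return self._memories.get(conversation_id, [])


store = MemoryStore()
store.store_memory("conv_123", "User preferences: dark mode enabled",
                   {"type": "preference"})
memories = store.get_memory("conv_123")
```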
### LLM Chat Tool

Send a message to a configured model:

```json
{
  "name": "llm_chat",
  "arguments": {
    "message": "What is the capital of France?",
    "model": "gpt-3.5-turbo"
  }
}
```
## Supported Models

OpenAI:

- `gpt-3.5-turbo`
- `gpt-4`
- `gpt-4-turbo`
- `gpt-4o`

Anthropic:

- `claude-3-haiku-20240307`
- `claude-3-sonnet-20240229`
- `claude-3-opus-20240229`
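Since every listed model name starts with a provider-specific prefix, requests can be routed by prefix. A minimal sketch, assuming this naming convention is how `llmintegrationsystem.py` chooses a provider (the function name is hypothetical):

```python
def provider_for(model: str) -> str:
    """Route a model name to its provider by name prefix.

    Assumes the convention visible in the supported-model list:
    'gpt-*' -> OpenAI, 'claude-*' -> Anthropic.
    """
    if model.startswith("gpt-"):
        return "openai"
    if model.startswith("claude-"):
        return "anthropic"
    raise ValueError(f"Unsupported model: {model}")
```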
## Development

Run the test suite:

```bash
pytest
```

Format the code:

```bash
black .
isort .
```

Type-check:

```bash
mypy .
```
## Architecture

### Components

- `mcp.py`: Main MCP server implementation and tool registration
- `llmintegrationsystem.py`: LLM provider integration and chat completions
- `chatmemorysystem.py`: Persistent conversation storage with SQLite

### Database Schema

The chat memory system uses SQLite with two main tables:

- `memories`: Individual conversation messages and metadata
- `conversation_summaries`: Conversation overviews and statistics
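The two tables could be created roughly as follows. This DDL is a sketch inferred from the descriptions above; the actual column set in `chatmemorysystem.py` may differ:

```python
import sqlite3

# Hypothetical schema matching the two tables described above.
SCHEMA = """
CREATE TABLE IF NOT EXISTS memories (
    id              INTEGER PRIMARY KEY AUTOINCREMENT,
    conversation_id TEXT NOT NULL,
    content         TEXT NOT NULL,
    metadata        TEXT,                          -- JSON-encoded
    created_at      TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE IF NOT EXISTS conversation_summaries (
    conversation_id TEXT PRIMARY KEY,
    summary         TEXT,
    message_count   INTEGER DEFAULT 0
);
"""

# In-memory database for demonstration; the server writes chat_memory.db.
conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
```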
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Submit a pull request

## License

MIT License - see the LICENSE file for details.
## Troubleshooting

### Common Issues

- **API key errors**: Ensure your API keys are set in the environment variables described above.
- **Database permissions**: The server creates a `chat_memory.db` file in the current directory; make sure that directory is writable.
- **Port conflicts**: The MCP server communicates over stdio by default, so no port configuration is needed.

### Logging

Enable debug logging:

```bash
PYTHONPATH=. python -c "import logging; logging.basicConfig(level=logging.DEBUG); import mcp; mcp.main()"
```