MCP Server with LLM Integration

Enables chat with multiple LLM providers (OpenAI and Anthropic) while maintaining persistent conversation memory. Provides an extensible tool framework for operations such as echo functionality and conversation storage/retrieval.


MCP Server

A Model Context Protocol (MCP) server implementation with LLM integration and chat memory capabilities.

Features

  • MCP Server: Full Model Context Protocol server implementation
  • LLM Integration: Support for OpenAI and Anthropic models
  • Chat Memory: Persistent conversation storage and retrieval
  • Tool System: Extensible tool framework for various operations

Installation

  1. Clone this repository:
git clone <repository-url>
cd MCP
  2. Install dependencies:
pip install -r requirements.txt

Or using the development environment:

pip install -e ".[dev]"

Configuration

Set up your API keys as environment variables:

export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"

Or create a .env file:

OPENAI_API_KEY=your-openai-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key
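At startup, the server can check that the keys it needs are actually present. A minimal sketch of that check (the function name `load_api_keys` is illustrative, not the repository's actual API; a `.env` file would be loaded beforehand, e.g. with `python-dotenv`):

```python
import os

def load_api_keys():
    """Collect provider API keys from the environment."""
    keys = {
        "openai": os.environ.get("OPENAI_API_KEY"),
        "anthropic": os.environ.get("ANTHROPIC_API_KEY"),
    }
    # Providers without a key are reported so the server can disable them.
    missing = [name for name, value in keys.items() if not value]
    return keys, missing

keys, missing = load_api_keys()
for name in missing:
    print(f"Warning: no API key configured for {name}")
```

Tools backed by a provider with no configured key would then return an error rather than crashing the server.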

Usage

Running the MCP Server

Start the server using the command line:

python -m mcp

Or run directly:

python mcp.py

Available Tools

The server provides the following tools:

Echo Tool

Simple echo functionality for testing.

{
  "name": "echo",
  "arguments": {
    "text": "Hello, world!"
  }
}

Chat Memory Tools

Store Memory

{
  "name": "store_memory",
  "arguments": {
    "conversation_id": "conv_123",
    "content": "User preferences: dark mode enabled",
    "metadata": {"type": "preference"}
  }
}

Get Memory

{
  "name": "get_memory",
  "arguments": {
    "conversation_id": "conv_123"
  }
}
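The store/get pair behaves like a per-conversation append-only log. A minimal in-memory sketch of those semantics (the real server persists to SQLite, and the class below is hypothetical, not taken from the repository):

```python
class MemoryStore:
    """In-memory stand-in for the chat memory system's store/get tools."""

    def __init__(self):
        self._memories = {}

    def store_memory(self, conversation_id, content, metadata=None):
        # Append an entry to the conversation's memory log.
        self._memories.setdefault(conversation_id, []).append(
            {"content": content, "metadata": metadata or {}}
        )

    def get_memory(self, conversation_id):
        # Unknown conversations return an empty list rather than raising.
        return self._memories.get(conversation_id, [])

store = MemoryStore()
store.store_memory(
    "conv_123",
    "User preferences: dark mode enabled",
    {"type": "preference"},
)
```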

LLM Chat Tool

{
  "name": "llm_chat",
  "arguments": {
    "message": "What is the capital of France?",
    "model": "gpt-3.5-turbo"
  }
}

Supported Models

OpenAI Models:

  • gpt-3.5-turbo
  • gpt-4
  • gpt-4-turbo
  • gpt-4o

Anthropic Models:

  • claude-3-haiku-20240307
  • claude-3-sonnet-20240229
  • claude-3-opus-20240229
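Since the `model` argument to `llm_chat` selects the provider implicitly, the server has to map each model name to OpenAI or Anthropic. A sketch of that routing under the assumption that it is a simple name lookup (the function `provider_for` is illustrative, not the repository's actual API):

```python
OPENAI_MODELS = {"gpt-3.5-turbo", "gpt-4", "gpt-4-turbo", "gpt-4o"}
ANTHROPIC_MODELS = {
    "claude-3-haiku-20240307",
    "claude-3-sonnet-20240229",
    "claude-3-opus-20240229",
}

def provider_for(model: str) -> str:
    """Return which provider backend should handle the given model name."""
    if model in OPENAI_MODELS:
        return "openai"
    if model in ANTHROPIC_MODELS:
        return "anthropic"
    raise ValueError(f"Unsupported model: {model}")
```

Requests naming an unlisted model fail fast with a clear error instead of being sent to the wrong backend.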

Development

Running Tests

pytest

Code Formatting

black .
isort .

Type Checking

mypy .

Architecture

Components

  • mcp.py: Main MCP server implementation and tool registration
  • llmintegrationsystem.py: LLM provider integration and chat completions
  • chatmemorysystem.py: Persistent conversation storage with SQLite

Database Schema

The chat memory system uses SQLite with two main tables:

  • memories: Individual conversation messages and metadata
  • conversation_summaries: Conversation overviews and statistics
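A plausible DDL for those two tables, sketched with Python's built-in `sqlite3` module; the column names here are assumptions based on the tool arguments above, and the actual layout of `chat_memory.db` may differ:

```python
import sqlite3

# Hypothetical schema matching the two tables described above.
SCHEMA = """
CREATE TABLE IF NOT EXISTS memories (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    conversation_id TEXT NOT NULL,
    content TEXT NOT NULL,
    metadata TEXT,                            -- JSON-encoded
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE IF NOT EXISTS conversation_summaries (
    conversation_id TEXT PRIMARY KEY,
    summary TEXT,
    message_count INTEGER DEFAULT 0,
    updated_at TEXT DEFAULT CURRENT_TIMESTAMP
);
"""

conn = sqlite3.connect(":memory:")  # in-memory DB for illustration
conn.executescript(SCHEMA)
tables = [
    row[0]
    for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
    )
]
```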

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Submit a pull request

License

MIT License - see LICENSE file for details.

Troubleshooting

Common Issues

API Key Errors: Ensure your API keys are properly set as environment variables.

Database Permissions: The server creates a chat_memory.db file in the current directory. Ensure it has write permissions.

Port Conflicts: The MCP server uses stdio communication by default, so no port configuration is needed.

Logging

Enable debug logging:

PYTHONPATH=. python -c "import logging; logging.basicConfig(level=logging.DEBUG); import mcp; mcp.main()"
