# MCP Conversation Server
A Model Context Protocol (MCP) server implementation for managing conversations with OpenRouter's language models. This server provides a standardized interface for applications to interact with various language models through a unified conversation management system.

## Features

- **MCP Protocol Support**
  - Full MCP protocol compliance
  - Resource management and discovery
  - Tool-based interaction model
  - Streaming response support
  - Error handling and recovery
- **OpenRouter Integration**
  - Support for all OpenRouter models
  - Real-time streaming responses
  - Automatic token counting
  - Model context window management
  - Available models include:
    - Claude 3 Opus
    - Claude 3 Sonnet
    - Llama 2 70B
    - And many more from OpenRouter's catalog
- **Conversation Management**
  - Create and manage multiple conversations
  - Support for system messages
  - Message history tracking
  - Token usage monitoring
  - Conversation filtering and search
- **Streaming Support**
  - Real-time message streaming
  - Chunked response handling
  - Token counting
- **File System Persistence**
  - Conversation state persistence
  - Configurable storage location
  - Automatic state management

## Installation

```bash
npm install mcp-conversation-server
```

## Configuration

All configuration for the MCP Conversation Server is now provided via YAML. Update the `config/models.yaml` file with your settings. For example:
```yaml
# MCP Server Configuration
openRouter:
  apiKey: "YOUR_OPENROUTER_API_KEY"   # Replace with your actual OpenRouter API key.

persistence:
  path: "./conversations"             # Directory for storing conversation data.

models:
  # Define your models here
  'provider/model-name':
    id: 'provider/model-name'
    contextWindow: 123456
    streaming: true
    temperature: 0.7
    description: 'Model description'

# Default model to use if none specified
defaultModel: 'provider/model-name'
```

### Server Configuration

The MCP Conversation Server now loads all of its configuration from the YAML file. In your application, you can load the configuration as follows:

```typescript
const config = await loadModelsConfig(); // Loads openRouter, persistence, models, and defaultModel settings from 'config/models.yaml'
```

Note: Environment variables are no longer required, as all configuration is provided via the YAML file.
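
The loaded configuration object mirrors the YAML structure above. As a rough orientation, it can be thought of as having a shape along the following lines; these interface names are illustrative assumptions, not exports of `mcp-conversation-server`:

```typescript
// Illustrative sketch only: field names follow config/models.yaml above,
// but the interface names themselves are assumptions.
interface ModelConfig {
    id: string;             // e.g. 'provider/model-name'
    contextWindow: number;  // maximum context size in tokens
    streaming: boolean;     // whether the model supports streamed responses
    temperature: number;    // default sampling temperature
    description: string;
}

interface ServerConfig {
    openRouter: { apiKey: string };
    persistence: { path: string };        // directory for conversation state
    models: Record<string, ModelConfig>;  // keyed by OpenRouter model ID
    defaultModel: string;                 // used when a request omits a model
}
```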

## Usage

### Basic Server Setup

```typescript
import { ConversationServer } from 'mcp-conversation-server';

const server = new ConversationServer(config);
server.run().catch(console.error);
```

### Available Tools

The server exposes several MCP tools (a client-side usage sketch follows this list):

- **create-conversation**

  ```typescript
  {
      provider: 'openrouter',   // Provider is always 'openrouter'
      model: string,            // OpenRouter model ID (e.g., 'anthropic/claude-3-opus-20240229')
      title?: string            // Optional conversation title
  }
  ```

- **send-message**

  ```typescript
  {
      conversationId: string;   // Conversation ID
      content: string;          // Message content
      stream?: boolean;         // Enable streaming responses
  }
  ```

- **list-conversations**

  ```typescript
  {
      filter?: {
          model?: string;       // Filter by model
          startDate?: string;   // Filter by start date
          endDate?: string;     // Filter by end date
      }
  }
  ```
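
For orientation, here is a minimal sketch of calling these tools from the official MCP TypeScript SDK client. The server launch command (`node build/index.js`) and the way the conversation ID is encoded in the tool result are assumptions; adjust them to your build output and to the server's actual responses.

```typescript
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Spawn the conversation server over stdio.
// The entry point path is an assumption; point it at your built server.
const transport = new StdioClientTransport({
    command: 'node',
    args: ['build/index.js'],
});

const client = new Client(
    { name: 'example-client', version: '1.0.0' },
    { capabilities: {} }
);
await client.connect(transport);

// Create a conversation, then send a message to it.
const created = await client.callTool({
    name: 'create-conversation',
    arguments: {
        provider: 'openrouter',
        model: 'anthropic/claude-3-opus-20240229',
        title: 'Demo conversation',
    },
});

// The conversation ID comes back in the tool result; how it is encoded
// (plain text vs. JSON) depends on the server, so inspect and parse accordingly.
console.log(created.content);

await client.callTool({
    name: 'send-message',
    arguments: {
        conversationId: '<id-from-create-conversation>',
        content: 'Hello! Summarize the MCP protocol in one sentence.',
    },
});
```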

### Resources

The server provides access to several resources (a read sketch follows this list):

- **conversation://{id}**
  - Access specific conversation details
  - View message history
  - Check conversation metadata
- **conversation://list**
  - List all active conversations
  - Filter conversations by criteria
  - Sort by recent activity
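
Resources are read through the standard MCP resource API. The sketch below reuses the connected `client` from the tools example above; the exact payload format of each resource is determined by the server, so treat the logged contents as opaque until you have inspected them.

```typescript
// List all active conversations.
const list = await client.readResource({ uri: 'conversation://list' });
console.log(list.contents);

// Read a single conversation by ID (replace the placeholder with a real ID).
const conversation = await client.readResource({
    uri: 'conversation://<conversation-id>',
});
console.log(conversation.contents);
```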

## Development

### Building

```bash
npm run build
```

### Running Tests

```bash
npm test
```

### Debugging

The server provides several debugging features:

- **Error Logging**
  - All errors are logged with stack traces
  - Token usage tracking
  - Rate limit monitoring
- **MCP Inspector**

  ```bash
  npm run inspector
  ```

  Use the MCP Inspector to:

  - Test tool execution
  - View resource contents
  - Monitor message flow
  - Validate protocol compliance
- **Provider Validation** (a startup sketch follows this list)

  ```typescript
  await server.providerManager.validateProviders();
  ```

  Validates:

  - API key validity
  - Model availability
  - Rate limit status
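
One way to use provider validation is as a fail-fast check at startup, before the server begins accepting requests. The wiring below is a sketch of that idea, not a pattern mandated by the package; it assumes the `server` instance from the Basic Server Setup section.

```typescript
// Hypothetical startup check: exit early if the OpenRouter key or model IDs are misconfigured.
try {
    await server.providerManager.validateProviders();
} catch (error) {
    console.error('Provider validation failed; check your OpenRouter API key and model IDs.', error);
    process.exit(1);
}

server.run().catch(console.error);
```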

## Troubleshooting

Common issues and solutions:

- **OpenRouter Connection Issues**
  - Verify your API key is valid
  - Check rate limits on OpenRouter's dashboard
  - Ensure the model ID is correct
  - Monitor credit usage
- **Message Streaming Errors**
  - Verify model streaming support
  - Check connection stability
  - Monitor token limits
  - Handle timeout settings
- **File System Errors**
  - Check directory permissions
  - Verify path configuration
  - Monitor disk space
  - Handle concurrent access

## Contributing

1. Fork the repository
2. Create a feature branch
3. Commit your changes
4. Push to the branch
5. Create a Pull Request

## License

ISC License