MCP Server
Project Overview
Built on FastAPI and MCP (Model Context Protocol), this project enables standardized context interaction between AI models and development environments. It improves the scalability and maintainability of AI applications by simplifying model deployment, providing efficient API endpoints, and ensuring consistent model input and output, which makes it easier for developers to integrate and manage AI tasks.
MCP (Model Context Protocol) is a unified protocol for context interaction between AI models and development environments. This project provides a Python-based MCP server implementation that supports basic MCP protocol features, including initialization, sampling, and session management.
Features
- JSON-RPC 2.0: Request-response communication based on standard JSON-RPC 2.0 protocol
- SSE Connection: Support for Server-Sent Events connections for real-time notifications
- Modular Design: Modular architecture for easy extension and customization
- Asynchronous Processing: High-performance service using FastAPI and asynchronous IO
- Complete Client: Includes a full test client implementation
Project Structure
mcp_server/
├── mcp_server.py # MCP server main program
├── mcp_client.py # MCP client test program
├── routers/
│ ├── __init__.py # Router package initialization
│ └── base_router.py # Base router implementation
├── requirements.txt # Project dependencies
└── README.md # Project documentation
Installation
- Clone the repository:
git clone https://github.com/freedanfan/mcp_server.git
cd mcp_server
- Install dependencies:
pip install -r requirements.txt
Usage
Starting the Server
python mcp_server.py
By default, the server will start on 127.0.0.1:12000. You can customize the host and port using environment variables:
export MCP_SERVER_HOST=0.0.0.0
export MCP_SERVER_PORT=8000
python mcp_server.py
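For reference, the snippet below is a minimal sketch of what such an entry point might look like, assuming the server is a FastAPI app served with uvicorn; the real mcp_server.py may wire this up differently.
# Sketch only: read the host/port environment variables and launch the app.
import os

import uvicorn
from fastapi import FastAPI

app = FastAPI(title="MCP Server")  # illustrative app; the real server registers MCP routes

if __name__ == "__main__":
    host = os.environ.get("MCP_SERVER_HOST", "127.0.0.1")
    port = int(os.environ.get("MCP_SERVER_PORT", "12000"))
    uvicorn.run(app, host=host, port=port)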
Running the Client
Run the client in another terminal:
python mcp_client.py
If the server is not running at the default address, you can set an environment variable:
export MCP_SERVER_URL="http://your-server-address:port"
python mcp_client.py
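A sketch of how the client side might resolve that setting (the variable name matches the command above, and the default mirrors the server's default address):
# Sketch only: resolve the server URL from the environment, falling back to the default.
import os

base_url = os.environ.get("MCP_SERVER_URL", "http://127.0.0.1:12000")
sse_url = f"{base_url}/sse"   # SSE endpoint (see "API Endpoints" below)
api_url = f"{base_url}/api"   # JSON-RPC endpoint
print(f"Connecting to {base_url}")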
API Endpoints
The server provides the following API endpoints:
- Root Path (/): Provides server information
- API Endpoint (/api): Handles JSON-RPC requests
- SSE Endpoint (/sse): Handles SSE connections
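For illustration, a JSON-RPC 2.0 request can be sent to the /api endpoint with a plain HTTP POST. The helper below is a sketch using the requests library, not the project's actual client code:
# Sketch only: a generic JSON-RPC 2.0 call against the /api endpoint.
import uuid

import requests

def call_mcp(method: str, params: dict, base_url: str = "http://127.0.0.1:12000") -> dict:
    """POST a JSON-RPC 2.0 request to /api and return the parsed response."""
    payload = {
        "jsonrpc": "2.0",
        "id": str(uuid.uuid4()),
        "method": method,
        "params": params,
    }
    response = requests.post(f"{base_url}/api", json=payload, timeout=30)
    response.raise_for_status()
    return response.json()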
MCP Protocol Implementation
Initialization Flow
- Client connects to the server via SSE
- Server returns the API endpoint URI
- Client sends an initialization request with protocol version and capabilities
- Server responds to the initialization request, returning server capabilities
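The initialization exchange described above might look roughly like this; the field names follow common MCP conventions and are illustrative, since the exact capability payload depends on this server's implementation.
# Illustrative payloads only; concrete fields depend on the server implementation.
initialize_request = {
    "jsonrpc": "2.0",
    "id": "init-1",
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",   # assumed version string
        "capabilities": {"sampling": {}},  # client capabilities (illustrative)
        "clientInfo": {"name": "mcp_client", "version": "0.1.0"},
    },
}

# The server's reply is expected to carry its own capabilities and server info,
# e.g. keys such as "protocolVersion", "capabilities", "serverInfo".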
Sampling Request
Clients can send sampling requests with prompts:
{
  "jsonrpc": "2.0",
  "id": "request-id",
  "method": "sample",
  "params": {
    "prompt": "Hello, please introduce yourself."
  }
}
The server will return sampling results:
{
  "jsonrpc": "2.0",
  "id": "request-id",
  "result": {
    "content": "This is a response to the prompt...",
    "usage": {
      "prompt_tokens": 10,
      "completion_tokens": 50,
      "total_tokens": 60
    }
  }
}
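A short sketch of how a client might read that result, assuming the reply has already been parsed into a dict shaped like the JSON above:
# Sketch only: pull the generated text and token accounting out of a parsed response.
result = response["result"]
print(result["content"])

usage = result["usage"]
print(f"{usage['prompt_tokens']} prompt + {usage['completion_tokens']} completion "
      f"= {usage['total_tokens']} tokens")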
Closing a Session
Clients can send a shutdown request:
{
  "jsonrpc": "2.0",
  "id": "request-id",
  "method": "shutdown",
  "params": {}
}
The server confirms the request and then shuts down gracefully:
{
  "jsonrpc": "2.0",
  "id": "request-id",
  "result": {
    "status": "shutting_down"
  }
}
Development Extensions
Adding New Methods
To add new MCP methods, add a handler function to the MCPServer class and register it in the _register_methods method:
def handle_new_method(self, params: dict) -> dict:
    """Handle new method"""
    logger.info(f"Received new method request: {params}")
    # Processing logic
    return {"result": "success"}

def _register_methods(self):
    # Register existing methods
    self.router.register_method("initialize", self.handle_initialize)
    self.router.register_method("sample", self.handle_sample)
    self.router.register_method("shutdown", self.handle_shutdown)
    # Register new method
    self.router.register_method("new_method", self.handle_new_method)
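The registration pattern above maps JSON-RPC method names to handler callables, so a request with "method": "new_method" will be routed to handle_new_method. Conceptually, the base router can be thought of as a small dispatch table like the sketch below; this is an illustration, not necessarily the project's actual base_router.py.
# Conceptual sketch of a JSON-RPC method dispatcher; the real base_router.py may differ.
from typing import Any, Callable, Dict

class BaseRouter:
    def __init__(self) -> None:
        self._methods: Dict[str, Callable[[dict], Any]] = {}

    def register_method(self, name: str, handler: Callable[[dict], Any]) -> None:
        """Associate a JSON-RPC method name with its handler."""
        self._methods[name] = handler

    def dispatch(self, name: str, params: dict) -> Any:
        """Look up and invoke the handler for a request; unknown methods are an error."""
        if name not in self._methods:
            raise ValueError(f"Method not found: {name}")
        return self._methods[name](params)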
Integrating AI Models
To integrate actual AI models, modify the handle_sample method:
async def handle_sample(self, params: dict) -> dict:
    """Handle sampling request"""
    logger.info(f"Received sampling request: {params}")

    # Get prompt
    prompt = params.get("prompt", "")

    # Call AI model API
    # For example: using OpenAI API
    response = await openai.ChatCompletion.acreate(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}]
    )

    content = response.choices[0].message.content
    usage = response.usage

    return {
        "content": content,
        "usage": {
            "prompt_tokens": usage.prompt_tokens,
            "completion_tokens": usage.completion_tokens,
            "total_tokens": usage.total_tokens
        }
    }
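Note that openai.ChatCompletion.acreate is the pre-1.0 interface of the OpenAI Python SDK. With openai>=1.0 the equivalent call goes through an AsyncOpenAI client; below is a sketch of the same handler on the newer SDK, assuming OPENAI_API_KEY is set in the environment.
# Sketch only: the same sampling handler using the openai>=1.0 SDK.
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment

async def handle_sample(self, params: dict) -> dict:
    """Handle sampling request (openai>=1.0 variant)."""
    prompt = params.get("prompt", "")
    response = await client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    usage = response.usage
    return {
        "content": response.choices[0].message.content,
        "usage": {
            "prompt_tokens": usage.prompt_tokens,
            "completion_tokens": usage.completion_tokens,
            "total_tokens": usage.total_tokens,
        },
    }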
Troubleshooting
Common Issues
- Connection Errors: Ensure the server is running and the client is using the correct server URL
- 405 Method Not Allowed: Ensure the client is sending requests to the correct API endpoint
- SSE Connection Failure: Check network connections and firewall settings
Logging
Both the server and the client provide detailed logging. For more detail, raise the Python logging level to DEBUG, for example by setting logging.basicConfig(level=logging.DEBUG) (or adjusting the existing logging configuration) in mcp_server.py or mcp_client.py.
License
This project is licensed under the MIT License. See the LICENSE file for details.