# MCP Server Boilerplate

A minimal, well-documented MCP (Model Context Protocol) server implementation designed to serve as a reusable baseline for building custom MCP servers.
## What is MCP?

The Model Context Protocol (MCP) is a standardized protocol that enables AI assistants to interact with external servers. MCP servers can provide:

- **Tools**: Functions that the AI can call to perform actions
- **Resources**: Static or dynamic data that the AI can read
- **Prompts**: Reusable prompt templates for consistent AI interactions
## Features

This boilerplate provides:

- **Minimal structure**: Clean baseline that can be easily extended
- **Extensive documentation**: Inline comments and separate documentation files
- **Architecture diagrams**: Mermaid diagrams showing component interactions
- **Scaling guide**: Best practices for growing your server
- **Type hints**: Full type annotations for better IDE support
- **Async/await**: Non-blocking I/O for concurrent operations
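The async/await design means independent operations can overlap rather than block each other. A minimal stdlib-only sketch of the idea (the `slow_lookup` name is illustrative, not part of the boilerplate):

```python
import asyncio

async def slow_lookup(name: str, delay: float) -> str:
    # asyncio.sleep stands in for real non-blocking I/O (network, disk).
    await asyncio.sleep(delay)
    return f"{name}: done"

async def main() -> list[str]:
    # Both lookups run concurrently under gather, so the total wait is
    # roughly max(delays), not their sum.
    return await asyncio.gather(
        slow_lookup("weather", 0.05),
        slow_lookup("news", 0.05),
    )

results = asyncio.run(main())
print(results)
```

The same pattern applies to MCP tool handlers: while one handler awaits I/O, the event loop can service other requests.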
## Reusable Prompt Templates

Prompts are reusable templates that let you define structured prompts with placeholders. They enable:

- **Consistency**: Standardized prompt formats across different AI interactions
- **Parameterization**: Dynamic content insertion through arguments
- **Reusability**: Define once, use multiple times with different inputs
- **Type safety**: Defined argument schemas with validation

A prompt template consists of:

- **Name**: Unique identifier for the prompt
- **Description**: What the prompt does
- **Arguments**: Optional parameters that can be filled in when using the prompt

Example use cases:

- Code review templates with configurable severity levels
- Documentation generation with customizable tone
- Analysis prompts with variable focus areas
- Report generation with different output formats
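Conceptually, a prompt template is just named text with declared placeholders. A plain-Python sketch of the idea, independent of the MCP SDK (the `code_review` template and its argument names are illustrative):

```python
# Illustrative registry: template name -> (template text, required argument names).
TEMPLATES: dict[str, tuple[str, set[str]]] = {
    "code_review": (
        "Review the following {language} code at {severity} severity:\n{code}",
        {"language", "severity", "code"},
    ),
}

def render_prompt(name: str, arguments: dict[str, str]) -> str:
    # Validate required arguments before substitution, mirroring how an MCP
    # get_prompt handler would reject incomplete requests.
    template, required = TEMPLATES[name]
    missing = required - arguments.keys()
    if missing:
        raise ValueError(f"Missing required arguments: {sorted(missing)}")
    return template.format(**arguments)

print(render_prompt("code_review",
                    {"language": "Python", "severity": "high", "code": "print(1)"}))
```

The MCP SDK adds discovery (`list_prompts`) and argument schemas on top of this basic render-with-validation pattern.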
## Project Structure

```
windsurf-project-3/
├── mcp_server.py      # Main server implementation with extensive comments
├── pyproject.toml     # Project configuration for uv
├── ARCHITECTURE.md    # Architecture documentation with Mermaid diagrams
├── SCALING_GUIDE.md   # Scaling patterns and best practices
├── README.md          # This file
├── tools/             # Placeholder for tool modules (create as needed)
├── resources/         # Placeholder for resource modules (create as needed)
├── prompts/           # Placeholder for prompt modules (create as needed)
└── utils/             # Placeholder for utility modules (create as needed)
```
## Installation

This project uses uv for fast Python package management.

1. Install Python 3.10 or higher
2. Install uv (if not already installed):

   ```bash
   curl -LsSf https://astral.sh/uv/install.sh | sh
   ```

3. Install dependencies:

   ```bash
   uv sync
   ```
## Quick Start

### 1. Add Your First Tool

Edit `mcp_server.py` and add a tool in the `list_tools()` function:

```python
@app.list_tools()
async def list_tools() -> list[Tool]:
    return [
        Tool(
            name="echo",
            description="Echo back the input text",
            inputSchema={
                "type": "object",
                "properties": {
                    "text": {"type": "string", "description": "Text to echo"}
                },
                "required": ["text"]
            }
        )
    ]
```
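The `inputSchema` above is a standard JSON Schema object, so a handler can check the `required` keys and basic types before dispatching. A minimal stdlib sketch of that check (a full validator such as the third-party `jsonschema` package covers far more of the spec):

```python
from typing import Any

# Subset of JSON Schema type names mapped to Python types; enough for this sketch.
JSON_TYPES: dict[str, Any] = {"string": str, "number": (int, float),
                              "integer": int, "boolean": bool,
                              "object": dict, "array": list}

def check_arguments(schema: dict[str, Any], arguments: dict[str, Any]) -> None:
    # Reject calls missing a required key.
    for key in schema.get("required", []):
        if key not in arguments:
            raise ValueError(f"Missing required argument: {key}")
    # Reject values whose type disagrees with the declared property type.
    for key, value in arguments.items():
        expected = schema.get("properties", {}).get(key, {}).get("type")
        if expected and not isinstance(value, JSON_TYPES[expected]):
            raise ValueError(f"Argument {key!r} should be of type {expected}")

schema = {
    "type": "object",
    "properties": {"text": {"type": "string", "description": "Text to echo"}},
    "required": ["text"],
}
check_arguments(schema, {"text": "hello"})  # valid input passes silently
```

In practice many MCP clients validate against the schema before calling, but a server-side check keeps error messages consistent.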
### 2. Implement the Tool Handler

Add the tool logic in the `call_tool()` function:

```python
@app.call_tool()
async def call_tool(name: str, arguments: Any) -> str:
    if name == "echo":
        text = arguments.get("text", "")
        return f"Echo: {text}"
    raise ValueError(f"Unknown tool: {name}")
```
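As the number of tools grows, the `if`/`elif` chain can be replaced by a dispatch table mapping tool names to handlers. A stdlib-only sketch (the second tool, `shout`, is hypothetical and shown only to motivate the table):

```python
import asyncio
from typing import Any, Awaitable, Callable

async def echo(arguments: dict[str, Any]) -> str:
    return f"Echo: {arguments.get('text', '')}"

async def shout(arguments: dict[str, Any]) -> str:
    # Hypothetical second tool: uppercases its input.
    return str(arguments.get("text", "")).upper()

# Tool name -> async handler; adding a tool means adding one entry here.
HANDLERS: dict[str, Callable[[dict[str, Any]], Awaitable[str]]] = {
    "echo": echo,
    "shout": shout,
}

async def call_tool(name: str, arguments: dict[str, Any]) -> str:
    handler = HANDLERS.get(name)
    if handler is None:
        raise ValueError(f"Unknown tool: {name}")
    return await handler(arguments)

print(asyncio.run(call_tool("echo", {"text": "hi"})))
```

The table keeps `call_tool()` constant-size while tools accumulate, which also makes it easy to move handlers into the `tools/` directory later.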
### 3. Add a Prompt (Optional)

Add a prompt in the `list_prompts()` function:

```python
@app.list_prompts()
async def list_prompts() -> list[Prompt]:
    return [
        Prompt(
            name="example_prompt",
            description="An example prompt template",
            arguments=[
                PromptArgument(
                    name="topic",
                    description="The topic to write about",
                    required=True
                )
            ]
        )
    ]
```

Then implement the handler in `get_prompt()`:

```python
@app.get_prompt()
async def get_prompt(name: str, arguments: dict[str, str] | None) -> str:
    if name == "example_prompt":
        topic = arguments.get("topic") if arguments else None
        if not topic:
            raise ValueError("Argument 'topic' is required")
        return f"Write a detailed explanation about {topic}."
    raise ValueError(f"Unknown prompt: {name}")
```
### 4. Run the Server

```bash
uv run python mcp_server.py
```
### 5. Configure Your MCP Client

Add this to your MCP client's configuration:

```json
{
  "mcpServers": {
    "your-server-name": {
      "command": "uv",
      "args": ["run", "python", "/path/to/mcp_server.py"]
    }
  }
}
```
## Documentation

- **ARCHITECTURE.md**: Detailed architecture documentation with Mermaid diagrams showing:
  - Python modules and their purposes
  - Component interactions
  - Request flows (tool invocation, resource reading)
  - Design patterns used
- **SCALING_GUIDE.md**: Best practices for scaling your server:
  - Modularization patterns
  - State management strategies
  - Error handling patterns
  - Logging and monitoring
  - Configuration management
  - Testing strategies
  - Performance optimization
  - Security considerations
## Code Structure

The main server file (`mcp_server.py`) is organized into sections:

1. **Server Initialization**: Create the MCP server instance
2. **Tool Registration**: Define available tools
3. **Tool Handlers**: Implement tool execution logic
4. **Resource Registration**: Define available resources
5. **Resource Handlers**: Implement resource reading logic
6. **Entry Point**: Start the server with stdio communication
Each section includes extensive inline comments explaining the purpose and usage of each component.
## Extension Points

### Adding Tools

1. Define the tool in `list_tools()` with its schema
2. Implement the handler in `call_tool()`
3. For larger projects, move to a separate module in the `tools/` directory
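One way to structure the `tools/` directory is to have each module export a spec plus its handler, and let the server aggregate them into a registry. A hypothetical single-file sketch of that pattern (in a real split, each spec dict would live in its own file under `tools/`):

```python
import asyncio

async def echo_handler(arguments: dict) -> str:
    return f"Echo: {arguments.get('text', '')}"

# Hypothetical per-module export; a real tools/echo.py would define this.
ECHO_SPEC = {
    "name": "echo",
    "description": "Echo back the input text",
    "handler": echo_handler,
}

# mcp_server.py aggregates every module's spec into one registry, so
# list_tools() and call_tool() never need editing when a tool is added.
REGISTRY = {spec["name"]: spec for spec in (ECHO_SPEC,)}

async def call_tool(name: str, arguments: dict) -> str:
    if name not in REGISTRY:
        raise ValueError(f"Unknown tool: {name}")
    return await REGISTRY[name]["handler"](arguments)

print(asyncio.run(call_tool("echo", {"text": "hi"})))
```

With this layout, adding a tool is one new file plus one entry in the registry tuple.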
### Adding Prompts

1. Define the prompt in `list_prompts()` with its arguments
2. Implement the handler in `get_prompt()`
3. For larger projects, move to a separate module in the `prompts/` directory

### Adding Resources

1. Define the resource in `list_resources()` with its metadata
2. Implement the handler in `read_resource()`
3. For larger projects, move to a separate module in the `resources/` directory
### Adding Utilities

Extract shared code into the `utils/` directory:

- Validation functions
- Logging helpers
- Configuration management
- Error handling utilities
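For example, a `utils/` error-handling helper might wrap tool handlers so unexpected exceptions are logged and converted into a uniform error string instead of crashing the server. A hypothetical sketch (the `safe_tool` name and `divide` tool are illustrative):

```python
import asyncio
import functools
import logging
from typing import Any, Awaitable, Callable

logger = logging.getLogger("mcp_server")

def safe_tool(func: Callable[..., Awaitable[str]]) -> Callable[..., Awaitable[str]]:
    # Decorator: logs the full traceback and returns a uniform error message
    # so one misbehaving tool cannot take the whole server down.
    @functools.wraps(func)
    async def wrapper(*args: Any, **kwargs: Any) -> str:
        try:
            return await func(*args, **kwargs)
        except Exception:
            logger.exception("Tool %s failed", func.__name__)
            return f"Error: tool {func.__name__} failed; see server logs"
    return wrapper

@safe_tool
async def divide(arguments: dict) -> str:
    # Illustrative tool that raises ZeroDivisionError on bad input.
    return str(arguments["a"] / arguments["b"])

print(asyncio.run(divide({"a": 6, "b": 3})))
```

Returning an error string (rather than re-raising) is a policy choice; some servers prefer to surface a structured error to the client instead.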
## Using as a Baseline

This boilerplate is designed to be copied and modified for new projects:

1. Copy the entire project directory
2. Rename the project in `pyproject.toml`
3. Update the server name in `mcp_server.py`
4. Add your tools, resources, and prompts
5. Customize documentation as needed
## Python Modules Used

- `mcp.server.Server`: Main MCP server class
- `mcp.types.Tool`: Tool type definition
- `mcp.types.Resource`: Resource type definition
- `mcp.types.Prompt`: Prompt type definition
- `mcp.types.PromptArgument`: Prompt argument type definition
- `mcp.server.stdio`: Stdio communication streams
- `asyncio`: Async/await for concurrent operations
- `typing`: Type hints for code clarity
See ARCHITECTURE.md for detailed explanations of each module.
## Development

### Running Tests

```bash
# Run with pytest (add tests first)
uv run pytest
```

### Code Style

This project uses Python type hints and follows PEP 8 conventions. Consider using:

- `ruff` for linting
- `mypy` for type checking

### Adding Dependencies

```bash
uv add <package-name>
```
## Troubleshooting

- **Import error**: Run `uv sync` to install dependencies
- **Server not responding**: Check your MCP client configuration
- **Type errors**: Ensure Python 3.10+ is installed
- **uv command not found**: Install uv from https://github.com/astral-sh/uv
## License

This boilerplate is provided as-is for educational and development purposes. Feel free to use and modify it for your projects.