mcp-server-template-python
A very simple Python template for building MCP servers using Streamable HTTP transport.
Overview
This template provides a foundation for creating MCP servers that can communicate with AI assistants and other MCP clients. It includes a simple HTTP server implementation with example tools, resources & prompts to help you get started building your own MCP integrations.
Prerequisites
- Install uv (https://docs.astral.sh/uv/getting-started/installation/)
Installation
- Clone the repository:
git clone git@github.com:alpic-ai/mcp-server-template-python.git
cd mcp-server-template-python
- Install the Python version and dependencies:
uv python install
uv sync --locked
Usage
Start the server on port 3000:
uv run main.py
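For reference, the entry point follows the FastMCP pattern from the official MCP Python SDK. The sketch below shows roughly what main.py contains when bound to the Streamable HTTP transport on port 3000; the server name and settings here are assumptions, so check the actual file in the repository.

from mcp.server.fastmcp import FastMCP

# Hypothetical minimal entry point; the template's real main.py may differ.
mcp = FastMCP("mcp-server-template-python", host="127.0.0.1", port=3000)

if __name__ == "__main__":
    # Streamable HTTP exposes the MCP endpoint at /mcp on the configured port.
    mcp.run(transport="streamable-http")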
Running the Inspector
Requirements
- Node.js: ^22.7.5
Quick Start (UI mode)
To get up and running right away with the UI, just execute the following:
npx @modelcontextprotocol/inspector
The inspector server will start up and the UI will be accessible at http://localhost:6274.
You can test your server locally by selecting:
- Transport Type: Streamable HTTP
- URL: http://127.0.0.1:3000/mcp
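If you prefer to exercise the server from Python instead of the Inspector UI, a small client script can connect over Streamable HTTP and list the available tools. This is only a sketch and assumes the official MCP Python SDK (the mcp package) is installed in your environment:

import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # Connect to the locally running server's MCP endpoint.
    async with streamablehttp_client("http://127.0.0.1:3000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

if __name__ == "__main__":
    asyncio.run(main())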
Development
Adding New Tools
To add a new tool, modify main.py:
@mcp.tool(
    title="Your Tool Name",
    description="Tool Description for the LLM",
)
async def new_tool(
    tool_param1: str = Field(description="The description of the param1 for the LLM"),
    tool_param2: float = Field(description="The description of the param2 for the LLM"),
) -> str:
    """The new tool's underlying method"""
    result = await some_api_call(tool_param1, tool_param2)
    return result
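After restarting the server, the tool is exposed under its function name (new_tool here). With a client session like the one sketched in the Inspector section, it could be invoked roughly as follows; the argument values are placeholders:

result = await session.call_tool(
    "new_tool",
    {"tool_param1": "example input", "tool_param2": 1.0},
)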
Adding New Resources
To add a new resource, modify main.py:
@mcp.resource(
    uri="your-scheme://{param1}/{param2}",
    description="Description of what this resource provides",
    name="Your Resource Name",
)
def your_resource(param1: str, param2: str) -> str:
    """The resource template implementation"""
    # Your resource logic here
    return f"Resource content for {param1} and {param2}"
The URI template uses {param_name} syntax to define parameters that will be extracted from the resource URI and passed to your function.
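From a client, the parameters are supplied directly inside the URI when the resource is read. For example, with the session from the earlier sketch (the URI values are placeholders):

from pydantic import AnyUrl

content = await session.read_resource(AnyUrl("your-scheme://value1/value2"))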
Adding New Prompts
To add a new prompt, modify main.py:
@mcp.prompt(description="Description of what this prompt does for the LLM")
async def your_prompt(
    prompt_param: str = Field(description="The description of the param for the user")
) -> str:
    """Generate a helpful prompt"""
    return f"You are a friendly assistant, help the user and don't forget to {prompt_param}."