
Math MCP Server
A tool-augmented AI server that exposes basic math operations (add, subtract, multiply) via FastMCP and Server-Sent Events, allowing LLM agents to discover and use these mathematical tools.
Math MCP Example: Server & Client
#############################################
Directory Structure & Environment Setup
#############################################
Recommended project structure:
iceberg-mcp-main/
├── iceberg_mcp/
│   └── math/
│       ├── math_server.py
│       ├── math_client.py
│       └── README.md
├── .venv/    # Python virtual environment (recommended)
└── ...       # Other project files
Setting up your Python environment
- Create a virtual environment (recommended):
  python3 -m venv .venv
  source .venv/bin/activate
- Install the required dependencies (run this from the project root, or wherever your requirements are listed):
  pip install mcp fastmcp fastapi-mcp langchain-mcp-adapters uvicorn
  # ...plus any other dependencies your project needs
- Set your OpenAI API key:
  export OPENAI_API_KEY=sk-...your-key-here...
Overview
- Server: Exposes math tools (add, sub, multiply) and prompt templates using FastMCP and SSE (Server-Sent Events).
- Client: Connects to the server, discovers available tools, and uses an LLM agent to invoke those tools.
Learning Objectives
- Understand how to register and expose tools in a Python server.
- Learn how to connect to a tool server and discover available tools.
- See how an LLM agent can use external tools to answer questions.
- Practice async programming and client-server communication.
#############################################
Server: math_server.py
#############################################
What does it do?
- Registers three math tools: add, sub, and multiply.
- Registers three prompt templates for natural language generation.
- Exposes an ASGI app for Uvicorn to serve via SSE.
- Logs every tool and prompt call for transparency.
Key code sections
from mcp.server.fastmcp import FastMCP
import asyncio
import logging

# Set up logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("math_server")

# Create the server
mcp = FastMCP("Math")
app = mcp.sse_app()  # Expose the SSE ASGI app (sse_app is a method)

# Register tools
@mcp.tool()
def add(a: int, b: int) -> int:
    result = a + b
    logger.info(f"add({a}, {b}) = {result}")
    return result

# ... (sub, multiply, and the prompt templates follow the same pattern)

# list_tools() is a coroutine, so it must be awaited or run, not printed directly
print("Registered tools:", asyncio.run(mcp.list_tools()))
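For reference, the elided sub and multiply tools can be sketched the same way as add. They are shown here as plain functions; in math_server.py each would carry the @mcp.tool() decorator and the same logging call:

```python
# Sketch of the remaining tools, following the same pattern as add.
# In math_server.py each function is decorated with @mcp.tool() so that
# FastMCP registers it and exposes it to clients.
def sub(a: int, b: int) -> int:
    """Subtract b from a."""
    return a - b

def multiply(a: int, b: int) -> int:
    """Multiply a and b."""
    return a * b
```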
#############################################
How to run the server
#############################################
From the iceberg_mcp/math directory:
uvicorn math_server:app --port 3000
Or from the project root:
uvicorn iceberg_mcp.math.math_server:app --port 3000
#############################################
Client: math_client.py
#############################################
What does it do?
- Connects to the math server using SSE.
- Discovers available tools.
- Uses a LangGraph ReAct agent to ask the server to add 3 and 5.
- Prints only the final answer from the agent's response.
Key code sections
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
import asyncio
client = MultiServerMCPClient({
    "math": {
        "transport": "sse",
        "url": "http://localhost:3000/sse",
    },
})

async def main():
    tools = await client.get_tools()
    print("Discovered tools:", tools)
    agent = create_react_agent("openai:gpt-4.1", tools)
    math_response = await agent.ainvoke(
        {"messages": "use the add tool to add 3 and 5"}
    )
    if "messages" in math_response and len(math_response["messages"]) > 1:
        ai_message = math_response["messages"][-1]
        print(ai_message.content)
    else:
        print(math_response)

if __name__ == "__main__":
    asyncio.run(main())
OpenAI API Key Setup
To use the LLM agent (e.g., GPT-4), you need an OpenAI API key. This is required for the client to access OpenAI's language models.
How to set your OpenAI API key:
- The recommended way is to set the OPENAI_API_KEY environment variable in your shell:
  export OPENAI_API_KEY=sk-...your-key-here...
- Alternatively, you can set it in your Python code (not recommended for production):
  import os
  os.environ["OPENAI_API_KEY"] = "sk-...your-key-here..."
You must set the API key before running the client, or you will get authentication errors.
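One way to fail fast with a clear error is to check for the key before connecting. This is a sketch only; the client above does not include this check, and require_api_key is a hypothetical helper name:

```python
import os

def require_api_key(env=os.environ) -> str:
    """Return the OpenAI API key from the environment, or raise a clear error.

    `env` defaults to os.environ; a dict can be passed in for testing.
    """
    key = env.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set; export it before running the client."
        )
    return key
```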
#############################################
How to run the client
#############################################
From the iceberg_mcp/math directory:
python math_client.py
Experiment & Learn
- Try changing the numbers in the client prompt.
- Add new tools (e.g., division) to the server and see if the client discovers them.
- Add more prompts or logging to see how the server responds.
- Explore how async programming enables real-time tool discovery and invocation.
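As a starting point for the division experiment, a new tool might look like the sketch below. In math_server.py it would be decorated with @mcp.tool() like the others so that the client can discover it:

```python
# Hypothetical division tool following the same pattern as add.
# Decorate with @mcp.tool() in math_server.py so clients can discover it.
def divide(a: int, b: int) -> float:
    """Divide a by b; division by zero is reported as an error."""
    if b == 0:
        raise ValueError("Cannot divide by zero")
    return a / b
```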
Troubleshooting
- If the client prints [] for tools, check the server logs and your package versions.
- Make sure both server and client use compatible MCP and adapter versions.
- Ensure the server is running before starting the client.
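To check which versions of the relevant packages are installed, one quick sketch using only the standard library:

```python
from importlib.metadata import version, PackageNotFoundError

def get_version(package: str) -> str:
    """Return the installed version of a package, or 'not installed'."""
    try:
        return version(package)
    except PackageNotFoundError:
        return "not installed"

# Print the versions of the packages this example depends on.
for pkg in ("mcp", "fastmcp", "langchain-mcp-adapters", "uvicorn"):
    print(f"{pkg}: {get_version(pkg)}")
```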
Summary
This example demonstrates how to:
- Build a tool-augmented AI server in Python
- Connect and interact with it using a modern LLM agent
- Use async programming for efficient, real-time communication
Happy learning!