# Explore MCP

A demonstration MCP server that exposes basic arithmetic tools (add, subtract, ping) through FastAPI and shows how to integrate them with OpenAI's tool-calling API for LLM orchestration.
This repo runs a local MCP (Model Context Protocol) server, verifies tool connectivity with a client, and demonstrates OpenAI tool-calling integration.
## Prerequisites

- Python 3.13 (per `pyproject.toml`).
- Linux/macOS shell (commands shown for bash).
- An OpenAI API key.
- Dependencies listed in `pyproject.toml`.
## Install dependencies

You can use uv, poetry, or pip. Pick one approach.

### Option A: uv (fast, recommended)

```bash
# If uv is not installed, see https://docs.astral.sh/uv/getting-started/
uv sync
```

### Option B: poetry

```bash
poetry install
poetry shell
```

### Option C: pip + venv

```bash
python -m venv .venv
source .venv/bin/activate
pip install .
```
## Set OpenAI API key

The script `utils/helpers.py` expects your key in a file at `~/.llm_secrets` with content like:

```
OPENAI_API_KEY=sk-...
```

Alternatively, set the environment variable directly in your shell before running the OpenAI test:

```bash
export OPENAI_API_KEY=sk-...
```

Note: `call_my_mcp2.py` imports `get_api_key()` and sets `os.environ["OPENAI_API_KEY"]` internally using the `~/.llm_secrets` file. If that file is missing, set the env var as shown above or create the file.
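The repo's actual loader lives in `utils/helpers.py` and is not reproduced here; a minimal sketch of how such a `get_api_key()` helper might parse the `KEY=value` lines of `~/.llm_secrets` (the function body below is an assumption, not the repo's code):

```python
from pathlib import Path

def get_api_key(secrets_path: str = "~/.llm_secrets") -> str:
    """Read OPENAI_API_KEY from a KEY=value secrets file.

    Hypothetical sketch; the real utils/helpers.py may differ.
    """
    for line in Path(secrets_path).expanduser().read_text().splitlines():
        line = line.strip()
        if line.startswith("OPENAI_API_KEY="):
            # Split on the first '=' only, so keys like sk-...=... survive intact.
            return line.split("=", 1)[1]
    raise KeyError("OPENAI_API_KEY not found in secrets file")
```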
## Start the MCP server

This launches the FastAPI app serving the MCP endpoint at http://127.0.0.1:8000/mcp.

```bash
python mcp_server.py
```

Expected log: uvicorn starts and listens on port 8000.
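For orientation, a simplified sketch of what the server's tools look like. The tool bodies are plain functions; the wiring in `main()` assumes the `fastmcp` package (the repo's `mcp_server.py` mounts FastMCP inside FastAPI and may differ in detail):

```python
def add(a: int, b: int) -> int:
    """Return the sum of two integers."""
    return a + b

def subtract(a: int, b: int) -> int:
    """Return a minus b."""
    return a - b

def ping() -> str:
    """Liveness check."""
    return "pong"

def main() -> None:
    # Server wiring -- assumes the fastmcp package; call main() to start.
    from fastmcp import FastMCP
    mcp = FastMCP("explore-mcp")
    for fn in (add, subtract, ping):
        mcp.tool(fn)  # register each plain function as an MCP tool
    # The streamable HTTP transport serves the endpoint at /mcp by default.
    mcp.run(transport="http", host="127.0.0.1", port=8000)
```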
## Health check: MCP client

In a separate terminal (while the server is running), verify connectivity and tool discovery:

```bash
python mcp_client_check.py
```

Expected output: a tools list including `ping`, `add`, and `subtract`.
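The check amounts to a ping plus a tool listing. A sketch along these lines, assuming the `fastmcp` client API (`mcp_client_check.py` itself may be structured differently):

```python
import asyncio

async def check_tools(url: str = "http://127.0.0.1:8000/mcp") -> list[str]:
    """Ping the MCP server and return the names of the tools it exposes."""
    # Assumes the fastmcp package; import deferred so the module loads without it.
    from fastmcp import Client
    async with Client(url) as client:
        await client.ping()                # connectivity check
        tools = await client.list_tools()  # tool discovery
        return [tool.name for tool in tools]

# Against a live server:
# print(asyncio.run(check_tools()))
```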
## OpenAI integration test

With the server running and your OpenAI key available, run the LLM orchestration demo:

```bash
python mcp_llm_mode2.py
```

What it does:

- Discovers MCP tools from the server.
- Provides those tools to the OpenAI Chat Completions API.
- Lets the model call `add` and returns the final answer.

Expected: the printed result of adding two numbers (e.g., 68 for 23 + 45).
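The bridge between the two APIs is a schema conversion: each discovered MCP tool is reshaped into OpenAI's function-tool format before being passed as `tools=[...]` to `chat.completions.create`. A sketch of that pure conversion step (the helper name is hypothetical; the repo's script may do this inline):

```python
def mcp_tool_to_openai(name: str, description: str, input_schema: dict) -> dict:
    """Convert one MCP tool's metadata into OpenAI's function-tool format."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            # An MCP tool's inputSchema is already JSON Schema, which is
            # exactly what OpenAI expects for "parameters".
            "parameters": input_schema,
        },
    }
```

When the model responds with a `tool_call`, the script invokes the matching MCP tool with the call's arguments and feeds the result back as a `role="tool"` message, after which the model produces the final answer.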
## Notes

- The script uses model `gpt-5`. If this model is not available on your account, change it inside `call_my_mcp2.py` to a model you have access to (e.g., `gpt-4.1-mini` or another tools-capable model).
- Ensure the server URL in `call_my_mcp2.py` and `mcp_client_check.py` matches http://127.0.0.1:8000/mcp.
- If port 8000 is in use, edit `mcp_server.py` to run uvicorn on a different port and update the client scripts accordingly.
## Files

- `mcp_server.py`: FastAPI + FastMCP server exposing the tools `ping`, `add`, `subtract`.
- `mcp_client_check.py`: Async client that pings the server and lists tools.
- `call_my_mcp2.py`: Orchestrates an OpenAI chat with MCP tools.
- `utils/helpers.py`: Utility to load the OpenAI API key from `~/.llm_secrets` or set env vars from a JSON file.