# mcp-server

MCP server for experimenting with LLM tools
This repository was created to build an understanding of MCP servers, the protocol, and their use within LLMs. It is not intended for reuse!
## Dependencies

- Install `uv`
- Run:

```sh
uv sync
```
## Unit tests

```sh
uv run pytest
```
## Launch the server

```sh
uv run mcp dev server.py
```

Example output:

```
(.venv) ➜ mcp-server git:(main) ✗ uv run mcp dev server.py
Starting MCP inspector...
Proxy server listening on port 3000
🔍 MCP Inspector is up and running at http://localhost:5173 🚀
```
Open the MCP Inspector URL in a browser to view the tools.

## Available Tools
| Tool | Description | Backend Service | Required Configuration |
|---|---|---|---|
| add | Simple addition tool | Local computation | None |
| calculator | Evaluates mathematical expressions | Local computation | None |
| calculate_bmi | Calculates Body Mass Index | Local computation | None |
| echo | Returns input text unchanged | Local computation | None |
| long_task | Processes files with progress tracking | Local file system | None |
| duckduckgo_search | Web search using DuckDuckGo | DuckDuckGo HTML endpoint | None |
| wikipedia_search | Searches Wikipedia articles | Wikipedia API | None |
| fetch_weather | Gets current weather by location | OpenWeatherMap API | `OPENWEATHER_API_KEY` |
| openmeteo_forecast | Gets detailed weather forecasts | Open-Meteo API | None |
| news_search | Searches for recent news articles | NewsAPI | `NEWSAPI_KEY` |
| tavily_search | AI-powered web search | Tavily API | `TAVILY_API_KEY` |
| arxiv_search | Searches academic papers | arXiv API | None |
| github_get_file | Retrieves file contents from GitHub | GitHub API | `GITHUB_TOKEN` |
| github_list_issues | Lists issues in a repository | GitHub API | `GITHUB_TOKEN` |
| github_create_issue | Creates a new issue in a repository | GitHub API | `GITHUB_TOKEN` |
| github_list_pull_requests | Lists PRs in a repository | GitHub API | `GITHUB_TOKEN` |
| github_search_code | Searches code on GitHub | GitHub API | `GITHUB_TOKEN` |
| github_user_activity | Gets a user's GitHub activity summary | GitHub API | `GITHUB_TOKEN` |
| create_thumbnail | Creates image thumbnails | Local image processing | None |
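The local-computation tools are ordinary Python functions exposed through the server. As an illustration only (a hypothetical sketch, not the repository's actual code), a `calculate_bmi` implementation might look like:

```python
def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kilograms divided by height in metres squared."""
    if height_m <= 0:
        raise ValueError("height must be positive")
    return round(weight_kg / height_m ** 2, 1)
```

In the server itself, a function like this would be registered as an MCP tool so the LLM can invoke it by name with structured arguments.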
## Environment Variable Configuration

To use tools that require API keys, add the following to your environment:
```sh
# Weather services
export OPENWEATHER_API_KEY="your_openweather_api_key"

# News services
export NEWSAPI_KEY="your_newsapi_key"

# Search services
export TAVILY_API_KEY="your_tavily_api_key"

# GitHub tools
export GITHUB_TOKEN="your_github_personal_access_token"
```
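Tools whose key is unset simply can't reach their backend. A start-up check along these lines (a hypothetical helper, not part of the repository) can report which tools will be unavailable:

```python
import os

# Tools from the table above mapped to the environment variable they need.
REQUIRED_KEYS = {
    "fetch_weather": "OPENWEATHER_API_KEY",
    "news_search": "NEWSAPI_KEY",
    "tavily_search": "TAVILY_API_KEY",
    "github_get_file": "GITHUB_TOKEN",
}

def unavailable_tools(env=os.environ):
    """Return the tools whose required API key is missing or empty."""
    return [tool for tool, key in REQUIRED_KEYS.items() if not env.get(key)]
```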
## Sample Chat Application

The repository includes a sample chat application that demonstrates how to use MCP tools with the Ollama LLM service.
### Prerequisites

- Install Ollama from https://ollama.ai/
- Pull the granite model: `ollama pull granite3.2:latest` (or use any other model)
- Install additional dependencies: `uv pip install litellm colorama python-dotenv httpx`
### Configuration

Create a `.env` file in the project root with your configuration:
```sh
# Ollama configuration
OLLAMA_SERVER=http://localhost:11434
OLLAMA_MODEL=granite3.2:latest  # Change to any model you have pulled

# MCP server endpoint (default is localhost:3000)
MCP_ENDPOINT=localhost:3000

# Logging configuration
LOG_FILE=chat_interactions.log

# API keys for various services
OPENWEATHER_API_KEY=your_api_key_here
NEWSAPI_KEY=your_api_key_here
TAVILY_API_KEY=your_api_key_here
GITHUB_TOKEN=your_token_here
```
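python-dotenv loads this file for the chat application. Conceptually the format is just `KEY=VALUE` lines with `#` comments, as this minimal stand-in parser shows (a sketch for illustration, not the library's implementation):

```python
def parse_env(text: str) -> dict:
    """Minimal .env reader: KEY=VALUE per line, '#' starts a comment."""
    env = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env
```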
### Launch the Chat Application

First, start the MCP server in one terminal:

```sh
uv run mcp dev server.py
```

Then, run the chat application in another terminal:

```sh
python run_chat.py
```
Interact with the LLM, which now has access to all the tools provided by the MCP server.
### Features

- The chat application automatically uses the MCP tools when appropriate
- All interactions are logged to the file specified in `LOG_FILE`
- Tools will be called when the LLM decides they're needed to answer a question
- Tool parameters are automatically populated based on the LLM's understanding of the query
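Under the hood this follows the usual function-calling pattern: the LLM emits a structured call (tool name plus JSON arguments), the chat app executes it, and the result is fed back into the conversation. A simplified, self-contained sketch of the dispatch step (hypothetical registry; the real app routes calls through litellm and the MCP server):

```python
import json

# Hypothetical in-process stand-ins for two of the server's tools.
TOOLS = {
    "add": lambda a, b: a + b,
    "echo": lambda text: text,
}

def dispatch(tool_call_json: str):
    """Parse a tool call like {"name": ..., "arguments": {...}} and run it."""
    call = json.loads(tool_call_json)
    fn = TOOLS.get(call["name"])
    if fn is None:
        raise ValueError("unknown tool: " + call["name"])
    return fn(**call["arguments"])
```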
### Caveats

- It doesn't yet work with the default model; this is a work in progress!