Weather MCP Server & Client

A Model Context Protocol (MCP) implementation that exposes weather alerts from the National Weather Service API as tools. Includes an interactive chat client powered by Groq LLM and test scripts for both STDIO and SSE transports.

Table of Contents

  • Prerequisites
  • Setup
  • Configuration Files
  • Use Cases & How to Run
  • Project Structure
  • MCP Tools Exposed
  • MCP Resources
  • Troubleshooting

Prerequisites

  • Python 3.13+ (see pyproject.toml)
  • uv – Python package manager
  • Groq API Key – for the LLM

Setup

  1. Clone or navigate to the project:

    cd /path/to/MCP
    
  2. Create a .env file with your Groq API key:

    GROQ_API_KEY=your_groq_api_key_here
    
  3. Install dependencies with uv:

    uv sync
    

    This creates a .venv and installs all dependencies from pyproject.toml. The project uses langchain>=1.2.0,<2.0.0 for compatibility with mcp-use.
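
To confirm the key is actually being picked up before running anything, a quick check like the one below can help. It assumes python-dotenv is used to load the .env file; the helper and its name check_env.py are illustrative only, since langchain-groq can also read GROQ_API_KEY straight from the environment.

# check_env.py - hypothetical helper, not part of the repository
import os

from dotenv import load_dotenv  # assumes python-dotenv is available

load_dotenv()  # reads .env from the current directory
if os.getenv("GROQ_API_KEY"):
    print("GROQ_API_KEY loaded")
else:
    print("GROQ_API_KEY missing - check your .env file")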


Configuration Files

File                              Purpose
server/weather_mcp.json           Default config for stdio: spawns uv run server/weather.py
server/weather_stdio_config.json  Same as above; used by the stdio test script
server/weather_sse_config.json    SSE config: connects to http://127.0.0.1:8000/sse

Config Format

STDIO (command-based):

{
  "mcpServers": {
    "weather": {
      "command": "uv",
      "args": ["run", "server/weather.py"]
    }
  }
}

SSE (URL-based):

{
  "mcpServers": {
    "weather": {
      "url": "http://127.0.0.1:8000/sse"
    }
  }
}

Use Cases & How to Run

All commands use uv run to execute scripts with the project's virtual environment and dependencies.

Case 1: Interactive Chat Client (STDIO)

Run an interactive chat that connects to the weather MCP server. The agent can call get_alerts when you ask about weather.

Run:

uv run client_chatbot.py

Commands:

  • Type any question (e.g., "What are the weather alerts for California?")
  • exit or quit – end the session
  • clear – clear conversation history

What happens:

  • Client reads server/weather_mcp.json
  • Spawns uv run server/weather.py as a subprocess
  • Communicates via stdin/stdout (stdio transport)
  • Agent uses Groq LLM and calls get_alerts when needed
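
For reference, the core of that flow looks roughly like the sketch below. It assumes the mcp-use and langchain-groq APIs (MCPClient.from_config_file, MCPAgent, ChatGroq) and shows how the pieces connect; it is not the exact contents of client_chatbot.py.

# Sketch of a single stdio chat turn; illustrative, see client_chatbot.py for the real loop.
import asyncio

from langchain_groq import ChatGroq
from mcp_use import MCPAgent, MCPClient

async def main():
    client = MCPClient.from_config_file("server/weather_mcp.json")  # spawns uv run server/weather.py
    llm = ChatGroq(model="llama-3.3-70b-versatile")                 # reads GROQ_API_KEY from the environment
    agent = MCPAgent(llm=llm, client=client)
    print(await agent.run("What are the weather alerts for California?"))
    await client.close_all_sessions()                               # stop the spawned server

asyncio.run(main())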

Case 2: STDIO Transport Test

Explicit test of the stdio transport. Same connection flow as Case 1, but it uses a dedicated config file and runs a single query instead of an interactive chat.

Run:

uv run test_weather_stdio.py

What happens:

  • Uses server/weather_stdio_config.json
  • Client spawns uv run server/weather.py
  • Runs one query and exits

Case 3: SSE Transport Test

Test the SSE (HTTP) transport. The script starts the server in HTTP mode, connects via URL, runs a query, then shuts down the server.

Run:

uv run test_weather_sse.py

What happens:

  1. Starts server/weather_sse.py in the background (listens on port 8000)
  2. Waits for the server to be ready
  3. Connects to http://127.0.0.1:8000/sse
  4. Runs one query
  5. Terminates the server process
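
The script's flow can be sketched roughly as follows. The socket-based readiness check and the helper names are assumptions for illustration; the real logic lives in test_weather_sse.py.

# Rough sketch of the SSE test flow; details are illustrative, see test_weather_sse.py.
import asyncio
import socket
import subprocess
import time

from langchain_groq import ChatGroq
from mcp_use import MCPAgent, MCPClient

def wait_for_port(host: str, port: int, timeout: float = 15.0) -> None:
    """Poll until the SSE server accepts TCP connections."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1):
                return
        except OSError:
            time.sleep(0.5)
    raise TimeoutError("SSE server did not start in time")

async def main():
    server = subprocess.Popen(["uv", "run", "server/weather_sse.py"])          # 1. start the server
    try:
        wait_for_port("127.0.0.1", 8000)                                       # 2. wait until it is ready
        client = MCPClient.from_config_file("server/weather_sse_config.json")  # 3. connect via URL
        agent = MCPAgent(llm=ChatGroq(model="llama-3.3-70b-versatile"), client=client)
        print(await agent.run("What are the weather alerts for New York?"))    # 4. run one query
        await client.close_all_sessions()
    finally:
        server.terminate()                                                     # 5. shut the server down

asyncio.run(main())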

Case 4: Run Weather Server Manually (SSE)

Run the weather server as a standalone HTTP process. Useful for debugging or when multiple clients need to connect.

Run:

# Terminal 1: Start the server
uv run server/weather_sse.py

Then in another terminal, use a client configured with url: "http://127.0.0.1:8000/sse" (e.g., test_weather_sse.py or a custom client with weather_sse_config.json).
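
For orientation, the SSE entry point boils down to constructing FastMCP with a host and port and running it with the SSE transport. The sketch below assumes the FastMCP API from the official MCP Python SDK; server/weather_sse.py may differ in detail.

# Minimal shape of an SSE entry point; server/weather_sse.py may differ in detail.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather", host="127.0.0.1", port=8000)

# ... tool and resource definitions (see "MCP Tools Exposed" below) ...

if __name__ == "__main__":
    mcp.run(transport="sse")  # serves the /sse endpoint over HTTP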


Case 5: Run Weather Server Manually (STDIO)

Running the stdio server directly is rarely useful because it expects JSON-RPC on stdin. It's normally spawned by the client.

For debugging:

uv run server/weather.py
# Server waits for stdin; won't do anything useful without a client
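
If you do want to exercise the stdio server without the chat client, a small driver built on the official MCP Python SDK (which the server itself is built on) can spawn it and call the tool directly. The script below is a hypothetical debugging aid, not part of the repository.

# debug_stdio.py - hypothetical debugging driver, not part of the repository.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    params = StdioServerParameters(command="uv", args=["run", "server/weather.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("tools:", [tool.name for tool in tools.tools])
            result = await session.call_tool("get_alerts", {"state": "CA"})
            print(result.content)

asyncio.run(main())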

Project Structure

MCP/
├── README.md                 # This file
├── .env                      # Groq API key (create this)
├── pyproject.toml            # Project dependencies
├── client_chatbot.py         # Interactive chat client (stdio)
├── test_weather.py           # Quick one-shot test (stdio)
├── test_weather_stdio.py     # STDIO transport test
├── test_weather_sse.py       # SSE transport test
├── server/
│   ├── weather.py            # MCP server (STDIO transport)
│   ├── weather_sse.py        # MCP server (SSE transport)
│   ├── weather_mcp.json      # Default client config (stdio)
│   ├── weather_stdio_config.json
│   └── weather_sse_config.json
└── src/
    └── app_mcp/
        └── __init__.py       # Package init

MCP Tools Exposed

Tool        Description                                Args
get_alerts  Get active weather alerts for a US state  state (e.g., "CA", "NY")

MCP Resources

URI               Description
echo://{message}  Echo a message (for testing)
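
As a reference for how these are registered on the server side, a FastMCP-based definition looks roughly like the sketch below. The NWS request handling is simplified and the response formatting is illustrative; the actual code in server/weather.py may differ.

# Sketch of how the tool and resource are typically registered with FastMCP;
# the actual implementation in server/weather.py may differ.
import httpx
from mcp.server.fastmcp import FastMCP

NWS_API_BASE = "https://api.weather.gov"

mcp = FastMCP("weather")

@mcp.tool()
async def get_alerts(state: str) -> str:
    """Get active weather alerts for a US state (two-letter code, e.g. "CA")."""
    url = f"{NWS_API_BASE}/alerts/active/area/{state}"
    headers = {"User-Agent": "weather-mcp-sketch", "Accept": "application/geo+json"}
    async with httpx.AsyncClient() as client:
        resp = await client.get(url, headers=headers)
        resp.raise_for_status()
    features = resp.json().get("features", [])
    if not features:
        return f"No active alerts for {state}."
    return "\n---\n".join(f["properties"]["headline"] for f in features)

@mcp.resource("echo://{message}")
def echo(message: str) -> str:
    """Echo a message (for testing)."""
    return f"Echo: {message}"

if __name__ == "__main__":
    mcp.run(transport="stdio")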

Troubleshooting

"No module named 'mcp_use'"

Run uv sync to install dependencies, then use uv run for all scripts:

uv sync
uv run client_chatbot.py

"ModuleNotFoundError: No module named 'httpx'"

Dependencies are in pyproject.toml. Run uv sync to install.

"qwen-qwq-32b has been decommissioned"

The project uses llama-3.3-70b-versatile. If you see references to qwen-qwq-32b, update the model in the client/test scripts.
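
The model is set where the Groq chat model is constructed in the client and test scripts; with langchain-groq that is typically a single constructor argument, for example:

# Wherever ChatGroq is instantiated in client_chatbot.py / the test scripts:
from langchain_groq import ChatGroq

llm = ChatGroq(model="llama-3.3-70b-versatile")  # replace any decommissioned model id here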

"Connection closed" or "Invalid JSON"

Avoid printing to stdout from code that runs in the MCP server process. Stdio uses stdin/stdout for JSON-RPC; any extra output breaks the protocol.
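
If you need diagnostics from inside the server process, send them to stderr (or a file) instead of stdout, for example:

import logging
import sys

# Option 1: write directly to stderr instead of stdout
print("debug: handling get_alerts", file=sys.stderr)

# Option 2: use logging, whose default StreamHandler writes to stderr
logging.basicConfig(level=logging.DEBUG)
logging.debug("handling get_alerts")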

Port 8000 already in use

Stop the process using port 8000, or change the port in server/weather_sse.py:

mcp = FastMCP("weather", host="127.0.0.1", port=8001)

Then update SSE_SERVER_PORT in test_weather_sse.py and the URL in weather_sse_config.json.

Sandbox / uv cache errors

Run with full permissions if you see "Operation not permitted" for uv cache:

uv run client_chatbot.py  # may need to run outside sandbox
