MCP Weather & Web Search Agent

A Model Context Protocol (MCP) server that provides weather information and web search capabilities, designed to work with Hugging Face's tiny-agents framework.

Features

  • Weather Service: Get weather information for any location
  • Web Search: Search the web for information (with special support for Hugging Face inference providers)
  • AI Agent Integration: Works seamlessly with tiny-agents for conversational AI
  • MCP Inspector Support: Debug and inspect server capabilities

Prerequisites

  • Python 3.10 or higher
  • uv package manager
  • Node.js (for MCP inspector)
  • Hugging Face account (for tiny-agents)

Installation

  1. Clone or download this project

    git clone https://github.com/Deon62/mcp.git
    cd mcp
    
  2. Install Python dependencies

    uv pip install "mcp[cli]" requests
    
  3. Install tiny-agents (if not already installed)

    pip install tiny-agents
    

Quick Start

1. Run the MCP Server

Start the MCP server in one terminal:

uv run --with mcp mcp run server.py

The server will start and wait for connections.

2. Run the AI Agent

In another terminal, start the agent:

tiny-agents run agent.json

You should see:

Agent loaded with 3 tools:
 • get_weather
 • web_search
 • get_hf_inference_providers
»

3. Chat with the Agent

Once the agent is running, you can interact with it:

» Hello! Can you help me find information about Hugging Face inference providers?

Available Tools

1. Weather Service

» What's the weather like in New York?

2. Web Search

» Search for "Hugging Face inference providers"

3. HF Inference Providers

» Get me the list of Hugging Face inference providers

Configuration

Agent Configuration (agent.json)

{
    "model": "Qwen/Qwen2.5-72B-Instruct",
    "provider": "nebius",
    "servers": [
        {
            "type": "stdio",
            "command": "uv",
            "args": ["run", "--with", "mcp", "mcp", "run", "server.py"]
        }
    ]
}
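The `servers` entry tells tiny-agents how to launch the MCP server over stdio. As a quick sanity check, the launch command can be reconstructed from this config with only the standard library (a sketch; the config is copied from the example above):

```python
import json

# agent.json contents, copied from the example above.
raw = """
{
    "model": "Qwen/Qwen2.5-72B-Instruct",
    "provider": "nebius",
    "servers": [
        {
            "type": "stdio",
            "command": "uv",
            "args": ["run", "--with", "mcp", "mcp", "run", "server.py"]
        }
    ]
}
"""

config = json.loads(raw)
server = config["servers"][0]

# This is the command tiny-agents spawns for the stdio transport.
launch = [server["command"], *server["args"]]
print(" ".join(launch))  # uv run --with mcp mcp run server.py
```

If this command does not run in a plain terminal, the agent will not be able to start the server either.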

Server Configuration (server.py)

The server provides three main tools:

  • get_weather(location) - Returns weather information
  • web_search(query) - Performs web searches
  • get_hf_inference_providers() - Returns a comprehensive list of HF inference providers
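The real implementations live in server.py; the sketch below shows only the shape of the first two tools, with placeholder bodies rather than the actual API calls (in server.py each function would be registered with `@mcp.tool()` on a FastMCP instance):

```python
# Hypothetical sketches of the tool bodies. The return strings are
# placeholders, not real API responses.

def get_weather(location: str) -> str:
    """Return a weather summary for the given location."""
    # A real implementation would call a weather API via `requests` here.
    return f"Weather for {location}: sunny, 22°C (placeholder data)"

def web_search(query: str) -> str:
    """Return web search results for the given query."""
    # A real implementation would call a search API here.
    return f"Search results for '{query}': (placeholder data)"

print(get_weather("Tokyo"))
```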

MCP Inspector Setup

The MCP Inspector allows you to debug and test your MCP server directly.

1. Install MCP Inspector

npm install -g @modelcontextprotocol/inspector

2. Run the Inspector

mcp-inspector

3. Connect to Your Server

In the inspector:

  1. Click "Add Server"
  2. Choose "stdio" transport
  3. Set command: uv
  4. Set args: ["run", "--with", "mcp", "mcp", "run", "server.py"]
  5. Click "Connect"

4. Test Tools

Once connected, you can:

  • View available tools in the sidebar
  • Test each tool with different parameters
  • See the JSON-RPC communication
  • Debug any issues
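Every tool test the inspector performs is a JSON-RPC 2.0 `tools/call` request sent over stdio. Building one by hand makes the traffic easier to read (the tool name and arguments here follow the weather examples in this README):

```python
import json

# A tools/call request as defined by the MCP specification (JSON-RPC 2.0).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"location": "Tokyo"},
    },
}

# Serialized form, as it would appear on the server's stdin.
wire = json.dumps(request)
print(wire)
```

Watching for messages of this shape in the inspector's JSON-RPC pane confirms the tool call actually reached the server.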

Example Usage

Weather Queries

» What's the weather in Tokyo?
» Get weather for London
» How's the weather in San Francisco?

Web Search Queries

» Search for "latest AI developments"
» Find information about "MCP protocol"
» Look up "Hugging Face inference providers"

Specific HF Provider Queries

» Show me all Hugging Face inference providers
» What inference providers does HF support?
» List the available HF deployment options

Troubleshooting

Common Issues

  1. "ModuleNotFoundError: No module named 'mcp'"

    uv pip install "mcp[cli]"
    
  2. "KeyError: 'command'"

    • Check your agent.json configuration
    • Ensure the server configuration is correct
  3. "Connection closed" errors

    • Make sure the MCP server is running
    • Check that all dependencies are installed
  4. Agent shows "0 tools"

    • Verify the server is running
    • Check the agent.json configuration
    • Ensure the server command is correct
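The `KeyError: 'command'` case can be caught before launch with a small config check. This is a sketch (the helper name `check_agent_config` is hypothetical); it validates the same fields the example agent.json uses:

```python
import json

# Every stdio server entry in agent.json needs these keys.
REQUIRED_KEYS = {"type", "command", "args"}

def check_agent_config(raw: str) -> list[str]:
    """Return a list of problems found in an agent.json-style config."""
    problems = []
    config = json.loads(raw)
    for i, server in enumerate(config.get("servers", [])):
        missing = REQUIRED_KEYS - server.keys()
        if missing:
            problems.append(f"servers[{i}] is missing: {sorted(missing)}")
    return problems

# A config missing "command" reproduces the error scenario above.
bad = '{"servers": [{"type": "stdio", "args": []}]}'
print(check_agent_config(bad))
```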

Debug Steps

  1. Test the server directly:

    uv run --with mcp python server.py
    
  2. Check MCP server with inspector:

    mcp-inspector
    
  3. Verify dependencies:

    uv pip list | grep mcp
    

Project Structure

mcp/
├── server.py          # MCP server implementation
├── agent.json         # Agent configuration
├── requirements.txt   # Python dependencies
├── uv.lock            # Dependency lock file
└── README.md          # This file

Development

Adding New Tools

To add a new tool to the server:

@mcp.tool()
def your_new_tool(param: str) -> str:
    """Description of what this tool does"""
    return f"Result for {param}"

Modifying Agent Configuration

Edit agent.json to:

  • Change the AI model
  • Add more MCP servers
  • Modify server configurations

Hugging Face Inference Providers

The server includes comprehensive information about HF inference providers:

  1. Amazon SageMaker - Serverless inference with custom Inferentia2 chips
  2. Novita AI - Integrated serverless inference directly on model pages
  3. Together AI - Serverless inference with competitive pricing
  4. Nscale - Official HF provider with high-performance GPU clusters
  5. Inference Endpoints - Dedicated, fully managed infrastructure
  6. Google Cloud - Vertex AI and other deployment options
  7. Microsoft Azure - Azure Machine Learning services
  8. Replicate - Easy-to-use model deployment platform
  9. Banana - Serverless GPU inference platform
  10. Modal - Serverless compute platform
  11. RunPod - GPU cloud computing
  12. Lambda Labs - GPU cloud infrastructure
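Internally, a list like this can be served from a simple mapping. A sketch of how `get_hf_inference_providers` might format it (the data structure is an assumption; names and descriptions are taken from the list above, trimmed for brevity, and in server.py the function would carry `@mcp.tool()`):

```python
# Hypothetical data backing get_hf_inference_providers().
PROVIDERS = {
    "Amazon SageMaker": "Serverless inference",
    "Novita AI": "Integrated serverless inference on model pages",
    "Together AI": "Serverless inference with competitive pricing",
    "Inference Endpoints": "Dedicated, fully managed infrastructure",
}

def get_hf_inference_providers() -> str:
    """Return a numbered, formatted list of HF inference providers."""
    lines = [f"{i}. {name} - {desc}"
             for i, (name, desc) in enumerate(PROVIDERS.items(), start=1)]
    return "\n".join(lines)

print(get_hf_inference_providers())
```

Keeping the data in one mapping means adding a provider is a one-line change.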

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Test with both the agent and inspector
  5. Submit a pull request

License

[Add your license information here]

Support

For issues and questions:

  1. Check the troubleshooting section
  2. Use the MCP inspector to debug
  3. Open an issue on GitHub
  4. Check the MCP documentation

**Happy coding!**
