MCP Weather & Web Search Agent
A Model Context Protocol (MCP) server that provides weather information and web search capabilities, designed to work with Hugging Face's tiny-agents framework.
Features
- Weather Service: Get weather information for any location
- Web Search: Search the web for information (with special support for Hugging Face inference providers)
- AI Agent Integration: Works seamlessly with tiny-agents for conversational AI
- MCP Inspector Support: Debug and inspect server capabilities
Prerequisites
- Python 3.10 or higher
- uv package manager
- Node.js (for MCP inspector)
- Hugging Face account (for tiny-agents)
Installation
1. Clone or download this project:

   git clone https://github.com/Deon62/mcp.git
   cd mcps

2. Install Python dependencies (quote the extra so zsh does not expand the brackets):

   uv pip install "mcp[cli]" requests

3. Install tiny-agents (if not already installed):

   pip install tiny-agents
Quick Start
1. Run the MCP Server
Start the MCP server in one terminal:
uv run --with mcp mcp run server.py
The server will start and wait for connections.
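Under the hood, the client (tiny-agents or the inspector) launches the server as a subprocess and exchanges JSON-RPC messages with it over stdin/stdout. As a rough sketch, the first message the client sends looks something like this (field values are illustrative; the `clientInfo` name is made up):

```python
import json

# A minimal JSON-RPC "initialize" request, roughly what an MCP client
# sends to the server over stdin when the session starts.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}
print(json.dumps(initialize))
```

The inspector (described below) lets you watch these messages live.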
2. Run the AI Agent
In another terminal, start the agent:
tiny-agents run agent.json
You should see:
Agent loaded with 3 tools:
• get_weather
• web_search
• get_hf_inference_providers
»
3. Chat with the Agent
Once the agent is running, you can interact with it:
» Hello! Can you help me find information about Hugging Face inference providers?
Available Tools
1. Weather Service
» What's the weather like in New York?
2. Web Search
» Search for "Hugging Face inference providers"
3. HF Inference Providers
» Get me the list of Hugging Face inference providers
Configuration
Agent Configuration (agent.json)
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "--with", "mcp", "mcp", "run", "server.py"]
    }
  ]
}
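The troubleshooting section below mentions a "KeyError: 'command'" failure, which usually means a server entry in agent.json is missing a key. A small sketch like the following (a hypothetical helper, not part of this project) can catch that before launching the agent:

```python
import json

REQUIRED_SERVER_KEYS = {"type", "command", "args"}

def validate_agent_config(config: dict) -> list[str]:
    """Return a list of problems found in an agent.json-style dict."""
    problems = []
    if "model" not in config:
        problems.append("missing 'model'")
    for i, server in enumerate(config.get("servers", [])):
        missing = REQUIRED_SERVER_KEYS - server.keys()
        if missing:
            problems.append(f"server {i}: missing {sorted(missing)}")
    return problems

config = json.loads("""
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "--with", "mcp", "mcp", "run", "server.py"]
    }
  ]
}
""")
print(validate_agent_config(config))  # an empty list means the config looks sane
```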
Server Configuration (server.py)
The server provides three main tools:
- get_weather(location) - Returns weather information for a location
- web_search(query) - Performs web searches
- get_hf_inference_providers() - Returns a comprehensive list of HF inference providers
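server.py itself is not reproduced here, but the three tools could be plain functions along these lines (a sketch with placeholder return values, not the actual implementation; in server.py each would be registered with the @mcp.tool() decorator):

```python
# Hypothetical sketch of the three tools as plain functions.
HF_INFERENCE_PROVIDERS = [
    "Amazon SageMaker", "Novita AI", "Together AI", "Nscale",
    "Inference Endpoints", "Google Cloud", "Microsoft Azure",
    "Replicate", "Banana", "Modal", "RunPod", "Lambda Labs",
]

def get_weather(location: str) -> str:
    """Return weather information for a location (placeholder data)."""
    return f"Weather for {location}: 22°C, partly cloudy"

def web_search(query: str) -> str:
    """Perform a web search (placeholder result)."""
    return f"Top results for '{query}': ..."

def get_hf_inference_providers() -> str:
    """Return a formatted list of HF inference providers."""
    return "\n".join(f"- {p}" for p in HF_INFERENCE_PROVIDERS)

print(get_hf_inference_providers())
```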
MCP Inspector Setup
The MCP Inspector allows you to debug and test your MCP server directly.
1. Install MCP Inspector
npm install -g @modelcontextprotocol/inspector
2. Run the Inspector
mcp-inspector
3. Connect to Your Server
In the inspector:
- Click "Add Server"
- Choose "stdio" transport
- Set command: uv
- Set args: ["run", "--with", "mcp", "mcp", "run", "server.py"]
- Click "Connect"
4. Test Tools
Once connected, you can:
- View available tools in the sidebar
- Test each tool with different parameters
- See the JSON-RPC communication
- Debug any issues
Example Usage
Weather Queries
» What's the weather in Tokyo?
» Get weather for London
» How's the weather in San Francisco?
Web Search Queries
» Search for "latest AI developments"
» Find information about "MCP protocol"
» Look up "Hugging Face inference providers"
Specific HF Provider Queries
» Show me all Hugging Face inference providers
» What inference providers does HF support?
» List the available HF deployment options
Troubleshooting
Common Issues
- "ModuleNotFoundError: No module named 'mcp'"
  - Install the missing package: uv pip install "mcp[cli]"
- "KeyError: 'command'"
  - Check your agent.json configuration
  - Ensure the server configuration is correct
- "Connection closed" errors
  - Make sure the MCP server is running
  - Check that all dependencies are installed
- Agent shows "0 tools"
  - Verify the server is running
  - Check the agent.json configuration
  - Ensure the server command is correct
Debug Steps
- Test the server directly: python server.py
- Check the MCP server with the inspector: mcp-inspector
- Verify dependencies: uv pip list | grep mcp
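As an alternative to grepping pip output, a quick stdlib check (a throwaway sketch, not part of the project) can confirm the required packages are importable from the environment the server actually runs in:

```python
import importlib.util

def check_deps(names):
    """Return the subset of names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

missing = check_deps(["mcp", "requests"])
if missing:
    print("Missing packages:", ", ".join(missing))
else:
    print("All dependencies installed.")
```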
Project Structure
mcps/
├── server.py # MCP server implementation
├── agent.json # Agent configuration
├── requirements.txt # Python dependencies
├── uv.lock # Dependency lock file
└── README.md # This file
Development
Adding New Tools
To add a new tool to the server:
```python
@mcp.tool()
def your_new_tool(param: str) -> str:
    """Description of what this tool does."""
    return f"Result for {param}"
```
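Since tool implementations are ordinary Python functions, you can sanity-check the logic directly before wiring it into the server (sketch below uses the undecorated function; the decorator typically only registers the tool and leaves the call signature alone):

```python
def your_new_tool(param: str) -> str:
    """Description of what this tool does."""
    return f"Result for {param}"

# Call it like any other function to verify the behavior.
print(your_new_tool("demo"))  # Result for demo
```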
Modifying Agent Configuration
Edit agent.json to:
- Change the AI model
- Add more MCP servers
- Modify server configurations
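For example, adding a second MCP server is a matter of appending another entry to the servers array (the second server file name below is hypothetical):

```json
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "--with", "mcp", "mcp", "run", "server.py"]
    },
    {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "--with", "mcp", "mcp", "run", "another_server.py"]
    }
  ]
}
```

The agent will load the tools from every server listed.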
Hugging Face Inference Providers
The server includes comprehensive information about HF inference providers:
- Amazon SageMaker - Serverless inference with custom Inferentia2 chips
- Novita AI - Integrated serverless inference directly on model pages
- Together AI - Serverless inference with competitive pricing
- Nscale - Official HF provider with high-performance GPU clusters
- Inference Endpoints - Dedicated, fully managed infrastructure
- Google Cloud - Vertex AI and other deployment options
- Microsoft Azure - Azure Machine Learning services
- Replicate - Easy-to-use model deployment platform
- Banana - Serverless GPU inference platform
- Modal - Serverless compute platform
- RunPod - GPU cloud computing
- Lambda Labs - GPU cloud infrastructure
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Test with both the agent and inspector
- Submit a pull request
License
[Add your license information here]
Support
For issues and questions:
- Check the troubleshooting section
- Use the MCP inspector to debug
- Open an issue on GitHub
- Check the MCP documentation
**Happy coding!**