MCP-ChatBot
Simple MCP Client-Server example
muralianand12345
A versatile chatbot application that uses the Model Context Protocol (MCP) to interact with multiple service backends.
Overview
MCP-ChatBot is a containerized application that demonstrates how the Model Context Protocol (MCP) enables LLM interactions with external services. This implementation includes a weather service backend and a Streamlit-based frontend that lets users query for weather information in natural language.
Features
- Containerized Architecture: Separate containers for the MCP server and client application
- Weather Service Integration: Real-time weather data using WeatherAPI
- Streamlit UI: Clean, responsive user interface for interacting with the chatbot
- Extensible Design: Ready to add more MCP servers for additional capabilities
- GPT-4o Integration: Powered by OpenAI's GPT-4o model for natural language understanding
Architecture
The application consists of two main components:
- MCP Server: A FastMCP-based service that handles weather data retrieval
- Streamlit Client: A web-based UI for interacting with the chatbot
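The weather server's source isn't reproduced in this README, but the core of such a tool can be sketched with stdlib Python. The function names and the one-line summary format below are illustrative, not taken from server.py; `current.json` is WeatherAPI's real current-conditions endpoint. In the actual server, a function like `get_weather` would be registered as a FastMCP tool and exposed over SSE:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

WEATHER_API_URL = "https://api.weatherapi.com/v1/current.json"

def build_weather_url(city: str, api_key: str) -> str:
    """Build the WeatherAPI request URL for a city query."""
    return f"{WEATHER_API_URL}?{urlencode({'key': api_key, 'q': city})}"

def summarize_weather(payload: dict) -> str:
    """Condense a WeatherAPI JSON payload into one line for the LLM."""
    location = payload["location"]["name"]
    condition = payload["current"]["condition"]["text"]
    temp_c = payload["current"]["temp_c"]
    return f"{condition}, {temp_c}°C in {location}"

def get_weather(city: str, api_key: str) -> str:
    """Fetch and summarize current weather (hypothetical tool body)."""
    with urlopen(build_weather_url(city, api_key)) as resp:
        return summarize_weather(json.load(resp))
```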
Getting Started
Prerequisites
- Docker and Docker Compose
- WeatherAPI Key (sign up at WeatherAPI)
- OpenAI API Key (sign up at OpenAI)
Installation
- Clone the repository:
git clone https://github.com/yourusername/MCP-ChatBot.git
cd MCP-ChatBot
- Create a .env file based on the example:
cp .env.example .env
- Edit the .env file and add your API keys:
PYTHONUNBUFFERED=1
OPENAI_API_KEY=your_openai_api_key_here
WEATHER_API_KEY=your_weather_api_key_here
- Launch the application using Docker Compose:
docker-compose up --build
- Access the Streamlit UI at:
http://localhost:8501
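Docker Compose passes the .env values into both containers as environment variables. A fail-fast check like the following (a generic sketch, not code from this repo) makes missing-key mistakes obvious at startup rather than on the first request:

```python
import os

def require_env(name: str) -> str:
    """Return a required environment variable or fail with a clear hint."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; check your .env file")
    return value

# At startup, the keys from .env would be validated like this:
# openai_key = require_env("OPENAI_API_KEY")
# weather_key = require_env("WEATHER_API_KEY")
```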
Usage
Once the application is running, you can interact with the chatbot through the Streamlit interface:
- Type natural language queries about weather in the text input
- Examples:
- "What's the weather like in New York?"
- "How hot is it in Tokyo right now?"
- "Tell me about the weather in London"
Extending the Application
Adding New MCP Servers
- Create a new server file in the servers directory
- Add the new service to the docker-compose.yml file
- Update client.py to include the new server in the agent configuration
Example of adding a new server in client.py:
server_1 = MCPServerHTTP(url="http://mcp_server:8001/sse")
server_2 = MCPServerHTTP(url="http://new_server:8002/sse") # New server
return Agent("openai:gpt-4o", mcp_servers=[server_1, server_2])
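The URLs in the snippet above follow Docker Compose's service-name DNS convention: inside the Compose network, containers resolve each other by the service key from docker-compose.yml, so each endpoint has the shape `http://<service>:<port>/sse`. A tiny helper (hypothetical, not in client.py) makes that convention explicit:

```python
def sse_url(service: str, port: int) -> str:
    """Build the in-network SSE endpoint URL for a Compose service.

    The hostname is the service name from docker-compose.yml, which
    Docker's embedded DNS resolves inside the Compose network.
    """
    return f"http://{service}:{port}/sse"
```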
Development
Project Structure
MCP-ChatBot/
├── .env.example # Example environment variables
├── .gitignore # Git ignore file
├── client.py # Streamlit client application
├── docker-compose.yml # Docker Compose configuration
├── Dockerfile.client # Dockerfile for Streamlit client
├── Dockerfile.server # Dockerfile for MCP server
├── LICENSE # MIT license
├── README.md # Project documentation
└── servers/ # MCP server implementations
└── server.py # Weather service implementation
Technology Stack
- FastMCP: Framework for creating MCP servers
- Streamlit: Web framework for the UI
- Pydantic-AI: Agent system for LLM interactions
- Docker: Containerization platform
- OpenAI GPT-4o: LLM for natural language processing
Troubleshooting
Common Issues
- Connection Failed: Ensure that all services are up and running. The client has a retry mechanism, but if it fails, restart the application.
- API Key Errors: Verify that you've added valid API keys to the .env file.
- Docker Network Issues: If containers can't communicate, check the Docker network configuration and ensure the service names match in the code.
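The client's retry mechanism isn't shown in this README; a generic retry loop like the one below illustrates the idea (function and parameter names are illustrative, not from client.py). Retrying is useful here because Compose may start the client before the MCP server is accepting connections:

```python
import time

def connect_with_retry(connect, attempts: int = 3, delay: float = 1.0):
    """Call connect() up to `attempts` times, sleeping between failures."""
    last_error = None
    for attempt in range(attempts):
        try:
            return connect()
        except ConnectionError as exc:
            last_error = exc
            if attempt < attempts - 1:
                time.sleep(delay)
    raise last_error
```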
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- FastMCP for the MCP server implementation
- Streamlit for the frontend framework
- WeatherAPI for weather data
Created by Murali Anand © 2025