
# MCP Weather Demo

A demonstration server for the Model Context Protocol (MCP) that provides weather tools and prompts for Claude, allowing LLMs to access external data without manual copy-and-paste.
Built with the Anthropic AI SDK, following the Server Quickstart and the Client Quickstart.
## Getting Started
### Prerequisites
- Bun (v1.2.17)
- asdf (optional)
- asdf-bun (optional)
- Claude Desktop (required if only running the server)
### Installation
If using `asdf`, run:

```shell
asdf install
```

Whether or not you are using `asdf`, install dependencies:

```shell
bun install
```
## Server
The server exposes resources, tools, and prompts to a client. It needs a client to interact with an LLM; Claude Desktop serves as that client if you are only running the server.
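Under the hood, the client and server talk JSON-RPC 2.0 (over stdio in this setup); the SDK builds these messages for you. As a rough sketch of what a tool invocation looks like on the wire (the tool name and payloads here are illustrative, not taken from this repo):

```typescript
// Illustrative shapes of an MCP tools/call exchange, following the MCP spec.
// The SDK normally constructs and parses these messages for you.

type ToolCallRequest = {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
};

type ToolCallResult = {
  jsonrpc: "2.0";
  id: number;
  result: { content: { type: "text"; text: string }[] };
};

// What the client sends when the LLM invokes a (hypothetical) weather tool:
const request: ToolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "get-forecast", arguments: { latitude: 40.7, longitude: -74.0 } },
};

// What the server writes back on stdout, echoing the request id:
const response: ToolCallResult = {
  jsonrpc: "2.0",
  id: request.id,
  result: { content: [{ type: "text", text: "Tonight: clear, low 5°C." }] },
};

console.log(response.result.content[0].text);
```

The text content the server returns is what ends up in the LLM's context, which is why no manual copy-and-paste is needed.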
### Set Up
You need to add the server to Claude Desktop by modifying the `claude_desktop_config.json` file in your `Library/Application Support/Claude` directory. If this file does not exist, you can create it.

```shell
vi ~/Library/Application\ Support/Claude/claude_desktop_config.json
```
Add the following to the file:
```json
{
  "mcpServers": {
    "weather": {
      "command": "/ABSOLUTE/PATH/TO/bin/bun",
      "args": ["/ABSOLUTE/PATH/TO/src/server/index.ts"]
    }
  }
}
```
> ⚠️ **Bun Path**
>
> If you are using `asdf`, you will need to use the absolute path to the `bun` executable. You can find the install directory by running `asdf where bun`; the binary is at `bin/bun` inside that directory.
>
> If you're just using `bun` without `asdf`, you can use `bun` as the command.
### Run
Once you've modified the `claude_desktop_config.json` file, restart Claude Desktop.

You should now see the `weather` tools and prompts in Claude Desktop!
## Client
Instead of using Claude Desktop, you can also run a client to handle the interaction with the LLM.

This is suitable for building a chat interface or web application on top of Anthropic's API. With MCP, you can give the LLM access to data without having to manually copy and paste it into a prompt.
### Set Up
Get an Anthropic API key from the Anthropic API Keys page.
Create a `.env` file in the root of the project and add the following:

```shell
ANTHROPIC_API_KEY=your_api_key_here
```
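Bun loads `.env` from the project root into `process.env` automatically (no `dotenv` package needed), so you can sanity-check the key before wiring up the client. A tiny sketch:

```typescript
// Bun reads .env at startup, so the key is already in process.env.
const key: string = process.env.ANTHROPIC_API_KEY ?? "";

if (key === "") {
  console.log("ANTHROPIC_API_KEY is not set -- create a .env file first");
} else {
  console.log("ANTHROPIC_API_KEY is set");
}
```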
### Run
Now you can run the client:

```shell
bun run dev
```
This gives you an interactive CLI where you can ask the LLM questions. Note that the LLM has access to the tools defined in the server!
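For context, the client's job is a loop: send the conversation to the Messages API with the MCP tools attached, and whenever the model stops with a `tool_use` block, call that tool on the server and feed the result back. A minimal, self-contained sketch of the dispatch step (the block shapes follow Anthropic's Messages API; the reply here is hard-coded for illustration rather than fetched from the API):

```typescript
// Content blocks as returned by the Anthropic Messages API.
type ContentBlock =
  | { type: "text"; text: string }
  | { type: "tool_use"; id: string; name: string; input: Record<string, unknown> };

// A pretend API reply in which the model asks to call an MCP tool
// (the tool name is illustrative):
const reply: { stop_reason: string; content: ContentBlock[] } = {
  stop_reason: "tool_use",
  content: [
    {
      type: "tool_use",
      id: "toolu_01",
      name: "get-forecast",
      input: { latitude: 40.7, longitude: -74.0 },
    },
  ],
};

// Pick out the tool_use blocks; the real client would forward each one to the
// MCP server and send the output back to the model as a tool_result message.
function pendingToolCalls(blocks: ContentBlock[]) {
  return blocks.filter(
    (b): b is Extract<ContentBlock, { type: "tool_use" }> => b.type === "tool_use",
  );
}

const calls = pendingToolCalls(reply.content);
console.log(calls.map((c) => c.name));
```

The loop ends when a reply comes back with no `tool_use` blocks, at which point the text content is printed to the CLI.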