Discover Awesome MCP Servers
Extend your agent with 10,441 capabilities via MCP servers.
- All (10,441)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
OpenAI MCP Server
Mirror.
MCP Server for Running E2E Tests
An end-to-end MCP server for automatically validating your AI-generated code.
BigQuery Analysis MCP Server
A server that executes and validates SQL queries against Google BigQuery, with safety features to prevent data modification and excessive processing.
Unreal Engine Generative AI Support Plugin
UnrealMCP is here! Automatically generate Blueprints and scenes with AI. An Unreal Engine plugin for LLM/GenAI models with an MCP UE5 server. It supports the Claude Desktop App and Cursor, and also includes OpenAI's GPT-4o, DeepSeek R1, and Claude 3.7 Sonnet APIs, with Gemini, Grok 3, audio, and realtime APIs planned soon.
mcp-flux-schnell MCP Server
A TypeScript-based MCP server that implements text-to-image generation using Cloudflare's Flux Schnell model API.
Token Minter MCP
An MCP server that provides AI agents with tools for minting ERC-20 tokens across multiple blockchains.
DVMCP: Data Vending Machine Context Protocol
DVMCP is a bridge implementation that connects Model Context Protocol (MCP) servers to Nostr's Data Vending Machine (DVM) ecosystem.
MCP Server for Stock Market Analysis
NextChat with MCP Server Builder
NextChat and OpenRouter integration with MCP server creation capabilities.
Python CLI Tool for Generating MCP Servers from API Specs
Generates an MCP server using Anthropic's SDK from an OpenAPI or GraphQL specification provided as input.
LLMling
Easy-to-use MCP (Model Context Protocol) servers and AI agents, defined in YAML.
LiteMCP
A TypeScript framework for elegantly building MCP servers.
MCP Server MetaTool
Elixir MCP Server
An example of how you might implement a simplified MCP (Model Context Protocol) server using Elixir and Server-Sent Events (SSE) for transport. This is a basic illustration and would need significant expansion for a real-world application.

**Conceptual Overview**

* **MCP (Model Context Protocol):** In this simplified example, we'll assume MCP messages are JSON objects with a `type` field (e.g., "chat", "location", "event") and a `payload` field containing the data.
* **SSE (Server-Sent Events):** A unidirectional protocol where the server pushes updates to connected clients. Suitable for real-time data streams.
* **Elixir/Phoenix:** We'll use Elixir (a functional programming language) and the Phoenix framework (a web framework built on Elixir) to handle the server-side logic.
* **PubSub:** We'll use Phoenix's built-in PubSub system to broadcast messages to the relevant clients.

**Code Example (Phoenix Application)**

1. **Create a Phoenix project:**

```bash
mix phx.new mcp_sse --no-ecto
cd mcp_sse
```

(The `--no-ecto` flag skips database setup, as we won't be using a database in this simplified example.)

2. **Define a channel:**

Create a new channel file, `lib/mcp_sse_web/channels/mcp_channel.ex`:

```elixir
defmodule McpSseWeb.McpChannel do
  use Phoenix.Channel
  require Logger

  def join("mcp:" <> topic, _payload, socket) do
    Logger.info("Client joined topic: #{topic}")
    {:ok, socket}
  end

  def handle_in("mcp_message", payload, socket) do
    # Process the MCP message
    case process_mcp_message(payload) do
      {:ok, broadcast_payload} ->
        # Broadcast the processed message to everyone subscribed to this topic
        broadcast!(socket, "mcp_update", broadcast_payload)
        {:noreply, socket}

      {:error, reason} ->
        Logger.error("Error processing MCP message: #{reason}")
        {:reply, {:error, reason}, socket}
    end
  end

  defp process_mcp_message(payload) do
    # Example: validate the message and potentially transform it
    try do
      message = Jason.decode!(payload)

      case message do
        %{"type" => _type, "payload" => _} -> {:ok, message} # Just pass it through for now
        _ -> {:error, "Invalid MCP message format"}
      end
    rescue
      _ -> {:error, "Invalid JSON"}
    end
  end
end
```

3. **Register the socket and channel:**

The socket is declared in `lib/mcp_sse_web/endpoint.ex`, and the channel is registered in the socket module, `lib/mcp_sse_web/channels/user_socket.ex`:

```elixir
# lib/mcp_sse_web/endpoint.ex
socket "/socket", McpSseWeb.UserSocket,
  websocket: true,
  longpoll: false

# lib/mcp_sse_web/channels/user_socket.ex
channel "mcp:*", McpSseWeb.McpChannel
```

4. **Create a route for SSE:**

In `lib/mcp_sse_web/router.ex`, add a route:

```elixir
scope "/", McpSseWeb do
  pipe_through :browser

  get "/", PageController, :index
  get "/sse/:topic", SseController, :stream
end
```

5. **Create an SSE controller:**

Create `lib/mcp_sse_web/controllers/sse_controller.ex`:

```elixir
defmodule McpSseWeb.SseController do
  use McpSseWeb, :controller
  require Logger

  def stream(conn, %{"topic" => topic}) do
    # Subscribe once, before entering the streaming loop
    Phoenix.PubSub.subscribe(McpSse.PubSub, "mcp:" <> topic)

    conn
    |> put_resp_content_type("text/event-stream")
    |> put_resp_header("cache-control", "no-cache")
    |> send_chunked(200)
    |> stream_events(topic)
  end

  defp stream_events(conn, topic) do
    receive do
      # Channel broadcasts arrive as %Phoenix.Socket.Broadcast{} messages
      %Phoenix.Socket.Broadcast{event: "mcp_update", payload: message} ->
        Logger.info("Sending SSE event to topic #{topic}: #{inspect(message)}")
        # chunk/2 returns {:ok, conn}, so match on it before recursing
        {:ok, conn} = chunk(conn, "event: message\ndata: #{Jason.encode!(message)}\n\n")
        stream_events(conn, topic) # Continue listening
    after
      :infinity ->
        Logger.warning("SSE stream timed out for topic #{topic}")
        conn
    end
  end
end
```

6. **Update `application.ex`:**

Make sure `McpSse.PubSub` is started in your application's supervision tree. In `lib/mcp_sse/application.ex`:

```elixir
def start(_type, _args) do
  children = [
    McpSseWeb.Telemetry,
    {Phoenix.PubSub, name: McpSse.PubSub}, # Add this line
    McpSseWeb.Endpoint
  ]

  opts = [strategy: :one_for_one, name: McpSse.Supervisor]
  Supervisor.start_link(children, opts)
end
```

7. **Update `page_controller.ex`:**

```elixir
defmodule McpSseWeb.PageController do
  use McpSseWeb, :controller

  def index(conn, _params) do
    render(conn, "index.html")
  end
end
```

8. **Create `index.html.heex`:**

```html
<h1>MCP SSE Example</h1>
<div id="sse-output"></div>

<script>
  const topic = "my_topic"; // Replace with your desired topic
  const eventSource = new EventSource(`/sse/${topic}`);

  eventSource.onmessage = (event) => {
    const data = JSON.parse(event.data);
    const outputDiv = document.getElementById("sse-output");
    outputDiv.innerHTML += `<p>Received: ${JSON.stringify(data)}</p>`;
  };

  eventSource.onerror = (error) => {
    console.error("SSE error:", error);
  };
</script>
```

**Explanation:**

* **`McpChannel`:** Handles WebSocket connections. It receives `mcp_message` events, processes them (in `process_mcp_message`), and then broadcasts the processed message on the channel's topic via Phoenix PubSub. The `process_mcp_message` function is a placeholder for your actual MCP message validation and transformation logic.
* **`SseController`:** Handles SSE connections. When a client connects to `/sse/:topic`, the `stream` action subscribes to the PubSub topic, sets the correct headers for SSE, and calls `stream_events`, which waits for `mcp_update` broadcasts. When a message arrives, it is formatted as an SSE event and sent to the client with `chunk`.
* **`Phoenix.PubSub`:** A publish-subscribe system. The `McpChannel` publishes messages to topics, and the `SseController` subscribes to those topics.
* **Client-side JavaScript:** The JavaScript in `index.html.heex` creates an `EventSource` that connects to the `/sse/:topic` endpoint. It listens for `message` events and displays the received data in the `sse-output` div.

**How to Run:**

1. Install dependencies: `mix deps.get`
2. Start the Phoenix server: `mix phx.server`
3. Open your browser to `http://localhost:4000`.

**Testing (Sending MCP Messages):**

You can use `iex` to simulate sending MCP messages.

1. Open a new terminal and start `iex`: `iex -S mix phx.server`
2. Broadcast an `mcp_update` event on the topic through the endpoint (this is what the channel's `broadcast!` does under the hood):

```elixir
McpSseWeb.Endpoint.broadcast(
  "mcp:my_topic",
  "mcp_update",
  %{"type" => "chat", "payload" => %{"message" => "Hello from IEx!"}}
)
```

(Replace `"my_topic"` with the topic you're using in your client.) You should see the message appear in the browser.

**Important Considerations and Next Steps:**

* **Error handling:** The error handling in this example is very basic. You'll need to add more robust error handling and logging.
* **Authentication/authorization:** This example has no authentication. You'll need to implement authentication and authorization to control who can send and receive messages; Phoenix provides good mechanisms for this.
* **Message validation:** The `process_mcp_message` function is a placeholder. You'll need to implement proper validation of MCP messages to ensure they are well formed and contain the expected data.
* **Scalability:** For a production system you'll need to consider scalability. Phoenix channels and PubSub are generally scalable, but you may need a distributed PubSub adapter (e.g., backed by Redis) for very high message rates.
* **Message persistence:** This example doesn't persist messages. If you need to store messages, integrate a database (e.g., using Ecto).
* **MCP definition:** You'll need a formal definition of your MCP message types and their schemas, plus a schema validation library to enforce them.
* **Client-side reconnection:** The client-side JavaScript should handle reconnection if the SSE connection is lost. The `EventSource` API has built-in reconnection logic, but you may want to customize it.
* **Binary data:** If you need to send binary data, encode it (e.g., using Base64) before sending it over SSE.

This example provides a starting point for building an MCP-style server with Elixir and SSE. Adapt it to your specific requirements and add the features needed for a production-ready system.
eRegulations MCP Server
An implementation of a Model Context Protocol server that provides structured, AI-friendly access to eRegulations data, making it easier for AI models to answer user questions about administrative procedures.
mcp-weather-server
Here is an example model context protocol server that provides weather data to an LLM:

```python
from typing import Any, Dict, Optional

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

# Simulated weather data
WEATHER_DATA = {
    "San Francisco": {"temperature": 15, "condition": "Cloudy"},
    "New York": {"temperature": 22, "condition": "Sunny"},
    "London": {"temperature": 18, "condition": "Rainy"},
    "Tokyo": {"temperature": 25, "condition": "Clear"},
}


class ContextRequest(BaseModel):
    """Request body the LLM sends when asking for context."""
    query: str
    location: Optional[str] = None  # Optional location hint


class ContextResponse(BaseModel):
    """Context information the server returns to the LLM."""
    context: Dict[str, Any]


app = FastAPI()


@app.post("/context")
async def get_context(request: ContextRequest) -> ContextResponse:
    """Provide context information based on the LLM's query."""
    print(f"Received query: {request.query}")
    print(f"Received location: {request.location}")

    location = request.location
    if not location:
        # If no location was provided, try to extract one from the query.
        # This is a very naive example; a real application needs proper NLP.
        if "San Francisco" in request.query:
            location = "San Francisco"
        elif "New York" in request.query:
            location = "New York"
        elif "London" in request.query:
            location = "London"
        elif "Tokyo" in request.query:
            location = "Tokyo"
        else:
            raise HTTPException(
                status_code=400,
                detail="Location not specified and could not be inferred from query.",
            )

    if location not in WEATHER_DATA:
        raise HTTPException(
            status_code=404,
            detail=f"Weather data not found for location: {location}",
        )

    weather = WEATHER_DATA[location]
    context = {
        "location": location,
        "temperature": weather["temperature"],
        "condition": weather["condition"],
    }
    print(f"Returning context: {context}")
    return ContextResponse(context=context)


if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```

**Code explanation:**

1. **Imports:** `fastapi` builds the API, `pydantic` handles data validation and serialization, and `typing` provides type hints; `uvicorn` (imported in the `__main__` block) runs the server.
2. **Simulated weather data:** `WEATHER_DATA` is a dictionary of weather for a few cities. It is mock data; a real application would fetch it from an external API or a database.
3. **Data models:** `ContextRequest` describes the request body from the LLM, with `query` (the LLM's question) and an optional `location`. `ContextResponse` describes what the server returns: a single `context` dictionary.
4. **FastAPI application:** `app = FastAPI()` creates the application instance.
5. **The `/context` endpoint:** `@app.post("/context")` defines a POST endpoint whose async handler, `get_context`, receives a `ContextRequest` and returns a `ContextResponse`.
6. **Request handling:** The handler logs the incoming query and location, uses `request.location` if it is set, and otherwise tries to infer the location from the query text (a deliberately naive substring match). It returns HTTP 400 if no location can be determined, HTTP 404 if there is no weather data for the location, and otherwise builds a `context` dictionary containing `location`, `temperature`, and `condition` and returns it wrapped in a `ContextResponse`.
7. **Running the application:** The `if __name__ == "__main__":` block runs the app with Uvicorn on port 8000, listening on all interfaces (`0.0.0.0`).

**How to run:**

1. Install dependencies:

```bash
pip install fastapi uvicorn pydantic
```

2. Run the script:

```bash
python your_script_name.py
```

3. **Test the endpoint:** Use `curl` or any other HTTP client, for example:

```bash
curl -X POST -H "Content-Type: application/json" \
  -d '{"query": "What is the weather in San Francisco?", "location": "San Francisco"}' \
  http://localhost:8000/context
```

Or omit `location` and let the server try to infer it from `query`:

```bash
curl -X POST -H "Content-Type: application/json" \
  -d '{"query": "What is the weather in San Francisco?"}' \
  http://localhost:8000/context
```

**Important notes:**

* **Real data sources:** This example uses simulated weather data. A real application should fetch data from a weather API (e.g., OpenWeatherMap, AccuWeather) or a database.
* **NLP:** The location-extraction logic here is extremely simple. A real application should use proper NLP techniques (e.g., named-entity recognition) to extract locations accurately.
* **Error handling:** Only basic error handling is included. Production code needs more complete error handling, such as logging and retries.
* **Security:** In production, consider authentication and authorization.
* **Scalability:** For large request volumes, consider load balancing and caching.
* **Model context protocol:** This example follows the basic idea of a model context protocol: it accepts a query from an LLM and returns relevant context. Adjust the request and response formats to match your LLM integration's specific requirements.

This example is a basic framework that you can modify and extend to fit your own needs.
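Note that the example above exposes a plain HTTP `/context` endpoint rather than the actual MCP wire protocol. For comparison, here is a minimal sketch (an illustration, not this project's code) of how the same weather lookup could be exposed as an MCP tool using the official `mcp` Python SDK; the server name, tool name, and mock data are assumed placeholders.

```python
# Minimal sketch using the official MCP Python SDK (installed with `pip install mcp`).
# The server name, tool name, and mock data are illustrative placeholders, not this project's code.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")  # hypothetical server name

# Same style of mock data as the FastAPI example above
WEATHER_DATA = {
    "San Francisco": {"temperature": 15, "condition": "Cloudy"},
    "New York": {"temperature": 22, "condition": "Sunny"},
}


@mcp.tool()
def get_weather(location: str) -> dict:
    """Return the current (mock) weather for a known city."""
    if location not in WEATHER_DATA:
        raise ValueError(f"Weather data not found for location: {location}")
    return {"location": location, **WEATHER_DATA[location]}


if __name__ == "__main__":
    mcp.run()  # defaults to the stdio transport, so an MCP client can launch this script directly
```

With the SDK, tool discovery and request/response framing are handled by the protocol implementation, so the server only needs to declare its tools.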
Data Visualization MCP Server
Mirror.
Wikipedia MCP Image Crawler
A Wikipedia image search tool. It respects Creative Commons licensing and lets you use the images in your projects via Claude Desktop/Cline.
AlphaVantage MCP Server
An MCP server that integrates with the AlphaVantage financial data API, providing access to stock market data, technical indicators, and fundamental financial information.
Jira MCP Server
A Model Context Protocol server that provides Jira integration, allowing large language models to interact with Jira projects, boards, sprints, and issues using natural language.
GitLab MCP Server Tools
Configuration, adapters, and troubleshooting tools for GitLab MCP server implementations.
Anki MCP Server
A Model Context Protocol server that allows large language models (LLMs) to interact with the Anki flashcard software, enabling capabilities such as creating decks, adding notes, searching cards, and managing flashcard content through natural language.
MCP Prompt Server
A Model Context Protocol server that provides predefined prompt templates for tasks such as code review and API documentation generation, enabling more efficient workflows in the Cursor/Windsurf editors.
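For orientation, "prompt templates" here refers to MCP's prompt primitive: a server exposes named prompts that clients such as Cursor or Windsurf can list and invoke. Below is a minimal, hypothetical sketch using the official `mcp` Python SDK; the server name and `code_review` prompt are illustrative assumptions, not this project's actual implementation.

```python
# Hypothetical sketch of an MCP prompt template using the official Python SDK (`pip install mcp`).
# The server name and prompt below are illustrative only, not MCP Prompt Server's own code.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("prompt-server")  # placeholder server name


@mcp.prompt()
def code_review(code: str) -> str:
    """A reusable prompt template for code review."""
    return f"Please review the following code for bugs, style, and clarity:\n\n{code}"


if __name__ == "__main__":
    mcp.run()  # stdio transport, so an editor-based MCP client can launch the server
```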

MCP Server Giphy
Enables AI models to search, retrieve, and use GIFs from Giphy, with features such as content filtering, multiple search methods, and comprehensive metadata.
LlamaCloud MCP Server
Mirror.
piapi-mcp-server
Mirror.
Linear MCP Server
A Model Context Protocol server that enables large language models to interact with Linear's issue tracking system, managing issues, projects, teams, and other Linear resources.
MCP Tools
A command-line interface for interacting with MCP (Model Context Protocol) servers over standard input/output (stdio) and HTTP transports.
Wikipedia
Mattermost MCP Server
An MCP server that enables Claude and other MCP clients to interact with Mattermost workspaces, providing channel management, messaging, and topic-monitoring capabilities.