Discover Awesome MCP Servers
Extend your agent with 14,392 capabilities via MCP servers.
- All (14,392)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
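Whichever category you start from, most of the servers below attach to an agent the same way: the client launches the server process and talks to it over stdio (or connects to an SSE endpoint). Here is a minimal sketch assuming the official `mcp` Python SDK's stdio client; the `npx` package name is a placeholder for whatever launch command a given server's README specifies.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder launch command: replace with the command from the chosen server's README.
server = StdioServerParameters(command="npx", args=["-y", "@example/some-mcp-server"])


async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover what the server offers before wiring it into an agent.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


asyncio.run(main())
```

Many desktop clients do the equivalent declaratively via an `mcpServers` entry in their configuration file rather than in code.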
MCP LLM Bridge
Implements MCP to enable communication between MCP servers and OpenAI-compatible LLMs.
File Converter MCP Server
An MCP server that provides AI agents with multiple file-conversion tools covering common document and image formats, including DOCX to PDF, PDF to DOCX, image format conversion, Excel to CSV, HTML to PDF, and Markdown to PDF.
LSPD Interrogation MCP Server
A Model Context Protocol server that simulates police interrogations, letting users create officer profiles and run dynamic interrogations with simulated suspect responses driven by configurable parameters such as stress level, evidence, and crime type.
BOLD MCP Server
Connects an MCP server to a local LLM to access the BOLD REST API.
Manus MCP
An MCP server that provides Manus-like capabilities.
Gemini Image Generator MCP Server
Lets AI assistants generate and transform high-quality images from text prompts using Google's Gemini model over the MCP protocol.
Tester Client for Model Context Protocol (MCP)
A Model Context Protocol (MCP) client for Apify Actors.
Memory MCP Server
MCP Servers
This repository contains my learning notes on how to build MCP servers.
JigsawStack MCP Server
A Model Context Protocol server that allows AI models to interact with JigsawStack models.
mcp-server
GitHub MCP Server
A Model Context Protocol server that enables LLM agents to manage GitHub repositories, issues, pull requests, branches, files, and releases through a standardized interface.
Hedera MCP Server
A Model Context Protocol server for interacting with the Hedera network, providing tools for wallet creation, balance queries, transaction building, and submitting signed transactions.
Brest MCP Server
Starlette MCP SSE
A working example of a Starlette server with Server-Sent Events (SSE) based support for MCP (Model Context Protocol). It demonstrates a basic setup: a Starlette application, an SSE endpoint that streams events to connected clients, a simplified message structure with a type and a data payload, and basic message handling on the server.

```python
import asyncio
import json

from starlette.applications import Starlette
from starlette.responses import StreamingResponse
from starlette.routing import Route


# Simplified MCP message structure: a type plus a data payload, serialized to JSON.
class MCPMessage:
    def __init__(self, type: str, data: dict):
        self.type = type
        self.data = data

    def to_json(self) -> str:
        return json.dumps({"type": self.type, "data": self.data})


# In-memory queue (for demonstration only); it decouples message producers from the
# SSE endpoint, so any part of the application can enqueue messages.
message_queue: asyncio.Queue = asyncio.Queue()


async def event_stream(request):
    async def generate_events():
        try:
            while True:
                message: MCPMessage = await message_queue.get()
                # SSE framing: each event is prefixed with "data: " and terminated
                # by a blank line.
                yield f"data: {message.to_json()}\n\n"
        except asyncio.CancelledError:
            # Raised when the client disconnects; catching it keeps the server from
            # erroring every time a client closes the connection.
            print("Client disconnected, stopping event stream.")
        finally:
            print("Event stream generator finished.")

    return StreamingResponse(generate_events(), media_type="text/event-stream")


async def send_test_messages():
    """Simulate producers. In a real application these messages would come from
    other parts of the system (background tasks, API endpoints, ...)."""
    await asyncio.sleep(1)
    for i in range(5):
        message = MCPMessage(type="test_event", data={"message": f"Test message {i}"})
        await message_queue.put(message)
        print(f"Sent message: {message.to_json()}")
        await asyncio.sleep(2)
    message = MCPMessage(type="status_update", data={"status": "Completed!"})
    await message_queue.put(message)
    print(f"Sent message: {message.to_json()}")


async def startup():
    # Start background producers when the application starts.
    asyncio.create_task(send_test_messages())


routes = [Route("/events", endpoint=event_stream)]
app = Starlette(debug=True, routes=routes, on_startup=[startup])

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```

How to run:

1. Save the code as a Python file (e.g., `sse_mcp_server.py`).
2. Install the dependencies: `pip install starlette uvicorn`.
3. Run it: `python sse_mcp_server.py`.
4. Test with `curl -N http://localhost:8000/events` (the `-N` flag disables output buffering so events appear as they arrive).

A browser client can consume the same stream with the `EventSource` API:

```html
<!DOCTYPE html>
<html>
  <head><title>SSE MCP Client</title></head>
  <body>
    <h1>SSE MCP Client</h1>
    <div id="events"></div>
    <script>
      const eventSource = new EventSource('http://localhost:8000/events');
      eventSource.onmessage = (event) => {
        const message = JSON.parse(event.data);
        document.getElementById('events').innerHTML +=
          `<p>Type: ${message.type}, Data: ${JSON.stringify(message.data)}</p>`;
      };
      eventSource.onerror = (error) => {
        console.error('SSE error:', error);
        eventSource.close();
      };
    </script>
  </body>
</html>
```

Considerations for production use:

* **Error handling:** handle connection errors, message-parsing errors, and other failures robustly on both server and client.
* **Scalability:** replace the in-memory `asyncio.Queue` with a scalable message queue such as Redis or RabbitMQ.
* **Authentication/authorization:** protect the SSE endpoint.
* **Connection management:** track connected clients and handle disconnections gracefully.
* **Message format:** define a clear, consistent message schema and validate it.
* **Heartbeats:** periodically send a ping so dead connections can be detected and closed (a minimal sketch follows this entry).
* **Reconnection:** `EventSource` reconnects automatically, but the behaviour may need tuning.
* **Buffering:** server- or client-side buffering can delay delivery; tune buffer sizes if needed.
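As a follow-up to the heartbeat point above, here is a minimal sketch of a ping producer that reuses `message_queue` and `MCPMessage` from the example and would be scheduled from `startup()` alongside `send_test_messages()`. The 15-second interval and the `"ping"` event type are arbitrary choices.

```python
import asyncio
import time

# Assumes message_queue and MCPMessage from the server example above are in scope.


async def send_heartbeats(interval: float = 15.0) -> None:
    """Periodically enqueue a ping so clients (and intermediaries) see traffic even
    when no real messages flow; a client that stops receiving pings can treat the
    connection as dead and reconnect."""
    while True:
        await message_queue.put(MCPMessage(type="ping", data={"ts": time.time()}))
        await asyncio.sleep(interval)
```

An alternative is to emit SSE comment lines (lines starting with `:`) from the generator itself, which keeps the connection alive without delivering ping events to application code.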
MCP Server ODBC via SQLAlchemy
Provides SQLAlchemy connectivity via pyodbc to any database management system (DBMS) that SQLAlchemy can reach.
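For context on what "SQLAlchemy via pyodbc" means in practice, here is a minimal sketch of the underlying connectivity such a server builds on (not the server's own tool API); the dialect, DSN name, and credentials are placeholders.

```python
from sqlalchemy import create_engine, text

# "mssql+pyodbc" is one example dialect; any SQLAlchemy URL reachable through an
# ODBC DSN (or another supported driver) works the same way.
engine = create_engine("mssql+pyodbc://user:password@my_odbc_dsn")

with engine.connect() as conn:
    for row in conn.execute(text("SELECT 1 AS ok")):
        print(row.ok)
```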
azure-mcp-server
A Model Context Protocol (MCP) server that provides tools, prompts, and resources for interacting with and managing Azure resources.
Minimax MCP Tools
An MCP server implementation that integrates the Minimax API to bring AI-powered image generation and text-to-speech to editors such as Windsurf and Cursor.
awesome-mcp
Awesome MCP servers, clients, and everything else.
ResearchMCP
A multi-search API aggregation server built with Deno + Hono.
MCP Snapshot Server
A Model Context Protocol server that interacts with Snapshot.org, providing tools to query Snapshot spaces, proposals, and users in natural language.
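For reference, the kind of query such a server presumably wraps can be issued directly against Snapshot's public GraphQL hub; this is a hedged sketch, the space id `ens.eth` is only an example, and Snapshot's API docs should be checked for the exact schema.

```python
import requests

# Example GraphQL query: the latest proposals for one space (space id is illustrative).
query = """
query {
  proposals(first: 5, where: {space_in: ["ens.eth"]}, orderBy: "created", orderDirection: desc) {
    id
    title
    state
  }
}
"""

resp = requests.post("https://hub.snapshot.org/graphql", json={"query": query}, timeout=30)
resp.raise_for_status()
for proposal in resp.json()["data"]["proposals"]:
    print(proposal["state"], proposal["title"])
```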
cmd-line-executor MCP server
An experimental MCP server for executing command-line instructions.
mcp-servers
Essentials
Essentials is an MCP server that provides convenient MCP utilities.
Servidor TESS com Suporte a MCP
An XTP extension for the MCP-TESS server: integration of the TESS API with XTP.
WhereAmI MCP Server
A lightweight MCP server that tells you exactly where you are.
Voicevox MCP Server
A server that lets Claude 3.7 and other AI agents access VOICEVOX-compatible speech synthesis engines (AivisSpeech, VOICEVOX, COEIROINK) via the Model Context Protocol.
boamp-server MCP Server
An MCP (Model Context Protocol) server for querying the BOAMP API and retrieving public procurement notices. The server supports searching public procurements with various criteria and fetching the full details of a specific procurement.
Android ADB MCP Server
A Model Context Protocol server that lets AI assistants interact with Android devices over ADB, enabling automated device management, app installation, file transfer, and screenshot capture.
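To make the ADB part concrete, here is a sketch of the raw `adb` commands a server like this would wrap, driven with `subprocess`; it assumes the Android platform-tools `adb` binary is on PATH, and the APK and file names are placeholders.

```python
import subprocess


def adb(*args: str) -> bytes:
    """Run an adb command and return its stdout, raising on a non-zero exit code."""
    return subprocess.run(["adb", *args], check=True, capture_output=True).stdout


print(adb("devices").decode())                  # list connected devices
adb("install", "-r", "app-debug.apk")           # (re)install an app
adb("push", "local.txt", "/sdcard/local.txt")   # transfer a file
with open("screen.png", "wb") as f:             # capture a screenshot as PNG bytes
    f.write(adb("exec-out", "screencap", "-p"))
```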