Discover Awesome MCP Servers

Extend your agent with 12,173 capabilities via MCP servers.

MCP Database Server

A Model Context Protocol server that enables large language models (LLMs) to interact with databases (currently MongoDB) through natural language, supporting operations such as querying, inserting, and deleting documents and running aggregation pipelines.

Divide and Conquer MCP Server

Enables AI agents to break complex tasks into manageable pieces using a structured JSON format, with task tracking, context preservation, and progress monitoring.

JigsawStack MCP Server

A Model Context Protocol server that allows AI models to interact with JigsawStack models!

mcp-server

GitHub MCP Server

A Model Context Protocol server that enables LLM agents to manage GitHub repositories, issues, pull requests, branches, files, and releases through a standardized interface.

BOLD MCP Server

Connects an MCP server to local LLMs for access to the BOLD REST API.

Manus MCP

An MCP server that provides Manus-like capabilities.

Gemini Image Generator MCP Server

Allows AI assistants to generate and transform high-quality images from text prompts using Google's Gemini model over the MCP protocol.

Brest MCP Server

MCP LLM Bridge

Implements MCP to support communication between MCP servers and OpenAI-compatible LLMs.

Tester Client for Model Context Protocol (MCP)

A Model Context Protocol (MCP) client for Apify Actors.

Memory MCP Server

File Converter MCP Server

An MCP server that provides AI agents with a suite of file-conversion tools covering common document and image formats, including DOCX to PDF, PDF to DOCX, image conversion, Excel to CSV, HTML to PDF, and Markdown to PDF.

LSPD Interrogation MCP Server

A Model Context Protocol server that simulates police interrogations, letting users create officer profiles and run dynamic interrogations in which suspect responses are simulated according to configurable parameters such as stress level, evidence, and crime type.

mcp-server-collector MCP server

An MCP server for collecting MCP servers from around the internet.

MCP Go SDK

Build Model Context Protocol (MCP) servers in Go.

MCP Servers

This repository contains my learning notes on how to create MCP servers.

cmd-line-executor MCP server

An experimental MCP server for executing command-line instructions.

mcp-servers

Installation

Essentials

Essentials is an MCP server that provides convenient MCP functionality.

Servidor TESS com Suporte a MCP

An XTP extension for the MCP-TESS server: integration of the TESS API with XTP.

WhereAmI MCP Server

A lightweight MCP server that tells you exactly where you are.

Voicevox MCP Server

A server that gives Claude 3.7 and other AI agents access to VOICEVOX-compatible speech-synthesis engines (AivisSpeech, VOICEVOX, COEIROINK) via the Model Context Protocol.

Starlette MCP SSE

A working example of a Starlette server with Server-Sent Events (SSE) based Model Context Protocol (MCP) support.
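The listing's original description came with a much longer example; the sketch below condenses that idea, assuming only that `starlette` and `uvicorn` are installed. The message class, queue, and route names are illustrative stand-ins, not the project's actual API.

```python
import asyncio
import json
from typing import AsyncGenerator

from starlette.applications import Starlette
from starlette.responses import StreamingResponse
from starlette.routing import Route


class MCPMessage:
    """Simplified message with a type and a data payload."""

    def __init__(self, type: str, data: dict):
        self.type = type
        self.data = data

    def to_json(self) -> str:
        return json.dumps({"type": self.type, "data": self.data})


# In-memory queue that decouples message producers from the SSE endpoint.
message_queue: asyncio.Queue = asyncio.Queue()


async def event_stream(request):
    async def generate() -> AsyncGenerator[str, None]:
        try:
            while True:
                message: MCPMessage = await message_queue.get()
                # SSE framing: each event is "data: <payload>" plus a blank line.
                yield f"data: {message.to_json()}\n\n"
        except asyncio.CancelledError:
            # Raised when the client disconnects; exit the stream cleanly.
            pass

    return StreamingResponse(generate(), media_type="text/event-stream")


async def send_test_messages():
    """Simulate other parts of the system pushing messages onto the queue."""
    for i in range(5):
        await message_queue.put(MCPMessage("test_event", {"message": f"Test message {i}"}))
        await asyncio.sleep(2)
    await message_queue.put(MCPMessage("status_update", {"status": "Completed!"}))


async def startup():
    # Kick off the demo producer in the background when the app starts.
    asyncio.create_task(send_test_messages())


app = Starlette(routes=[Route("/events", endpoint=event_stream)], on_startup=[startup])

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```

To try it, install the dependencies with `pip install starlette uvicorn`, run the file, and connect with `curl -N http://localhost:8000/events`; the `-N` flag disables curl's output buffering so events print as they arrive.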

MCP Server ODBC via SQLAlchemy

Provides SQLAlchemy connectivity via pyodbc to any database management system (DBMS) accessible through SQLAlchemy.
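As a rough illustration of what "SQLAlchemy via pyodbc" means here, the snippet below shows the kind of connection URL involved; the dialect, DSN, and credentials are placeholders, not values from the project.

```python
from sqlalchemy import create_engine, text

# Hypothetical pyodbc-backed SQLAlchemy URL; substitute your own dialect and DSN.
engine = create_engine("mssql+pyodbc://user:password@my_dsn")

with engine.connect() as conn:
    # Any DBMS reachable through SQLAlchemy can be queried the same way.
    print(conn.execute(text("SELECT 1")).scalar())
```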

awesome-mcp

Awesome MCP servers, clients, and everything else.

Minimax MCP Tools

An MCP server implementation that integrates the Minimax API to provide AI-powered image generation and text-to-speech in editors such as Windsurf and Cursor.

ResearchMCP

A multi-search-API aggregation server built with Deno + Hono.

MCP Snapshot Server

A Model Context Protocol server for interacting with Snapshot.org, providing tools to query Snapshot spaces, proposals, and users through natural language.