Discover Awesome MCP Servers

Extend your agent with 15,300 capabilities via MCP servers.

mcp-server-python

鏡 (Kagami)

IntelliPlan

Bitso MCP Server

Enables interaction with the Bitso cryptocurrency exchange API to access withdrawals and fundings data. Provides comprehensive tools for listing, filtering, and retrieving withdrawal and funding transactions with proper authentication and error handling.

Fortune MCP Server

Enables users to perform tarot card readings and generate horoscopes based on specified dates, times, and locations. Provides mystical divination services through tarot draws and astrological calculations.

USolver

A best-effort universal logic and numerical solver interface using MCP that implements the 'LLM sandwich' model to process queries, call dedicated solvers (ortools, cvxpy, z3), and verbalize results.
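
As a rough illustration of the kind of dedicated-solver call that sits between the query-processing and verbalization steps described above, here is a minimal z3 sketch. This is not USolver's actual code; it assumes the `z3-solver` package is installed and uses made-up constraints.

```python
# Illustrative "dedicated solver" step of an LLM-sandwich flow (not USolver's code).
from z3 import Int, Solver, sat

x, y = Int("x"), Int("y")
solver = Solver()
solver.add(x + y == 10, x > 0, y > x)  # constraints extracted from a hypothetical query

if solver.check() == sat:
    model = solver.model()
    # The surrounding LLM layer would "verbalize" this model back to the user.
    print(f"x = {model[x]}, y = {model[y]}")
else:
    print("No solution satisfies the constraints.")
```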

onx-mcp-server

Build an MCP server to validate the Model Context Protocol.

Salesforce DX MCP Server

Enables secure interaction with Salesforce orgs through LLMs, providing tools for managing orgs, querying data, deploying metadata, running tests, and performing code analysis. Features granular access control and uses encrypted auth files to avoid exposing secrets in plain text.

Google Cloud DNS API MCP Server

Auto-generated MCP server that enables interaction with Google's Cloud DNS API for managing DNS zones and records through natural language.

ScreenshotOne MCP Server

A simple implementation of an MCP server for the ScreenshotOne API.

MCP Vulnerability Management System

A comprehensive system that helps organizations track, manage, and respond to security vulnerabilities effectively. Includes features such as vulnerability tracking, user management, support tickets, API key management, and SSL certificate management.

mcp_stdio2sse

An SSE version of a stdio MCP server.

HDFS MCP Server

A Model Context Protocol server that enables interaction with Hadoop Distributed File System, allowing operations like listing, reading, writing, and managing HDFS files and directories.

AskTheApi Team Builder

An agent network builder for communicating with OpenAPI APIs. Based on AutoGen.

MCP Server Implementation Guide

## Creating Your Own MCP (Model Control Protocol) Server for Cursor Integration

This document provides a guide and implementation details for creating your own MCP (Model Control Protocol) server to integrate with Cursor.

**1. What is MCP?**

MCP is a protocol used by Cursor to communicate with language models. It allows Cursor to send requests to a model, such as code completion, code generation, and code editing, and receive responses.

**2. Why Create Your Own MCP Server?**

* **Use a custom model:** Integrate a model that is not natively supported by Cursor.
* **Control model access:** Manage access to your model and implement authentication.
* **Customize model behavior:** Modify the model's behavior and tailor it to your specific needs.
* **Experiment with new features:** Develop and test new features for Cursor integration.

**3. Key Components**

* **MCP Server:** The server that listens for requests from Cursor and sends responses.
* **Model Interface:** The interface that connects the MCP server to your language model.
* **Request Handling:** The logic that processes requests from Cursor and generates responses.
* **Data Serialization/Deserialization:** The process of converting data between the MCP protocol and the format used by your language model.

**4. Implementation Steps**

Here's a general outline of the steps involved in creating your own MCP server:

1. **Choose a programming language and framework:** Python with Flask or FastAPI is a popular choice.
2. **Define the MCP protocol:** Understand the structure of requests and responses. Refer to the Cursor documentation for details.
3. **Implement the MCP server:** Create a server that listens for incoming requests on a specific port.
4. **Implement the model interface:** Connect the server to your language model. This might involve using an API or a local model.
5. **Implement request handling:** Process the requests from Cursor and generate appropriate responses using your language model.
6. **Implement data serialization/deserialization:** Convert data between the MCP protocol and the format used by your language model. JSON is a common choice.
7. **Test the server:** Send requests to the server and verify that it returns the correct responses.
8. **Configure Cursor:** Configure Cursor to use your MCP server.

**5. Example Implementation (Python with Flask)**

This is a simplified example to illustrate the basic structure. You'll need to adapt it to your specific model and requirements.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Replace with your actual model loading and inference logic
def generate_completion(prompt):
    # This is a placeholder - replace with your model's code
    return f"// Completion for: {prompt}"

@app.route('/completion', methods=['POST'])
def completion():
    data = request.get_json()
    prompt = data.get('prompt')
    if not prompt:
        return jsonify({'error': 'Missing prompt'}), 400

    completion_text = generate_completion(prompt)
    response = {
        'completion': completion_text,
        'metadata': {'model': 'MyCustomModel'}
    }
    return jsonify(response)

if __name__ == '__main__':
    app.run(debug=True, port=5000)
```

**Explanation:**

* **Flask:** A lightweight web framework for creating the server.
* **`/completion` endpoint:** Handles completion requests from Cursor.
* **`generate_completion` function:** A placeholder for your model's inference logic. This is where you'll integrate your language model.
* **JSON serialization/deserialization:** Uses `jsonify` to convert Python dictionaries to JSON and `request.get_json()` to parse JSON requests.
* **Error handling:** Includes basic error handling for missing prompts.

**6. Configuring Cursor**

In Cursor, you'll need to configure the "Model Control Protocol" settings to point to your server. This typically involves specifying the server's address (e.g., `http://localhost:5000`) and any necessary authentication credentials. Refer to the Cursor documentation for the exact steps.

**7. Important Considerations**

* **Security:** Implement proper authentication and authorization to protect your model.
* **Performance:** Optimize your model and server for performance to ensure a smooth user experience.
* **Error Handling:** Implement robust error handling to gracefully handle unexpected errors.
* **Scalability:** Consider the scalability of your server if you expect a large number of users.
* **MCP Protocol Compliance:** Ensure your server fully complies with the MCP protocol specification.

**8. Further Resources**

* **Cursor Documentation:** The official Cursor documentation is the best source of information about the MCP protocol.
* **Flask/FastAPI Documentation:** Refer to the documentation for your chosen web framework.
* **Language Model Documentation:** Refer to the documentation for your language model.

This guide provides a starting point for creating your own MCP server for Cursor integration. Remember to adapt the example code and implementation details to your specific needs and requirements. Good luck!
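
As a quick way to exercise step 7 (testing the server), the example `/completion` endpoint from section 5 can be hit with a plain HTTP request. This is a minimal sketch, assuming the Flask server above is running locally on port 5000 and the `requests` package is installed (`pip install requests`):

```python
# Minimal smoke test for the example /completion endpoint (section 5).
import requests

resp = requests.post(
    "http://localhost:5000/completion",
    json={"prompt": "def add(a, b):"},
    timeout=10,
)
resp.raise_for_status()
payload = resp.json()
print(payload["completion"])  # e.g. "// Completion for: def add(a, b):"
print(payload["metadata"])    # {"model": "MyCustomModel"}
```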

DeepChat 好用的图像 MCP Server 集合

Image MCP servers for DeepChat.

Ghost MCP Server

Manage your Ghost blog content directly from Claude, Cursor, or any MCP-compatible client, allowing you to create, edit, search, and delete posts with support for tag management and analytics.

Saros MCP Server

Enables AI agents to interact with Saros DeFi through natural language, providing tools for liquidity pool management, portfolio analytics, farming positions, and swap quotes on Solana.

amap-weather-server

An amap-weather server using MCP.

Slack MCP Server

A FastMCP-based server that provides complete Slack integration for Cursor IDE, allowing users to interact with Slack API features using natural language.
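
For readers unfamiliar with what a "FastMCP-based server" looks like, here is a minimal, hypothetical sketch of a FastMCP tool definition. It is not this project's actual code; it assumes the official MCP Python SDK, where `FastMCP` lives under `mcp.server.fastmcp`, and the tool body is a placeholder rather than a real Slack API call.

```python
# Hypothetical sketch of a FastMCP tool (not this project's implementation).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("slack-demo")

@mcp.tool()
def post_message(channel: str, text: str) -> str:
    """Post a message to a Slack channel (placeholder implementation)."""
    # A real server would call the Slack Web API here using a bot token.
    return f"Would post to #{channel}: {text}"

if __name__ == "__main__":
    # Runs the server over stdio so an MCP client (e.g., Cursor) can connect.
    mcp.run()
```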

Migadu MCP Server

Enables AI assistants to manage Migadu email hosting services through natural language, including creating mailboxes, setting up aliases, configuring autoresponders, and handling bulk operations efficiently.

HiveFlow MCP Server

Connects AI assistants (Claude, Cursor, etc.) directly to the HiveFlow automation platform, allowing them to create, manage, and execute automation flows through natural language commands.

Jokes MCP Server

An MCP server that allows Microsoft Copilot Studio to fetch random jokes from three sources: Chuck Norris jokes, Dad jokes, and Yo Mama jokes.

MCP-Server-MySSL

MySSL MCP server.

mcp_server

A guide to implementing a sample `mcp_server` that can interact with a `dolphin_mcp` client. Below is a breakdown of the steps involved, along with code snippets and explanations to get you started. This is a simplified example focusing on the core concepts.

**1. Understanding the MCP Protocol (Simplified)**

* **MCP (Microcontroller Protocol):** A lightweight protocol for communication between a host (your `dolphin_mcp` client) and a microcontroller (your `mcp_server`). It's often used for things like reading sensor data, controlling actuators, and configuring settings.
* **Request-Response:** The client sends a request to the server, and the server sends back a response.
* **Commands/Opcodes:** Requests are identified by a command code (or opcode). Each command tells the server what action to perform.
* **Data:** Requests and responses can include data. For example, a request to set a motor speed might include the desired speed value. A response to a sensor read request would include the sensor reading.
* **Framing:** The protocol needs a way to know where a message starts and ends. Common methods include:
  * **Length-Prefixed:** The message starts with a byte or two indicating the total length of the message.
  * **Start/End Markers:** Special characters (e.g., `0x02` for start, `0x03` for end) are used to delimit messages.
  * **Fixed Length:** All messages are the same length.

**2. Choosing a Communication Method**

The `mcp_server` needs to communicate with the `dolphin_mcp` client. Common options include:

* **Serial (UART):** Simple, widely supported, good for basic communication. Often used with microcontrollers.
* **TCP/IP (Sockets):** More complex, but allows communication over a network. Suitable if the client and server are on different machines.
* **USB:** Can be used for higher bandwidth communication.

This example focuses on **Serial (UART)** because it's the most common and easiest to demonstrate.

**3. Server Implementation (Python Example)**

Here's a Python example using the `pyserial` library to simulate an `mcp_server` communicating over a serial port. This is a *simulation* because you'd typically run this code on a microcontroller.

```python
import serial
import time

# Configuration
SERIAL_PORT = 'COM3'  # Replace with your serial port
BAUD_RATE = 115200

# MCP Command Opcodes (Example)
CMD_GET_SENSOR_DATA = 0x01
CMD_SET_LED_STATE = 0x02
CMD_GET_VERSION = 0x03

# MCP Response Codes
RESPONSE_OK = 0x00
RESPONSE_ERROR = 0x01

def process_request(request):
    """Processes an MCP request and returns a response."""
    opcode = request[0]  # First byte is the opcode

    if opcode == CMD_GET_SENSOR_DATA:
        # Simulate reading sensor data
        sensor_value = 42  # Replace with actual sensor reading
        return bytearray([RESPONSE_OK, sensor_value])
    elif opcode == CMD_SET_LED_STATE:
        led_state = request[1]  # Second byte is the LED state (0 or 1)
        if led_state == 0:
            print("LED OFF")
        elif led_state == 1:
            print("LED ON")
        else:
            return bytearray([RESPONSE_ERROR])  # Invalid LED state
        return bytearray([RESPONSE_OK])
    elif opcode == CMD_GET_VERSION:
        version = 123  # Simulate version number
        return bytearray([RESPONSE_OK, version])
    else:
        # Unknown command
        return bytearray([RESPONSE_ERROR])

def main():
    try:
        ser = serial.Serial(SERIAL_PORT, BAUD_RATE)
        print(f"Connected to {SERIAL_PORT}")
        while True:
            if ser.in_waiting > 0:
                request = ser.read(ser.in_waiting)  # Read all available bytes
                print(f"Received request: {request.hex()}")
                response = process_request(request)
                print(f"Sending response: {response.hex()}")
                ser.write(response)
            time.sleep(0.01)  # Small delay to avoid busy-waiting
    except serial.SerialException as e:
        print(f"Error: {e}")
    finally:
        if 'ser' in locals() and ser.is_open:
            ser.close()
            print("Serial port closed.")

if __name__ == "__main__":
    main()
```

**Explanation:**

* **`serial`:** Imports the `pyserial` library for serial communication. Install it with `pip install pyserial`.
* **`SERIAL_PORT` and `BAUD_RATE`:** Configure the serial port and baud rate. **Important:** Make sure these match the settings used by your `dolphin_mcp` client. You'll need to change `COM3` to the correct port on your system.
* **`CMD_*` constants:** Define command opcodes. These are just examples; you'll need to define the opcodes that your `dolphin_mcp` client uses.
* **`RESPONSE_*` constants:** Define response codes.
* **`process_request(request)`:** The heart of the server. It takes the raw request data, parses the opcode, performs the appropriate action, and constructs the response.
* **`main()`:** Opens the serial port, loops while checking for incoming data, reads each request, processes it, and sends the response. It catches serial port exceptions and closes the port when the program exits.

**4. Dolphin-MCP Client (Example)**

The specifics of the `dolphin_mcp` client are not covered here, but below is a general example of how you might use it to interact with the server above. It assumes your `dolphin_mcp` library has functions for sending requests and receiving responses.

```python
# Assuming you have a dolphin_mcp library
# import dolphin_mcp  # Replace with the actual import

# Example usage (replace with your actual dolphin_mcp library calls)
def send_get_sensor_data_request():
    # Construct the request (opcode only in this case)
    request = bytearray([CMD_GET_SENSOR_DATA])
    # Send the request using dolphin_mcp (replace with actual function)
    # response = dolphin_mcp.send_request(request)
    print(f"Sending request: {request.hex()}")
    response = bytearray([RESPONSE_OK, 42])  # Simulate response from server
    print(f"Received response: {response.hex()}")
    if response[0] == RESPONSE_OK:
        sensor_value = response[1]
        print(f"Sensor value: {sensor_value}")
    else:
        print("Error getting sensor data")

def send_set_led_state_request(led_state):
    # Construct the request (opcode and LED state)
    request = bytearray([CMD_SET_LED_STATE, led_state])
    # response = dolphin_mcp.send_request(request)
    print(f"Sending request: {request.hex()}")
    response = bytearray([RESPONSE_OK])  # Simulate response
    print(f"Received response: {response.hex()}")
    if response[0] == RESPONSE_OK:
        print("LED state set successfully")
    else:
        print("Error setting LED state")

def send_get_version_request():
    # Construct the request (opcode only in this case)
    request = bytearray([CMD_GET_VERSION])
    # response = dolphin_mcp.send_request(request)
    print(f"Sending request: {request.hex()}")
    response = bytearray([RESPONSE_OK, 123])  # Simulate response
    print(f"Received response: {response.hex()}")
    if response[0] == RESPONSE_OK:
        version = response[1]
        print(f"Version: {version}")
    else:
        print("Error getting version")

# Example usage
send_get_sensor_data_request()
send_set_led_state_request(1)  # Turn LED on
send_set_led_state_request(0)  # Turn LED off
send_get_version_request()
```

**Important Notes and Next Steps:**

1. **Replace Placeholders:** The `dolphin_mcp` client code is heavily commented with placeholders. You *must* replace these with the actual functions and methods provided by your `dolphin_mcp` library. Consult the library's documentation.
2. **Error Handling:** The examples include basic error handling, but you should add more robust error checking and reporting.
3. **Data Types:** Pay close attention to data types. Make sure the data you're sending and receiving is in the correct format (e.g., integers, floats, strings). Use `struct.pack` and `struct.unpack` in Python to convert between Python data types and byte representations if needed.
4. **Framing:** The example assumes a very simple framing where the server reads all available bytes. If your `dolphin_mcp` client uses a more sophisticated framing method (e.g., length-prefixed messages), you'll need to modify the `process_request` function in the server to handle the framing correctly. For example, if the first byte is the length, you'd read that byte first, then read the remaining bytes based on the length.
5. **Microcontroller Implementation:** The Python server is a simulation. To run this on a real microcontroller, you'll need to use a language like C or C++ and a microcontroller library that provides serial communication functions. The logic for `process_request` would be the same, but the implementation details would be different.
6. **Testing:** Thoroughly test your implementation with different commands and data values to ensure that it's working correctly. Use a serial port monitor (like PuTTY or RealTerm) to inspect the raw data being sent and received.
7. **Dolphin-MCP Documentation:** The most important thing is to consult the documentation for your `dolphin_mcp` library. It will provide the details you need to use the library correctly.

**Example with Length-Prefixed Framing (Server-Side)**

If your protocol uses length-prefixed framing (the first byte indicates the length of the message *including* the length byte itself), the `process_request` function in the server would need to be modified:

```python
def process_request(request):
    """Processes an MCP request with length-prefixed framing."""
    if len(request) < 1:
        return bytearray([RESPONSE_ERROR])  # Invalid request (no length byte)

    message_length = request[0]
    if len(request) < message_length:
        return bytearray([RESPONSE_ERROR])  # Incomplete request
    if message_length < 2:  # Need at least an opcode
        return bytearray([RESPONSE_ERROR])  # Invalid request

    opcode = request[1]               # Opcode is the second byte
    data = request[2:message_length]  # The rest is data

    if opcode == CMD_GET_SENSOR_DATA:
        # Simulate reading sensor data
        sensor_value = 42  # Replace with actual sensor reading
        response_data = bytearray([RESPONSE_OK, sensor_value])
        response_length = len(response_data) + 1  # +1 for the length byte
        return bytearray([response_length]) + response_data
    elif opcode == CMD_SET_LED_STATE:
        if len(data) != 1:
            return bytearray([RESPONSE_ERROR])  # Invalid data length
        led_state = data[0]  # LED state (0 or 1)
        if led_state == 0:
            print("LED OFF")
        elif led_state == 1:
            print("LED ON")
        else:
            return bytearray([RESPONSE_ERROR])  # Invalid LED state
        response_data = bytearray([RESPONSE_OK])
        response_length = len(response_data) + 1
        return bytearray([response_length]) + response_data
    elif opcode == CMD_GET_VERSION:
        version = 123  # Simulate version number
        response_data = bytearray([RESPONSE_OK, version])
        response_length = len(response_data) + 1
        return bytearray([response_length]) + response_data
    else:
        # Unknown command
        return bytearray([RESPONSE_ERROR])
```

In this case, the client would need to construct the request with the length byte at the beginning. For example:

```python
# Example of sending a length-prefixed request
request_data = bytearray([CMD_SET_LED_STATE, 1])  # Opcode and LED state
request_length = len(request_data) + 1  # Length of data + length byte itself
request = bytearray([request_length]) + request_data
# Now 'request' contains the length-prefixed message
```

Remember to adapt the code to match the specific requirements of your `dolphin_mcp` library and the MCP protocol you're using. Good luck!
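
To make note 3 above concrete, here is a small sketch of using `struct.pack` and `struct.unpack` to convert between Python integers and the raw bytes sent over the serial link. The little-endian layout (a 16-bit sensor value plus an 8-bit flag) is an arbitrary assumption for illustration, not part of the protocol described above.

```python
import struct

# Hypothetical payload: a 16-bit sensor reading plus an 8-bit status flag,
# packed little-endian ("<" prefix). The field layout is an illustrative assumption.
sensor_value = 1042
status_flag = 1

payload = struct.pack("<HB", sensor_value, status_flag)  # b'\x12\x04\x01'
print(payload.hex())

# Unpack the same bytes back into Python integers on the receiving side.
value, flag = struct.unpack("<HB", payload)
assert (value, flag) == (sensor_value, status_flag)
```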

MCP Servers and Tools I Use

Documentation of the MCP servers and tools I use with Claude.

PostgreSQL MCP Server

Enables direct execution of PostgreSQL queries through the Model Context Protocol. Supports both SELECT and non-SELECT operations with dual communication modes (stdio and HTTP REST API).

Hello MCP Server

yunxin-mcp-server

Yunxin MCP server.

Lunar Calendar Mcp

MCP Toolkit

A modular toolkit for building extensible tool services that fully supports the MCP 2024-11-05 specification, offering file operations, terminal execution, and network requests.