Discover Awesome MCP Servers
Extend your agent with 15,692 capabilities via MCP servers.
- All (15,692)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
MCP Local File Server
A server for accessing and manipulating local files via MCP.
mcp-wdpcameracontrol-server MCP Server
Share MCP - Model Context Protocol MCP Server Directory
Share MCP is a navigation site dedicated to the Model Context Protocol (MCP). It provides a categorized showcase of MCP-related resources, tools, and services, helping developers quickly find the MCP solutions they need.
Bilibili MCP Server
MCP Server Learning
MCP Workers AI
An MCP server SDK for Cloudflare Workers.
mcp-client-and-server MCP server
Mirror
MCP Client Configuration Server
Mirror
🎯 Kubernetes MCP Server
An AI-powered MCP server that understands natural-language queries about your Kubernetes cluster.
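As a rough illustration of the kind of read-only query such a server might run for a question like "what pods are running?", here is a sketch using the official Kubernetes Python client. The function name and behavior are assumptions for illustration, not this project's code.

```python
# Hypothetical sketch: the kind of cluster query an MCP tool might run for
# a question like "what pods are running?" (not this project's implementation).
from kubernetes import client, config  # pip install kubernetes


def list_running_pods() -> list[str]:
    """Return "namespace/name" for every running pod in the cluster."""
    config.load_kube_config()  # uses your local ~/.kube/config
    v1 = client.CoreV1Api()
    pods = v1.list_pod_for_all_namespaces(watch=False)
    return [
        f"{p.metadata.namespace}/{p.metadata.name}"
        for p in pods.items
        if p.status.phase == "Running"
    ]


if __name__ == "__main__":
    for pod in list_running_pods():
        print(pod)
```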
Code Reviewer Fixer Agent
This AI agent analyzes code repositories, detects potential security vulnerabilities, reviews code quality, and proposes fixes based on Sentry error logs, using the Sentry and GitHub MCP servers!
Exa MCP Server 🔍
Enables Claude to run web searches with Exa via MCP (Model Context Protocol).
@modelcontextprotocol/server-terminal
A terminal server implementation for the Model Context Protocol.
Basilisp nREPL MCP Bridge
A simple MCP server for nREPL.
Zoom MCP Server
An MCP server for Zoom.
Model Context Protocol (MCP) Implementation
Learn MCP by building it from scratch.
MCP2HTTP
MCP2HTTP is a minimal transport adapter that bridges stdio-based MCP clients with stateless HTTP servers.
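To illustrate the transport-adapter idea, here is a minimal hypothetical sketch (not MCP2HTTP's actual code): it forwards newline-delimited JSON-RPC messages from stdin to an assumed HTTP endpoint and writes each reply back to stdout.

```python
# Hypothetical sketch of the stdio <-> HTTP bridging idea (not MCP2HTTP's actual code).
# Assumes the remote MCP endpoint accepts one JSON-RPC message per POST.
import json
import sys

import requests

MCP_HTTP_ENDPOINT = "http://localhost:8080/mcp"  # assumed endpoint, adjust as needed


def bridge_stdio_to_http():
    """Forward each JSON-RPC line from stdin to the HTTP server and echo the reply to stdout."""
    for line in sys.stdin:
        line = line.strip()
        if not line:
            continue
        message = json.loads(line)  # JSON-RPC request from the MCP client
        resp = requests.post(MCP_HTTP_ENDPOINT, json=message, timeout=30)
        resp.raise_for_status()
        sys.stdout.write(json.dumps(resp.json()) + "\n")  # JSON-RPC response back to the client
        sys.stdout.flush()


if __name__ == "__main__":
    bridge_stdio_to_http()
```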
Remote MCP Server on Cloudflare
Dockerized Salesforce MCP Server
A Dockerized Salesforce MCP server for REST API integration.
Modular Outlook MCP Server
An MCP server that lets Claude access Outlook data via the Microsoft Graph API.
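As a rough illustration of the underlying Graph call (not this project's own code), an MCP tool might fetch recent messages like this, assuming an OAuth access token has already been obtained elsewhere.

```python
# Illustrative only: how an MCP tool might read recent Outlook messages through Microsoft Graph.
# Access-token acquisition (OAuth2 device code / client credentials) is omitted here.
import requests

GRAPH_MESSAGES_URL = "https://graph.microsoft.com/v1.0/me/messages"


def list_recent_messages(access_token: str, top: int = 5) -> list[dict]:
    """Return subject, sender, and timestamp for the most recent messages in the signed-in mailbox."""
    resp = requests.get(
        GRAPH_MESSAGES_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        params={"$top": top, "$select": "subject,from,receivedDateTime"},
    )
    resp.raise_for_status()
    return resp.json().get("value", [])
```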
Rails MCP Server
A Ruby gem implementation of a Model Context Protocol (MCP) server for Rails projects. It lets LLMs (large language models) interact with Rails projects through the Model Context Protocol.
mcp-server-datahub
The official MCP server for DataHub.
Browser JavaScript Evaluator
A reference design for an MCP server that hosts a web page which connects back to the server over SSE, allowing Claude to execute JavaScript on the page.
Gmail MCP Server
Mirror
GooseTeam
Look, a gaggle of geese! An MCP server and protocol for Goose agent collaboration.
Fiberflow MCP Gateway
Run the Fiberflow MCP SSE Server over stdio.
MCPClient Python Application
A conceptual outline and code sketch for bridging an MCP (here interpreted as Minecraft Protocol) server with an Ollama model. The core logic is shown below; boilerplate is omitted for brevity.

**Conceptual Outline**

1. **Minecraft server interaction (MCP):** connect to the Minecraft server, listen for player chat messages, parse them, and send commands back (e.g., `/say`, `/tell`).
2. **Ollama model interaction:** send prompts to the Ollama model, receive responses, and parse them.
3. **Bridging logic:** take the parsed chat message from Minecraft, formulate a prompt for Ollama, send it, receive the response, format it for Minecraft, and send it back to the server.

**Important Considerations**

* **Libraries:** You'll need libraries for Minecraft protocol interaction (e.g., `mcstatus`, `nbt`, `pymclevel`, or a more comprehensive library such as `python-minecraft-protocol` or `mineflayer` for bot-like behavior), plus a library for the Ollama API (e.g., `requests` or `httpx`).
* **Ollama setup:** Ollama must be installed and running with the desired model loaded.
* **Minecraft server:** A Minecraft server must be running and accessible.
* **Error handling:** Robust error handling is crucial for a production system.
* **Security:** Be very careful, especially if the Ollama model is allowed to execute commands on the Minecraft server. Sanitize inputs and limit the model's capabilities.
* **Rate limiting:** Implement rate limiting to prevent abuse and avoid overwhelming the Ollama model or the Minecraft server.
* **Asynchronous operations:** Use `asyncio` (or similar) for non-blocking I/O so multiple connections and requests can be handled efficiently.

**Code Snippets (Python - Example)**

```python
import asyncio
import json

import requests  # or httpx for async
from mcstatus import MinecraftServer  # pip install mcstatus requests

# Minecraft server configuration
MINECRAFT_SERVER_ADDRESS = "localhost"  # Replace with your server address
MINECRAFT_SERVER_PORT = 25565           # Default Minecraft port

# Ollama configuration
OLLAMA_API_URL = "http://localhost:11434/api/generate"  # Default Ollama API endpoint
OLLAMA_MODEL = "llama2"                                  # Replace with your desired model


async def get_server_status(address, port):
    """Gets the status of the Minecraft server."""
    server = MinecraftServer(address, port)
    try:
        return await server.async_status()
    except Exception as e:
        print(f"Error getting server status: {e}")
        return None


async def send_minecraft_command(command):
    """Sends a command to the Minecraft server (placeholder).

    Replace this with a real implementation; RCON is generally preferred.
    Example using an RCON library (e.g., mcrcon):
        from mcrcon import MCRcon
        with MCRcon(MINECRAFT_SERVER_ADDRESS, RCON_PASSWORD, port=RCON_PORT) as mcr:
            print(mcr.command(command))
    """
    print(f"Simulating sending command to Minecraft: {command}")


async def send_to_ollama(prompt):
    """Sends a prompt to the Ollama model and returns the response."""
    payload = {
        "prompt": prompt,
        "model": OLLAMA_MODEL,
        "stream": False,  # Set to True for streaming responses
    }
    try:
        response = requests.post(OLLAMA_API_URL, json=payload)
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
        return response.json()["response"]
    except requests.exceptions.RequestException as e:
        print(f"Error sending to Ollama: {e}")
        return None


async def process_minecraft_chat(chat_message):
    """Processes a Minecraft chat message and interacts with Ollama."""
    print(f"Received chat message: {chat_message}")

    # 1. Formulate a prompt for Ollama
    ollama_prompt = f"Respond to this Minecraft chat message: {chat_message}"

    # 2. Send the prompt to Ollama
    ollama_response = await send_to_ollama(ollama_prompt)

    if ollama_response:
        print(f"Ollama response: {ollama_response}")
        # 3. Format the response for Minecraft
        minecraft_response = f"Ollama: {ollama_response}"
        # 4. Send the response back to Minecraft
        await send_minecraft_command(f"/say {minecraft_response}")  # or /tell <player>
    else:
        print("Ollama request failed.")


async def minecraft_chat_listener():
    """Listens for chat messages from the Minecraft server (placeholder).

    A real implementation needs a full Minecraft client (e.g. built with
    python-minecraft-protocol or mineflayer); this loop only simulates
    incoming chat messages. Do not use it in production.
    """
    while True:
        await asyncio.sleep(5)  # Check every 5 seconds (adjust as needed)
        simulated_chat = input("Enter simulated chat message: ")
        await process_minecraft_chat(simulated_chat)


async def main():
    """Starts the Minecraft and Ollama interaction."""
    status = await get_server_status(MINECRAFT_SERVER_ADDRESS, MINECRAFT_SERVER_PORT)
    if status:
        print(f"Minecraft server status: {status.description}")
        asyncio.create_task(minecraft_chat_listener())  # Start listening for chat
    else:
        print("Failed to get Minecraft server status. Exiting.")
        return

    # Keep the main task running
    while True:
        await asyncio.sleep(1)


if __name__ == "__main__":
    asyncio.run(main())
```

**Explanation and Key Improvements**

* **`mcstatus` for server status:** verifies that the server is running before anything else.
* **`requests` for Ollama:** a simple, common way to call the Ollama HTTP API; consider `httpx` for async requests.
* **`send_to_ollama`:** encapsulates prompt submission and response handling, with error handling.
* **`process_minecraft_chat`:** the core bridge: take a chat message, build a prompt, query Ollama, and send the reply back to Minecraft.
* **`send_minecraft_command` placeholder:** you MUST replace this with a proper implementation using RCON or another suitable method.
* **`minecraft_chat_listener` placeholder:** the most complex part; it requires a full Minecraft client built with a library such as `python-minecraft-protocol` or `mineflayer`. The version above is only a simulation.
* **Asynchronous operations:** `asyncio` is used so connections and requests can be handled concurrently.
* **Error handling:** basic handling is included for network requests and status checks; expand it for production use.

**Next Steps (Crucial)**

1. **Implement the Minecraft chat listener.** This is the most challenging part: use a library like `python-minecraft-protocol` or `mineflayer` to build a client that connects, authenticates, and listens for chat messages. This involves understanding the Minecraft protocol.
2. **Implement the Minecraft command sender.** Implement `send_minecraft_command` using RCON (Remote Console) or another suitable method. RCON is generally preferred; enable it on your server and configure the RCON port and password.
3. **Security.** Thoroughly review the implications of letting an AI model interact with your Minecraft server: sanitize inputs, limit the model's capabilities, and add appropriate access controls.
4. **Testing.** Test the integration thoroughly in a safe environment before deploying it to a production server.
5. **Refinement.** Refine the prompts and responses to improve the quality of the interaction with players.

**Example with `python-minecraft-protocol` (partial, requires more setup)**

```python
# Requires: pip install python-minecraft-protocol
import asyncio
import json

from minecraft_protocol.connection import Connection


async def handle_chat(packet):
    """Handles incoming chat messages."""
    chat_data = json.loads(packet.json_data)
    message = chat_data["text"]
    if "extra" in chat_data:
        for extra in chat_data["extra"]:
            if isinstance(extra, str):
                message += extra
            elif isinstance(extra, dict) and "text" in extra:
                message += extra["text"]
    await process_minecraft_chat(message)


async def minecraft_chat_listener():
    """Listens for chat messages using python-minecraft-protocol."""
    conn = Connection(
        host=MINECRAFT_SERVER_ADDRESS,
        port=MINECRAFT_SERVER_PORT,
        username="YourBotUsername",  # Replace with your bot's username
        password="YourBotPassword",  # If applicable
    )
    conn.register_packet_listener(handle_chat, "chat.incoming")
    try:
        await conn.connect()
        await conn.wait_for_disconnect()  # Keep the connection alive
    except Exception as e:
        print(f"Error connecting to Minecraft server: {e}")

# Replace the placeholder in main() with:
#   asyncio.create_task(minecraft_chat_listener())
```

**Notes on `python-minecraft-protocol`**

* **Authentication:** Minecraft authentication must be handled correctly, which can be complex for premium (paid) accounts; consider a library that simplifies it.
* **Protocol version:** make sure the library supports your server's Minecraft protocol version.
* **Error handling:** implement robust handling for connection issues, authentication failures, and other problems.

This outline and the code sketches provide a foundation for an MCP-Ollama integration; prioritize security, error handling, and thorough testing throughout development.
mcp_server_local_files
A local filesystem MCP server.
MCP Expert Server
Mirror
iOS Simulator MCP Server
Mirror
Supergateway
Runs MCP stdio servers over SSE, and SSE servers over stdio. An AI gateway.
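The stdio-to-SSE direction can be sketched roughly as follows. This is a simplified, hypothetical illustration (not Supergateway's implementation): it spawns a stdio MCP server as a subprocess and streams its stdout lines to clients as SSE events, omitting the client-to-server path a real gateway also needs (typically a POST endpoint). The aiohttp dependency and the example command are assumptions.

```python
# Conceptual sketch only: expose a stdio MCP server's output as an SSE stream.
# Assumes the MCP server writes newline-delimited JSON-RPC to stdout.
import asyncio

from aiohttp import web  # pip install aiohttp

MCP_COMMAND = ["uvx", "mcp-server-git"]  # example command, adjust as needed


async def sse_handler(request: web.Request) -> web.StreamResponse:
    """Spawn the stdio server and forward each stdout line to the client as an SSE event."""
    resp = web.StreamResponse(headers={"Content-Type": "text/event-stream",
                                       "Cache-Control": "no-cache"})
    await resp.prepare(request)
    proc = await asyncio.create_subprocess_exec(
        *MCP_COMMAND,
        stdout=asyncio.subprocess.PIPE,
        stdin=asyncio.subprocess.PIPE,
    )
    try:
        while True:
            line = await proc.stdout.readline()
            if not line:
                break
            await resp.write(b"data: " + line.strip() + b"\n\n")
    finally:
        proc.terminate()
    return resp


app = web.Application()
app.add_routes([web.get("/sse", sse_handler)])

if __name__ == "__main__":
    web.run_app(app, port=8000)
```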
NSAF MCP Server
Mirror