Discover Awesome MCP Servers
Extend your agent with 17,807 capabilities via MCP servers.
- All (17,807)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
n8n Workflow Builder MCP Server
An MCP server for Claude / Cursor that builds n8n workflows
MCP Google Calendar
A Model Context Protocol (MCP) server for Google Calendar integration with Claude and other AI assistants.
Kubernetes MCP Server
An AI-powered MCP server that understands natural-language queries about your Kubernetes cluster
MCP Servers Hub
Discover interesting MCP servers and clients.
Bilibili MCP Server
A learning project: an MCP server for Bilibili.
MCP Workers AI
MCP servers sdk for Cloudflare Workers
Modular Outlook MCP Server
MCP server for Claude to access Outlook data via Microsoft Graph API
mcp-server-datahub
The official MCP server for DataHub.
Rails MCP Server
A Ruby gem implementation of a Model Context Protocol (MCP) server for Rails projects. This server allows LLMs (Large Language Models) to interact with Rails projects through the Model Context Protocol.
mcp_server_local_files
Local File System MCP Server
MCP Expert Server
Mirror of
Exa MCP Server
Claude can perform web search via Exa with MCP (Model Context Protocol).
@modelcontextprotocol/server-terminal
Terminal server implementation for Model Context Protocol
Model Context Protocol (MCP) Implementation
Learn MCP by building from scratch
MCP2HTTP
MCP2HTTP is a minimal transport adapter that bridges MCP clients using stdio with stateless HTTP servers.
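The core idea of such an adapter can be sketched in a few lines. This is not MCP2HTTP's actual code — the endpoint URL, function names, and the injected `post` callable are illustrative assumptions — but it shows the stdio-to-HTTP pattern: each newline-delimited JSON-RPC message read from the client becomes one stateless HTTP request, and the server's reply is written back as one line.

```python
import json
from typing import Callable

# Hypothetical endpoint; a real adapter would make this configurable.
DEFAULT_ENDPOINT = "http://localhost:8080/mcp"

def forward_message(line: str, post: Callable[[str, bytes], bytes],
                    endpoint: str = DEFAULT_ENDPOINT) -> str:
    """Forward one newline-delimited JSON-RPC message to a stateless
    HTTP server and return its reply as a single stdout line."""
    request = json.loads(line)  # validate the client's JSON-RPC frame
    reply = post(endpoint, json.dumps(request).encode("utf-8"))
    return json.dumps(json.loads(reply.decode("utf-8")))

def run(stdin, stdout, post: Callable[[str, bytes], bytes]) -> None:
    """Pump messages: each stdin line becomes one HTTP round trip."""
    for line in stdin:
        if line.strip():
            stdout.write(forward_message(line, post) + "\n")
            stdout.flush()
```

In production you would call `run(sys.stdin, sys.stdout, post)` with a `post` built on a real HTTP client; injecting it keeps the framing logic testable offline.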
Remote MCP Server on Cloudflare
Basilisp nREPL MCP Bridge
simple MCP server for nREPL
Zoom MCP Server
MCP server for Zoom
Gmail MCP Server
Mirror of
generator-mcp
Yeoman Generator to quickly create a new MCP Server
MCP Server Playwright
MCP Server Playwright - Browser automation service for Claude Desktop
iOS Simulator MCP Server
Mirror of
Supergateway
Run MCP stdio servers over SSE and SSE over stdio. AI gateway.
NSAF MCP Server
Mirror of
Fiberflow MCP Gateway
Run the Fiberflow MCP SSE Server over stdio.
MCPClient Python Application
An implementation for interacting between an MCP (here, Minecraft Protocol) server and an Ollama model. A full, production-ready implementation would be extensive; below is a breakdown of the concepts, potential approaches, and a simplified example to get you started.

**Core Concepts**

* **Minecraft Protocol (MCP):** The communication protocol used between Minecraft clients and servers. It is binary and relatively complex; you'll need a library to handle the encoding and decoding of Minecraft packets.
* **Ollama:** A tool for running large language models (LLMs) locally. You interact with Ollama via its API, typically using HTTP requests.
* **Bridge/Middleware:** A piece of software that sits between the Minecraft server and the Ollama model. The bridge will:
  * Receive messages from the Minecraft server (e.g., player chat).
  * Format those messages into a prompt for the Ollama model.
  * Send the prompt to the Ollama API.
  * Receive the response from Ollama.
  * Format the response into a Minecraft message.
  * Send the message back to the Minecraft server.

**High-Level Architecture**

1. **Minecraft Server:** The standard Minecraft server.
2. **MCP Library:** A library (e.g., `mcproto`, `python-minecraft-protocol`, `node-minecraft-protocol`) to handle Minecraft packet encoding/decoding, used in the bridge.
3. **Bridge (Your Code):** The core of the integration. It connects to the Minecraft server, listens for specific packets (e.g., chat messages), processes them, interacts with the Ollama API, and sends messages back to the Minecraft server.
4. **Ollama Server:** Running locally or on a server, with a model loaded (e.g., `llama2`, `mistral`).

**Implementation Steps**

1. **Choose a Programming Language and Libraries:**
   * **Python:** A popular choice due to its ease of use; libraries like `mcproto`, `requests`, and `asyncio` are helpful.
   * **Node.js:** Another good option, with libraries like `node-minecraft-protocol` and `node-fetch`.
   * **Go:** A performant option, but it may require more manual work with the MCP.
2. **Set up Ollama:**
   * Install Ollama: follow the instructions on the Ollama website ([https://ollama.com/](https://ollama.com/)).
   * Pull a model: `ollama pull llama2` (or another model of your choice).
   * Run the model: `ollama run llama2` (this starts the Ollama API server).
3. **Implement the Bridge:** This is the most complex part. Here's a simplified Python example using `mcproto` and `requests`:

```python
import asyncio
import json

import mcproto
import requests

# Minecraft server configuration
SERVER_HOST = "localhost"  # Or your server's IP address
SERVER_PORT = 25565
USERNAME = "OllamaBot"     # Bot's username

# Ollama API configuration
OLLAMA_API_URL = "http://localhost:11434/api/generate"  # Default Ollama API endpoint
OLLAMA_MODEL = "llama2"    # The model you're using in Ollama


async def handle_chat_message(client, message):
    """Handles incoming chat messages from the Minecraft server."""
    try:
        chat_data = json.loads(message)
        text = chat_data["content"]  # Extract the actual message text
        print(f"Received chat message: {text}")

        # Create a prompt for Ollama
        prompt = (
            "Respond to the following Minecraft chat message "
            f"as a helpful and friendly player: {text}"
        )

        # Send the prompt to Ollama
        response = requests.post(OLLAMA_API_URL, json={
            "prompt": prompt,
            "model": OLLAMA_MODEL,
            "stream": False,  # Get the full response at once
        })
        response.raise_for_status()  # Raise for bad status codes (4xx or 5xx)
        ollama_response = response.json()
        bot_response = ollama_response["response"]
        print(f"Ollama response: {bot_response}")

        # Format the response as a Minecraft chat message
        minecraft_message = f"[Ollama]: {bot_response}"

        # Send the message back to the Minecraft server
        await send_chat_message(client, minecraft_message)
    except json.JSONDecodeError as e:
        print(f"Error decoding JSON: {e}")
    except requests.exceptions.RequestException as e:
        print(f"Error communicating with Ollama: {e}")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")


async def send_chat_message(client, message):
    """Sends a chat message to the Minecraft server."""
    await client.send("chat_message", {"message": message, "position": 0})  # Position 0 is chat


async def main():
    """Connects to the Minecraft server and handles messages."""
    try:
        client = await mcproto.create_client(SERVER_HOST, SERVER_PORT, username=USERNAME)
        print(f"Connected to Minecraft server as {USERNAME}")

        # Register a handler for incoming chat messages
        client.register("chat_message", handle_chat_message)

        # Keep the client running
        await client.run()
    except mcproto.exceptions.MinecraftError as e:
        print(f"Minecraft error: {e}")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")


if __name__ == "__main__":
    asyncio.run(main())
```

**Explanation of the Python Example:**

* **Imports:** Pulls in the necessary libraries.
* **Configuration:** Sets the server address, port, username, Ollama API URL, and model name. *Crucially, change these to match your setup.*
* **`handle_chat_message`:** Receives a chat message from the Minecraft server, extracts its text, constructs a prompt for Ollama (*this is where you customize how the model responds*), sends the prompt to the Ollama API with `requests.post`, parses the JSON response, formats it into a Minecraft chat message, and sends it back via `send_chat_message`.
* **`send_chat_message`:** Sends a chat message to the Minecraft server using the `chat_message` packet.
* **`main`:** Connects to the Minecraft server with `mcproto.create_client`, registers `handle_chat_message` for incoming `chat_message` packets, and runs the client with `client.run()`.

**To Run the Example:**

1. **Install libraries:** `pip install mcproto requests`
2. **Start Ollama:** `ollama run llama2` (or your chosen model)
3. **Run the Python script:** `python your_script_name.py`
4. **Join your Minecraft server with a client.**
5. **Send a chat message in Minecraft.** The bot should respond.

**Important Considerations and Next Steps:**

* **Error handling:** The example includes basic error handling; add more robust handling for production use, and consider logging errors to a file.
* **Prompt engineering:** The prompt is crucial for getting good responses from the LLM. Experiment with different prompts, and consider adding context (e.g., the player's name, the current game state).
* **Rate limiting:** Implement rate limiting in your bridge to avoid overloading the Ollama API.
* **Security:** Be careful about the prompts you send to the LLM. Avoid sending sensitive information, and sanitize user input to prevent prompt-injection attacks.
* **Asynchronous operations:** Use `asyncio` (or similar) to handle network operations concurrently, so the bridge doesn't block while waiting on the Minecraft server or the Ollama API.
* **Minecraft protocol version:** Make sure your `mcproto` (or equivalent) library is compatible with the Minecraft server version you're using.
* **Authentication:** If your Minecraft server requires authentication, you'll need to implement the authentication handshake in your bridge.
* **Configuration:** Use a configuration file (e.g., JSON, YAML) to store the server address, port, username, Ollama API URL, and other settings.
* **More sophisticated interactions:** Beyond responding to chat messages, the LLM could generate quests, create dynamic storylines, control non-player characters (NPCs), or analyze player behavior.
* **Context management:** LLMs often perform better with context. Consider storing a history of recent chat messages and including it in the prompt, staying within the LLM's context window (the maximum prompt length).
* **Filtering:** Implement filtering to prevent the LLM from generating inappropriate or offensive content.

This is a starting point; building a fully functional bridge between a Minecraft server and an Ollama model requires significant effort and experimentation.
Spring AI MCP Server Example Project
Google Drive & Sheets MCP Server
A Model Context Protocol (MCP) server built in Rust for interacting with Google Drive and Google Sheets.
LinkedIn DI MCP Server
The Audiense Digital Intelligence LinkedIn MCP Server is a server based on the Model Context Protocol (MCP) that enables Claude and other MCP-compatible clients to interact with your Audiense LinkedIn DI account.
dice-mcp-server