Discover Awesome MCP Servers

Extend your agent with 28,410 capabilities via MCP servers.

Brickognize MCP Server

Identifies LEGO parts, sets, and minifigures from local image files using the Brickognize API. It provides specialized tools for specific item recognition and integrates LEGO identification capabilities into MCP-enabled environments.

stacksfinder-mcp

Tech stack recommendations for developers. Deterministic 6-dimension scoring across 30+ technologies. 4 free tools, Pro features with API key.

RhinoCommon MCP

Enables Claude to access Rhino 8 RhinoCommon API documentation for accurate code generation when developing Rhino plugins, providing class details, method signatures, and code examples.

MCP MongoDB Integration

This project demonstrates integrating MongoDB with the Model Context Protocol (MCP) to give AI assistants database interaction capabilities.

MCP GDB Server

Provides GDB debugging functionality for use with Claude or other AI assistants, allowing users to manage debugging sessions, set breakpoints, examine variables, and run GDB commands through natural language.

Fugle MCP Server

MCP Geometry Server

An MCP server that enables AI models to generate precise geometric images by providing Asymptote code, supporting both SVG and PNG output formats.

Argus

One endpoint, five search providers. Search broker for AI agents with automatic fallback, RRF ranking, and budget enforcement. The LiteLLM of web search.
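
The RRF (Reciprocal Rank Fusion) ranking mentioned above is a standard technique for merging ranked lists from multiple providers. A minimal Python sketch follows; the provider result lists and the conventional constant k=60 are illustrative assumptions, not Argus's actual implementation:

```python
from collections import defaultdict

def rrf_merge(rankings, k=60):
    """Merge ranked result lists with Reciprocal Rank Fusion.

    Each item's fused score is the sum of 1/(k + rank) over every
    provider list it appears in; higher fused score ranks first.
    """
    scores = defaultdict(float)
    for ranked in rankings:
        for rank, url in enumerate(ranked, start=1):
            scores[url] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical result lists from two search providers:
provider_a = ["a.com", "b.com", "c.com"]
provider_b = ["b.com", "d.com", "a.com"]
print(rrf_merge([provider_a, provider_b]))  # → ['b.com', 'a.com', 'd.com', 'c.com']
```

"b.com" wins because ranks 2 and 1 across the two providers fuse to a higher score than "a.com"'s ranks 1 and 3.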

Unloop MCP

Detects and breaks repetitive fix loops in AI coding assistants by tracking attempts and providing escalating intervention strategies. It utilizes error fingerprinting and similarity analysis to redirect the AI toward new approaches when it gets stuck on the same error.
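
Error fingerprinting of the kind described above can be sketched as normalizing away volatile details and hashing the rest, so repeated occurrences of the same error collide. This is an illustrative sketch of the general technique, not Unloop's actual code:

```python
import hashlib
import re

def fingerprint(error_message: str) -> str:
    """Reduce an error message to a stable fingerprint by stripping
    volatile details (hex addresses, line numbers, paths) and hashing
    what remains, so repeats of the same error match."""
    normalized = error_message.lower()
    normalized = re.sub(r"0x[0-9a-f]+", "<addr>", normalized)  # hex addresses
    normalized = re.sub(r"line \d+", "line <n>", normalized)   # line numbers
    normalized = re.sub(r"(/[\w.-]+)+", "<path>", normalized)  # unix paths
    return hashlib.sha256(normalized.encode()).hexdigest()[:16]

# Two occurrences of "the same" error at different paths and lines
# should collide on one fingerprint:
a = fingerprint("TypeError at /app/src/main.py, line 42: 'NoneType' has no len()")
b = fingerprint("TypeError at /tmp/build/main.py, line 97: 'NoneType' has no len()")
print(a == b)  # → True
```

A loop detector would count hits per fingerprint and escalate its intervention once a threshold is crossed.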

mobile-device-mcp

MCP server that gives AI coding assistants the ability to see and interact with mobile devices. 49 tools for Android/iOS — AI-powered visual analysis (Claude + Gemini), smart tap/type by description, Flutter widget tree inspection, video recording, and test script generation. 4-tier element search with <1ms local matching. Free tier included, zero setup via npx.

21st.dev Magic AI Agent

A powerful AI-driven tool that helps developers create beautiful, modern UI components instantly through natural language descriptions.

Intervals.icu MCP Server

Mirror of

Icypeas MCP Server

A Model Context Protocol server that integrates with the Icypeas API to help users find work emails based on name and company information.

Comedy MCP Server

An MCP server built with the C# SDK that enhances code comments with programming jokes fetched from the JokeAPI, letting AI assistants append a lighthearted touch to generated code.
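
The JokeAPI response handling involved can be sketched offline in Python. The field names follow the public JokeAPI v2 schema (an `error` flag, and a `type` of either `single` or `twopart`); the function itself is an illustrative assumption, not the server's actual code:

```python
def format_joke(payload: dict) -> str:
    """Turn a JokeAPI v2 response payload into a one-string joke.

    JokeAPI returns either {"type": "single", "joke": ...} or
    {"type": "twopart", "setup": ..., "delivery": ...}.
    """
    if payload.get("error"):
        return "Error fetching joke."
    if payload.get("type") == "single":
        return payload["joke"]
    if payload.get("type") == "twopart":
        return f"{payload['setup']}\n{payload['delivery']}"
    return "Could not parse joke."

# Example payload in the two-part shape:
resp = {"error": False, "type": "twopart",
        "setup": "Why do programmers prefer dark mode?",
        "delivery": "Because light attracts bugs."}
print(format_joke(resp))
```

A live server would fetch the payload from `https://v2.jokeapi.dev/joke/Programming?safe-mode` (with rate limiting and error handling) before formatting it.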

Image Converter MCP Server

Enables conversion between multiple image formats including JPG, PNG, WebP, GIF, BMP, TIFF, SVG, ICO, and AVIF with quality control and batch processing capabilities.

MCP Terminal & Git Server

Enables execution of terminal commands, git operations, and automated setup of React, Vue, and Next.js projects with VSCode integration.

Mirdan

Automatically enhances developer prompts with quality requirements, codebase context, and architectural patterns, then orchestrates other MCP servers to ensure AI coding assistants produce high-quality, structured code that follows best practices and security standards.

x64dbg MCP server

An MCP server for the x64dbg debugger.

Internship Scout & Quality of Life MCP Server

Integrates Eurostat quality-of-life metrics and real-time job searching to help users find international internships in high-ranking European cities. It enables ranking cities based on personalized criteria like safety or transport and retrieves structured internship listings via the Tavily API.

fal-mcp

An MCP server that integrates fal.ai's image generation and editing capabilities into MCP-compatible clients. It enables text-to-image generation, style application via LoRAs, and image editing using natural language instructions.

Weather MCP

Provides weather query capabilities including current weather, daily/hourly forecasts, air quality data, and weather alerts through QWeather API integration with JWT-based authentication.

Zero Network MCP Server

Provides AI agents with access to Zero Network documentation, SDK integration guides, and utility tools for crypto-based payments. It enables developers to implement x402 paywalls and per-tool MCP pricing while offering real-time cost estimations and revenue calculations.

Protein MCP Server

Enables searching, retrieving, and downloading protein structure data from the RCSB Protein Data Bank. Supports intelligent protein structure search, comprehensive data retrieval, and multiple file format downloads for bioinformatics research.

MCP with Langchain Sample Setup

A sample setup for a minimal message-passing server and client in Python that exchange JSON over TCP sockets, intended as a communication channel for sending prompts to and receiving responses from a LangChain agent or chain.
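
The exchange involved can be condensed into a short runnable Python sketch: JSON over TCP sockets, with a simple echo standing in for the LangChain call. The host, port handling, and message shapes are illustrative assumptions:

```python
import json
import socket
import threading

def serve_once(server_sock):
    """Accept one client, echo its JSON message back, then exit.
    A real server would pass the decoded message to a LangChain
    chain here instead of echoing."""
    conn, _ = server_sock.accept()
    with conn:
        message = json.loads(conn.recv(4096).decode("utf-8"))
        reply = {"response": f"Server received: {message}"}
        conn.sendall(json.dumps(reply).encode("utf-8"))

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
port = server.getsockname()[1]
server.listen(1)
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(json.dumps({"prompt": "What is the capital of France?"}).encode("utf-8"))
response = json.loads(client.recv(4096).decode("utf-8"))
print(response)
client.close()
server.close()
```

For production use you would add framing for messages larger than one `recv`, error handling, and authentication; `asyncio` is a better fit than one thread per client at higher concurrency.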

Hurricane Tracker MCP Server

Provides real-time hurricane tracking, 5-day forecast cones, location-based alerts, and historical storm data from NOAA/NHC through MCP tools for AI assistants.

Sequential Questioning MCP Server

A specialized server that enables LLMs to gather specific information through sequential questioning, implementing the MCP standard for seamless integration with LLM clients.

Spotinst MCP Server

An MCP server for the Spot.io API that enables management of AWS and Azure Ocean clusters across multiple accounts. It provides tools for cluster inventory, node management, cost analysis, and scaling operations through natural language.

Plasmate

Agent-native headless browser for AI agents. Converts web pages to a Semantic Object Model (SOM) instead of raw HTML — 17x average token reduction across real-world sites (up to 117x on complex pages). Native MCP server with fetch_page, extract_text, extract_links, and full browser automation. No API key required.

WuWa MCP Server

Enables querying detailed information about characters, echoes, and character profiles from the Wuthering Waves game, returning results in LLM-optimized Markdown format.

Html2url
