Discover Awesome MCP Servers

Extend your agent with 16,059 capabilities via MCP servers.
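Under the hood, every server in this catalog speaks the same protocol: the agent calls a server's tools via JSON-RPC 2.0 messages. As a rough illustration (the tool name and arguments here are made up), a tool invocation looks like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "query": "example" }
  }
}
```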

Agent MCP

A Multi-Agent Collaboration Protocol server that enables coordinated AI collaboration through task management, context sharing, and agent interaction visualization.

FastMCP Demo Server

A production-ready MCP server that provides hackathon resources and reusable starter prompts. Built with FastMCP framework and includes comprehensive deployment options for development and production environments.

Outlook MCP Server

Enables interaction with Outlook email through Microsoft Graph API. Supports email management operations like reading, searching, marking as read/unread, and deleting messages through natural language.
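Microsoft Graph exposes mail through REST endpoints such as `/v1.0/me/messages`, filtered with OData query options. A minimal sketch of building such a request URL (the helper name and defaults are illustrative; authentication via a bearer token is omitted):

```python
def graph_messages_url(top: int = 10, unread_only: bool = False) -> str:
    """Build a Microsoft Graph URL for listing the signed-in user's messages.

    Uses the /v1.0/me/messages endpoint with OData query options
    ($top, $filter); the actual request also needs an Authorization header.
    """
    base = "https://graph.microsoft.com/v1.0/me/messages"
    options = [f"$top={top}"]
    if unread_only:
        options.append("$filter=isRead eq false")
    return base + "?" + "&".join(options)
```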

Fetch-Save MCP Server

A Model Context Protocol server that enables LLMs to retrieve web content and save it to local files for permanent storage and later access.
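The fetch-then-save workflow this server automates can be sketched with the Python standard library (a minimal illustration, not the server's actual implementation):

```python
import urllib.request
from pathlib import Path

def fetch_and_save(url: str, dest: str) -> int:
    """Fetch a URL and write the response body to a local file.

    Returns the number of bytes written.
    """
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    Path(dest).write_bytes(data)
    return len(data)
```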

CityGML MCP Server

MCP GDB Server

Provides GDB debugging functionality for use with Claude or other AI assistants, letting users manage debugging sessions, set breakpoints, inspect variables, and run GDB commands through natural language.

Fugle MCP Server

MCP Geometry Server

An MCP server that enables AI models to generate precise geometric images by providing Asymptote code, supporting both SVG and PNG output formats.

21st.dev Magic AI Agent

A powerful AI-driven tool that helps developers create beautiful, modern UI components instantly through natural language descriptions.

Intervals.icu MCP Server

Mirror of

Icypeas MCP Server

A Model Context Protocol server that integrates with the Icypeas API to help users find work emails based on name and company information.

Comedy MCP Server

An MCP server built with the C# SDK that integrates JokeAPI to enhance code comments and messages with programming humor. It fetches jokes over HTTP, handles both of JokeAPI's response formats ("single" and "twopart"), and includes error handling for failed requests and malformed JSON.
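The JokeAPI response handling can be sketched compactly (shown in Python for brevity; the function name is illustrative). JokeAPI returns either a `single` joke string or a `twopart` setup/delivery pair, and a robust client handles both plus unexpected shapes:

```python
def joke_text(payload: dict) -> str:
    """Flatten a JokeAPI response into a single display string.

    JokeAPI returns either {"type": "single", "joke": ...} or
    {"type": "twopart", "setup": ..., "delivery": ...}.
    """
    if payload.get("type") == "single":
        return payload.get("joke", "No joke found.")
    if payload.get("type") == "twopart":
        setup = payload.get("setup", "")
        delivery = payload.get("delivery", "")
        return f"{setup} {delivery}".strip()
    return "No joke found."
```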

X MCP Server

Enables users to interact with X (Twitter) through the X API. Supports posting tweets, retrieving user timelines, searching tweets, and replying to tweets with comprehensive error handling.

LINE Bot MCP Server

Model Context Protocol server implementation that integrates the LINE Messaging API to connect AI agents with LINE Official Accounts, enabling agents to send messages to users.

Display & Video 360 API MCP Server

An MCP server that enables interaction with Google's Display & Video 360 advertising platform API, allowing management of digital advertising campaigns through natural language commands.

MCP MySQL Server

Enables interaction with MySQL databases (including AWS RDS and cloud instances) through natural language. Supports database connections, query execution, schema inspection, and comprehensive database management operations.

Meraki Magic MCP

A Python-based MCP server that enables querying Cisco's Meraki Dashboard API to discover, monitor, and manage Meraki environments.

Cursor Rust Tools

An MCP server that gives LLMs in Cursor access to Rust Analyzer, crate docs, and Cargo commands.

Html2url

Remote MCP Server

A cloud-based custom MCP server using Azure Functions that enables saving and retrieving code snippets with secure communication through keys, HTTPS, OAuth, and network isolation options.

V2.ai Insights Scraper MCP

A Model Context Protocol server that scrapes blog posts from V2.ai Insights, extracts content, and provides AI-powered summaries using OpenAI's GPT-4.

MCP with Langchain Sample Setup

A sample setup for a Model Context Protocol (MCP) server and client designed to be compatible with LangChain. It demonstrates a basic request-response pattern for offloading LangChain tasks (prompts, chains, and their inputs) to a separate process or machine, with guidance on serialization (pickle, JSON, or cloudpickle), security, error handling, and asynchronous communication.
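The request-response pattern can be sketched with the Python standard library. This is a minimal illustration, not the sample's actual code: it uses JSON framing instead of pickle (safer across trust boundaries), and a stub that formats a prompt template stands in for a real LangChain chain:

```python
import json
import socket
import threading

def serve_once(handler, host="127.0.0.1"):
    """Start a one-shot server on an ephemeral port; return the port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))
    srv.listen(1)
    port = srv.getsockname()[1]

    def run():
        conn, _ = srv.accept()
        with conn:
            request = json.loads(conn.recv(4096).decode())
            conn.sendall(json.dumps(handler(request)).encode())
        srv.close()

    threading.Thread(target=run, daemon=True).start()
    return port

def fill_prompt(request):
    """Stand-in for the LangChain chain: just format the prompt template."""
    return {"result": request["prompt"].format(**request.get("vars", {}))}

def send_request(port, payload):
    """Send one JSON request to the server and return the decoded response."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.sendall(json.dumps(payload).encode())
        return json.loads(s.recv(4096).decode())
```

In a real deployment the handler would deserialize the request, run an actual chain (e.g. an `LLMChain`), and add authentication and TLS around the socket.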

DataForSEO MCP Server

Enables AI assistants to access comprehensive SEO data through DataForSEO APIs, including SERP results, keyword research, backlink analysis, on-page metrics, and domain analytics. Supports real-time search engine data from Google, Bing, and Yahoo with customizable filtering and multiple deployment options.

Continuo Memory System

Enables persistent memory and semantic search for development workflows with hierarchical compression. Store and retrieve development knowledge across IDE sessions using natural language queries, circumventing context window limitations.
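Retrieval by semantic similarity can be illustrated in miniature. This toy sketch substitutes bag-of-words vectors and cosine similarity for the real neural embeddings (and omits the hierarchical compression the system describes):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; real systems use neural encoders."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query: str, docs: list) -> str:
    """Return the stored document most similar to the query."""
    return max(docs, key=lambda d: cosine(embed(query), embed(d)))
```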

MCP Docker Sandbox Interpreter

A secure Docker-based environment that allows AI assistants to safely execute code without direct access to the host system by running all code within isolated containers.

HDFS MCP Server by CData

Google Search MCP Server

A Model Context Protocol server that provides web and image search capabilities through Google's Custom Search API, allowing AI assistants like Claude to access current information from the internet.
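Google's Custom Search JSON API is queried via GET requests to `https://www.googleapis.com/customsearch/v1` with an API key, a search engine ID (`cx`), and a query; `searchType=image` switches to image results. A minimal sketch of building such a request URL (the helper name is illustrative):

```python
from urllib.parse import urlencode

def cse_url(api_key, cx, query, search_type=None):
    """Build a Google Custom Search JSON API request URL."""
    params = {"key": api_key, "cx": cx, "q": query}
    if search_type:
        # e.g. "image" for Google image search
        params["searchType"] = search_type
    return "https://www.googleapis.com/customsearch/v1?" + urlencode(params)
```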

HaloPSA MCP Server

Enables AI assistants to interact with HaloPSA data through secure OAuth2 authentication. Supports SQL queries against the HaloPSA database, API endpoint exploration, and direct API calls for comprehensive PSA data analysis and management.

Gemini MCP Server

A Model Context Protocol server that enables LLMs to perform web searches using Google's Gemini API and return synthesized responses with citations.

Spring AI MCP Weather Server Sample with WebMVC Starter