Discover Awesome MCP Servers
Extend your agent with 23,495 capabilities via MCP servers.
- All (23,495)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
mcp_sdk_petstore_api_44
A standalone MCP server generated from an OpenAPI specification that exposes Petstore API endpoints as tools for AI assistants. It utilizes SSE transport to enable models to interact with pet store management functionalities through natural language.
MCP SQL Server Pro
Provides direct SQL query access to Microsoft SQL Server databases with full CRUD operations, enabling AI assistants to execute queries, modify data, and manage database objects through a simplified interface.
MCP Crypto Portfolio
Connects Claude AI to KuCoin portfolio and Notion workspace for real-time crypto portfolio management, automated reporting, and AI-powered risk analysis with enterprise-grade security and observability.
Elasticsearch MCP Server by CData
Image Converter MCP Server
Enables conversion between multiple image formats including JPG, PNG, WebP, GIF, BMP, TIFF, SVG, ICO, and AVIF with quality control and batch processing capabilities.
MCP Terminal & Git Server
Enables execution of terminal commands, git operations, and automated setup of React, Vue, and Next.js projects with VSCode integration.
Mirdan
Automatically enhances developer prompts with quality requirements, codebase context, and architectural patterns, then orchestrates other MCP servers to ensure AI coding assistants produce high-quality, structured code that follows best practices and security standards.
FPF Agent Stack
Enables offline AI agent automation with embedded local LLM (Qwen 2.5), sandboxed file operations through AgentFS, and dynamic skill loading. Exposes capabilities via MCP with tri-state safety guards for private, air-gapped environments without network connectivity or API costs.
x64dbg MCP server
An MCP server for the x64dbg debugger.
Internship Scout & Quality of Life MCP Server
Integrates Eurostat quality-of-life metrics and real-time job searching to help users find international internships in high-ranking European cities. It enables ranking cities based on personalized criteria like safety or transport and retrieves structured internship listings via the Tavily API.
Weather MCP
Provides weather query capabilities including current weather, daily/hourly forecasts, air quality data, and weather alerts through QWeather API integration with JWT-based authentication.
Remote MCP Server
A cloud-based custom MCP server using Azure Functions that enables saving and retrieving code snippets with secure communication through keys, HTTPS, OAuth, and network isolation options.
V2.ai Insights Scraper MCP
A Model Context Protocol server that scrapes blog posts from V2.ai Insights, extracts content, and provides AI-powered summaries using OpenAI's GPT-4.
Claude Agents MCP Server
Centrally manages Claude agent definitions, configurations, and custom commands across multiple devices using a SQLite database, eliminating file synchronization conflicts and enabling live updates across all connected Claude sessions.
MCP Server for MySQL
Provides access to MySQL databases with fine-grained access control, supporting multiple databases simultaneously with configurable access modes (readonly, readwrite, full) and table-level permissions using whitelists, blacklists, wildcards, and regex patterns.
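To make the permission layering concrete, here is a hedged sketch of how whitelist, blacklist, wildcard, and regex rules might resolve for a given table. The `RULES` shape, the `re:` prefix convention, and the `table_allowed` helper are illustrative assumptions, not this server's actual configuration format.

```python
import fnmatch
import re

# Hypothetical rule set; the real server's config format may differ.
RULES = {
    "mode": "readonly",                                # readonly | readwrite | full
    "whitelist": ["app_db.users", "app_db.orders_*"],  # glob patterns allowed
    "blacklist": [r"re:.*\.secrets$"],                 # 're:' marks a regex pattern
}

def _matches(pattern: str, table: str) -> bool:
    """Match a table name against either a regex ('re:' prefix) or a glob."""
    if pattern.startswith("re:"):
        return re.fullmatch(pattern[3:], table) is not None
    return fnmatch.fnmatch(table, pattern)

def table_allowed(table: str, rules: dict = RULES) -> bool:
    """Blacklist wins; otherwise the table must match some whitelist entry."""
    if any(_matches(p, table) for p in rules["blacklist"]):
        return False
    return any(_matches(p, table) for p in rules["whitelist"])
```

With this layering, a table blocked by regex stays blocked even if a wildcard whitelist entry would otherwise admit it.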
Markdown MCP Server
An MCP (Model Context Protocol) server for efficiently managing Markdown documents in Cursor AI IDE, supporting CRUD operations, search, and metadata management.
MCP with Langchain Sample Setup
A sample server and client setup compatible with LangChain, using Python and a simple socket-based request-response pattern. It prioritizes clarity over completeness; adapt it to your specific protocol and LangChain use case.

**Important Considerations:**

* **Error Handling:** This is a simplified example. Robust error handling (try/except blocks, connection timeouts, etc.) is crucial for production environments.
* **Security:** Plain sockets are inherently insecure. For sensitive data, use TLS/SSL (e.g., the `ssl` module).
* **Serialization:** This example uses simple UTF-8 strings carrying JSON. For complex data structures, consider `json`, `pickle`, or `protobuf`; choose a method that is efficient and secure.
* **Asynchronous Communication:** For high-performance applications, consider asynchronous libraries like `asyncio` instead of blocking sockets.
* **LangChain Integration:** The integration shown is conceptual. Adapt `process_request` to your LangChain components (chains, agents, memory).
* **Protocol Definition:** Clearly define your message format, commands, and error codes for reliable communication.

**Python Code:**

```python
import json
import socket
import threading


class MCPServer:
    def __init__(self, host='localhost', port=12345):
        self.host = host
        self.port = port
        self.server_socket = None
        self.running = False

    def start(self):
        self.server_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.server_socket.bind((self.host, self.port))
        self.server_socket.listen(5)  # Allow up to 5 queued connections
        print(f"Server listening on {self.host}:{self.port}")
        self.running = True
        while self.running:
            try:
                client_socket, addr = self.server_socket.accept()
                print(f"Accepted connection from {addr}")
                threading.Thread(target=self.handle_client,
                                 args=(client_socket,)).start()
            except OSError:
                break  # Socket was closed (e.g., during shutdown)

    def handle_client(self, client_socket):
        try:
            while True:
                data = client_socket.recv(1024).decode('utf-8')
                if not data:
                    break  # Client disconnected
                print(f"Received: {data}")
                response = self.process_request(data)
                client_socket.sendall(response.encode('utf-8'))
        except Exception as e:
            print(f"Error handling client: {e}")
        finally:
            client_socket.close()
            print("Connection closed.")

    def process_request(self, request):
        """LangChain integration point: parse the request and run your chain."""
        try:
            request_data = json.loads(request)
            question = request_data.get("question")
            if question:
                # Replace this placeholder with your actual chain execution,
                # e.g. result = your_langchain_chain.run(question)
                result = f"LangChain processed: {question}"
                response = json.dumps({"answer": result})
            else:
                response = "Error: No 'question' field in request."
        except json.JSONDecodeError:
            response = "Error: Invalid JSON format."
        except Exception as e:
            response = f"Error processing request: {e}"
        return response

    def stop(self):
        self.running = False
        if self.server_socket:
            self.server_socket.close()
        print("Server stopped.")


class MCPClient:
    def __init__(self, host='localhost', port=12345):
        self.host = host
        self.port = port
        self.client_socket = None

    def connect(self):
        self.client_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            self.client_socket.connect((self.host, self.port))
            print(f"Connected to {self.host}:{self.port}")
        except ConnectionRefusedError:
            print("Connection refused. Is the server running?")
            return False
        return True

    def send_message(self, message):
        try:
            self.client_socket.sendall(message.encode('utf-8'))
            data = self.client_socket.recv(1024).decode('utf-8')
            print(f"Received: {data}")
            return data
        except Exception as e:
            print(f"Error sending/receiving data: {e}")
            return None

    def close(self):
        if self.client_socket:
            self.client_socket.close()
            print("Connection closed.")


if __name__ == "__main__":
    import time

    # Run the server in a daemon thread, since start() blocks on accept().
    server = MCPServer()
    threading.Thread(target=server.start, daemon=True).start()
    time.sleep(0.5)  # Give the server a moment to start

    client = MCPClient()
    if client.connect():
        message = json.dumps({"question": "What is the capital of France?"})
        response = client.send_message(message)
        if response:
            print(f"Server response: {response}")
        client.close()

    server.stop()
```

**How it works:** The server binds a socket, listens, and spawns a thread per client. `handle_client` receives data, calls `process_request`, and sends the response back. `process_request` is the key LangChain integration point: it parses the JSON request and should run your chain or agent (ideally initialized once at server startup rather than per request) before returning a JSON response. The client connects, sends a message, and receives the reply.

**Adapting for LangChain:** install it (`pip install langchain`), import the modules you need, and replace the placeholder in `process_request`. For example, using an `LLMChain`:

```python
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

def process_request(self, request):
    try:
        request_data = json.loads(request)
        question = request_data.get("question")
        if question:
            llm = OpenAI(temperature=0.7)  # Replace with your LLM
            prompt = PromptTemplate(
                input_variables=["question"],
                template="Answer the following question: {question}",
            )
            chain = LLMChain(llm=llm, prompt=prompt)
            result = chain.run(question)
            response = json.dumps({"answer": result})
        else:
            response = "Error: No 'question' field in request."
    except json.JSONDecodeError:
        response = "Error: Invalid JSON format."
    except Exception as e:
        response = f"Error processing request: {e}"
    return response
```

**To run:** save the code as `mcp_example.py` and run `python mcp_example.py`. The server and client start in the same process and exchange one message. Remember to prioritize error handling, security, and a well-defined protocol for a robust and reliable system.
Hurricane Tracker MCP Server
Provides real-time hurricane tracking, 5-day forecast cones, location-based alerts, and historical storm data from NOAA/NHC through MCP tools for AI assistants.
Sequential Questioning MCP Server
A specialized server that enables LLMs to gather specific information through sequential questioning, implementing the MCP standard for seamless integration with LLM clients.
arc-mcp
An MCP server for the Arc browser that enables programmatic management of spaces and tabs. It supports actions like listing, creating, and deleting spaces and tabs, as well as focusing spaces and opening URLs via AppleScript.
Protein MCP Server
Enables searching, retrieving, and downloading protein structure data from the RCSB Protein Data Bank. Supports intelligent protein structure search, comprehensive data retrieval, and multiple file format downloads for bioinformatics research.
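As a rough illustration of the kind of retrieval such a server wraps, the sketch below builds RCSB PDB download URLs. The `files.rcsb.org/download/<ID>.<ext>` pattern is RCSB's public one, but the helper names and format table here are assumptions for illustration, not this server's actual tools.

```python
import urllib.request

# Illustrative format table; RCSB serves .pdb and .cif files from its
# public download host.
FORMATS = {"pdb": ".pdb", "cif": ".cif"}

def structure_url(pdb_id: str, fmt: str = "pdb") -> str:
    """Build the public download URL for a PDB entry."""
    if fmt not in FORMATS:
        raise ValueError(f"unsupported format: {fmt}")
    return f"https://files.rcsb.org/download/{pdb_id.upper()}{FORMATS[fmt]}"

def download_structure(pdb_id: str, fmt: str = "pdb") -> bytes:
    """Fetch the structure file (requires network access)."""
    with urllib.request.urlopen(structure_url(pdb_id, fmt)) as resp:
        return resp.read()
```

Uppercasing the ID and validating the format up front keeps bad requests from ever reaching the network.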
Google Drive HPC Log Analyzer
Enables Claude to directly access Google Drive files and analyze High Performance Computing (HPC) log files. Automatically detects common HPC issues like memory errors, job timeouts, and resource allocation problems.
Metasploit MCP Server
Bridges large language models with the Metasploit Framework to enable natural language control over penetration testing workflows. It provides tools for searching modules, executing exploits, generating payloads, and managing active sessions.
macOS GUI Control MCP
Enables comprehensive control over macOS GUI elements including mouse, keyboard, window management, and screen capture. It allows for automated system interactions and script execution while maintaining safety by blocking potentially destructive deletion commands.
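The deletion-blocking guard described above could, in spirit, look something like this minimal sketch. The pattern list and `is_blocked` helper are hypothetical illustrations, not the server's actual rules.

```python
import re

# Hypothetical blocklist of destructive shell commands; the server's real
# guard rules are not documented here.
DESTRUCTIVE_PATTERNS = [
    r"^\s*(sudo\s+)?rm\b",              # any rm invocation, with or without sudo
    r"^\s*(sudo\s+)?rmdir\b",
    r"\bmkfs(\.\w+)?\b",                # filesystem formatting
    r"^\s*(sudo\s+)?diskutil\s+erase",  # macOS disk erasure
]

def is_blocked(command: str) -> bool:
    """Return True if the command matches a known destructive pattern."""
    return any(re.search(p, command) for p in DESTRUCTIVE_PATTERNS)
```

A real guard would likely also normalize quoting and aliases before matching, since pattern lists alone are easy to evade.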
Fabric MCP Server
Provides access to Daniel Miessler's Fabric AI prompts (patterns and strategies) through MCP, automatically syncing with the upstream repository to enable powerful prompt templates in AI workflows.
MyWeight MCP Server
A server that connects to the Health Planet API to fetch and provide weight measurement data through any MCP-compatible client, allowing for retrieval and analysis of personal weight records.
Uber External Ads API MCP Server
Enables users to manage Uber advertising campaigns through natural language by providing access to Uber's External Ads API. Supports campaign creation, retrieval, updating, and deletion with comprehensive filtering and configuration options.
Browse Together MCP
Enables collaborative browsing with an AI while you edit code: a headed, Playwright-controlled browser with an accompanying MCP server.
Aseprite MCP Tools
An MCP server for interacting with the Aseprite API.
NocoDB MCP Server
Enables direct integration with NocoDB databases from Cursor IDE, providing complete CRUD operations, search capabilities, and specialized tools for Discord workflow automation. Features production-ready deployment with Docker support and comprehensive monitoring.