Discover Awesome MCP Servers
Extend your agent's capabilities with 24,162 MCP servers.
- All 24,162
- Developer Tools 3,867
- Search 1,714
- Research & Data 1,557
- AI Integration Systems 229
- Cloud Platforms 219
- Data & App Analysis 181
- Database Interaction 177
- Remote Shell Execution 165
- Browser Automation 147
- Databases 145
- Communication 137
- AI Content Generation 127
- OS Automation 120
- Programming Docs Access 109
- Content Fetching 108
- Note Taking 97
- File Systems 96
- Version Control 93
- Finance 91
- Knowledge & Memory 90
- Monitoring 79
- Security 71
- Image & Video Processing 69
- Digital Note Management 66
- AI Memory Systems 62
- Advanced AI Reasoning 59
- Git Management Tools 58
- Cloud Storage 51
- Entertainment & Media 43
- Virtualization 42
- Location Services 35
- Web Automation & Stealth 32
- Media Content Processing 32
- Calendar Management 26
- Ecommerce & Retail 18
- Speech Processing 18
- Customer Data Platforms 16
- Travel & Transportation 14
- Education & Learning Tools 13
- Home Automation & IoT 13
- Web Search Integration 12
- Health & Wellness 10
- Customer Support 10
- Marketing 9
- Games & Gamification 8
- Google Cloud Integrations 7
- Art & Culture 4
- Language Translation 3
- Legal & Compliance 2
NotePlan MCP Server
A Model Context Protocol server that enables Claude Desktop to interact with NotePlan.co, allowing users to query, search, create, and update notes directly from Claude conversations.
TermPipe MCP
Provides AI assistants with direct terminal access to execute commands, manage files, and run persistent REPL sessions. Includes automated installation scripts that teach AI assistants its capabilities for seamless integration.
LiblibAI Picture Generator
Enables AI image generation through LiblibAI API with natural language prompts. Supports various art styles, real-time progress tracking, and account credit management.
Integrator MCP Server
A Model Context Protocol server that allows AI assistants to invoke and interact with Integrator automation workflows through an API connection.
sliverc2-mcp
Workday MCP Server by CData
QGISMCP
Connects QGIS to Claude AI via the Model Context Protocol, enabling AI-assisted project creation, layer manipulation, processing algorithm execution, and Python code execution inside QGIS.
Dummy MCP Server
A simple Model Context Protocol server built with the FastMCP framework that provides 'echo' and 'dummy' tools via Server-Sent Events for demonstration and testing purposes.
Mcp
A hobby project for testing MCP servers.
Google Sheets API MCP Server
Odoo MCP Server
Enables AI assistants to interact with Odoo ERP systems through XML-RPC communication. Provides access to Odoo models, records, methods, and data structures for comprehensive ERP integration.
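Odoo's external API is XML-RPC, so the calls such a server makes under the hood can be sketched with Python's standard library alone. The URL, database name, and credentials below are hypothetical placeholders:

```python
import xmlrpc.client

# Hypothetical instance details -- replace with your own Odoo deployment.
URL, DB, USER, PASSWORD = "https://example.odoo.com", "mydb", "admin", "secret"

def search_read(model, domain, fields, limit=5):
    """Authenticate, then read matching records from an Odoo model."""
    common = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/common")
    uid = common.authenticate(DB, USER, PASSWORD, {})
    models = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/object")
    return models.execute_kw(DB, uid, PASSWORD, model, "search_read",
                             [domain], {"fields": fields, "limit": limit})

if __name__ == "__main__":
    # e.g. list the first five companies in the partner table
    print(search_read("res.partner", [["is_company", "=", True]], ["name"]))
```

`search_read` is one of several model methods reachable through `/xmlrpc/2/object`; `create`, `write`, and `unlink` follow the same `execute_kw` pattern.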
Mem0 Memory MCP Server
Enables AI agents to store and retrieve memories with user-specific context using Mem0, allowing them to maintain conversation history and make informed decisions based on past interactions.
News MCP Server
Aggregates news from 7 APIs and unlimited RSS feeds with AI-powered bias removal and synthesis. Provides over 7,300 free daily requests with conversation-aware caching and 25 comprehensive news analysis tools.
Flutter Package MCP Server
Integrates the Pub.dev API with AI assistants to provide real-time Flutter package information, documentation, and trend analysis. It enables users to search for packages, compare versions, and evaluate quality scores through natural language commands.
Spotify MCP Server
Enables interaction with Spotify through natural language for music discovery, playback control, library management, and playlist creation. Supports searching for music, controlling playback, managing saved tracks, and getting personalized recommendations based on mood and preferences.
Openfort MCP Server
Enables AI assistants to interact with Openfort's wallet infrastructure, allowing them to create projects, manage configurations, generate wallets and users, and query documentation through 42 integrated tools.
面试鸭 MCP Server
An MCP server built on Spring AI for the 面试鸭 (Mianshiya) interview-question search service, enabling AI to quickly look up real company interview questions and answers.
Zen MCP Server
Orchestrates multiple AI models (Gemini, OpenAI, Claude, local models) within a single conversation context, enabling collaborative workflows like multi-model code reviews, consensus building, and CLI-to-CLI bridging for specialized tasks.
OSINT MCP Server
Exposes popular OSINT and reconnaissance tools like Sherlock, SpiderFoot, and Holehe through MCP and HTTP APIs for AI assistants. Runs security research tools in sandboxed environments and returns normalized JSON results for investigation and analysis.
Somnia MCP Server
Enables interaction with Somnia blockchain data, providing tools to retrieve block information, token balances, transaction history, and NFT metadata through the ORMI API.
Aws Sample Gen Ai Mcp Server
A sample showing how to use generative AI on Amazon Bedrock behind a simple "MCP" server (here just a TCP listener, not a full Model Context Protocol implementation). The write-up walks through a simplified workflow: the server receives a text prompt, forwards it to a Bedrock model, and returns the completion to the client. A condensed version of its Python example, where the host, port, region, model ID, and origin list are placeholders to replace with your own values:

```python
import json
import socket

import boto3

# Placeholder configuration -- replace with your actual values.
MCP_HOST, MCP_PORT = "localhost", 12345
BEDROCK_MODEL_ID = "anthropic.claude-v2"
ACCEPTABLE_ORIGINS = {"localhost", "127.0.0.1"}

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def invoke_bedrock(prompt):
    """Send a prompt to Bedrock; the request body format is model-dependent."""
    body = json.dumps({
        # Claude text-completion models expect Human/Assistant framing.
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": 200,
    })
    response = bedrock.invoke_model(modelId=BEDROCK_MODEL_ID,
                                    contentType="application/json",
                                    accept="application/json", body=body)
    return json.loads(response["body"].read())["completion"]

def serve():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.bind((MCP_HOST, MCP_PORT))
        server.listen(5)
        while True:
            client, _ = server.accept()
            with client:
                try:
                    request = json.loads(client.recv(4096).decode("utf-8"))
                except json.JSONDecodeError:
                    client.sendall(b"Invalid JSON")
                    continue
                # Minimal origin check -- a placeholder, not real security.
                if request.get("origin") not in ACCEPTABLE_ORIGINS:
                    client.sendall(b"Origin not allowed")
                    continue
                client.sendall(invoke_bedrock(request["prompt"]).encode("utf-8"))

if __name__ == "__main__":
    serve()
```

The IAM role or user running the script must have permission to invoke the chosen Bedrock model. As the sample itself stresses, this is a starting point only: production use needs real authentication and TLS in place of the origin check, robust error handling and logging, input validation, rate limiting against Bedrock's quotas, and an asynchronous or message-queue architecture for scale.
nix-mcp-servers
A Nix package repository for MCP servers.
Brummer MCP Server
Allows external tools (VSCode, Claude Code, etc.) to access log output and errors, execute commands asynchronously, and monitor process status.
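The asynchronous command execution and process monitoring described above can be illustrated with a minimal standard-library sketch (a pattern illustration, not Brummer's actual implementation):

```python
import subprocess
import sys
import time

def run_async(cmd):
    """Start a command without blocking and return its process handle."""
    return subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE, text=True)

# Launch a child process, then poll its status until it exits.
proc = run_async([sys.executable, "-c", "print('done')"])
while proc.poll() is None:
    time.sleep(0.05)
out, err = proc.communicate()  # collect captured log output and errors
```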
jlbloomer-mcp-server
A personal MCP server for AI assistant integration that provides custom tools, resources, and prompts for use with Claude Desktop and other MCP-compatible clients.
mcp-workflowy
MCP Knowledge Base Server
Provides semantic search and data retrieval capabilities over a knowledge base with multiple tools including keyword search, category filtering, and ID-based lookup with in-memory caching.
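The tool set described (keyword search, category filtering, ID-based lookup, in-memory caching) boils down to a few lines of Python; the knowledge-base entries below are invented for illustration:

```python
from functools import lru_cache

# A hypothetical in-memory knowledge base; a real server would load this from disk.
KB = [
    {"id": 1, "category": "mcp", "text": "MCP servers expose tools to AI clients."},
    {"id": 2, "category": "search", "text": "Keyword search scans document text."},
]

def keyword_search(keyword, category=None):
    """Case-insensitive keyword search with optional category filtering."""
    return [d for d in KB
            if keyword.lower() in d["text"].lower()
            and (category is None or d["category"] == category)]

@lru_cache(maxsize=128)
def get_by_id(doc_id):
    """ID-based lookup, cached in memory for repeated reads."""
    return next((d for d in KB if d["id"] == doc_id), None)
```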
Grafana UI MCP Server
Provides AI assistants with comprehensive access to Grafana's React component library, including TypeScript source code, MDX documentation, Storybook examples, test files, and design system tokens for building Grafana-compatible interfaces.
Shopify MCP Server by CData
Doubao Image Description MCP Server
Enables visual understanding and image description capabilities for iFlow CLI and Claude Desktop using Doubao's vision model. It supports 18 image formats and features automatic optimization, caching, and Chinese language optimization.
MCP Integration Servers
MCP servers.