Discover Awesome MCP Servers
Extend your agent with 16,900 capabilities via MCP servers.
- All (16,900)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
@depthark/css-first
This server integrates with Mozilla Developer Network (MDN) documentation to suggest CSS properties, check browser support, and provide implementation guidance with user consent mechanisms.
3D Asset Processing MCP
Enables processing, validation, optimization, and analysis of 3D models with glTF/GLB support, including format conversion, compression (Meshopt/Draco), texture optimization, and detailed model statistics.
News MCP Server
Aggregates news from 7 APIs and unlimited RSS feeds with AI-powered bias removal and synthesis. Provides over 7,300 free daily requests with conversation-aware caching and 25 comprehensive news analysis tools.
MCP Cheat Engine Server
Provides safe, read-only access to memory analysis and debugging functionality through the Model Context Protocol, allowing users to examine computer memory for software development, security research, and educational purposes.
Openfort MCP Server
Enables AI assistants to interact with Openfort's wallet infrastructure, allowing them to create projects, manage configurations, generate wallets and users, and query documentation through 42 integrated tools.
面试鸭 MCP Server
An MCP server for searching Mianshiya (面试鸭) interview questions, built on Spring AI, that helps AI quickly look up real corporate interview questions and answers.
Pearch
This project provides a tool for searching people using the Pearch.ai API, implemented as a FastMCP service.
MCPStudio: The Postman for Model Context Protocol
"Postman for MCP servers"
AWS Sample Gen-AI MCP Server
A conceptual Python outline for pairing a simple socket-based MCP server (here "MCP" means a generic Message Control Protocol, not the Model Context Protocol used elsewhere in this list) with Amazon Bedrock. The flow: the server receives a text prompt from a client, forwards it to a Bedrock model, and sends the generated response back. A fully functional version depends on your specific protocol and the Bedrock models you intend to use.

```python
import json
import socket
import threading

import boto3  # AWS SDK for Python

# Configuration -- replace with your actual values
MCP_HOST = "localhost"
MCP_PORT = 12345
AWS_REGION = "us-east-1"
BEDROCK_MODEL_ID = "anthropic.claude-v2"  # example model ID

bedrock = boto3.client(service_name="bedrock-runtime", region_name=AWS_REGION)


def generate_text_with_bedrock(prompt, model_id=BEDROCK_MODEL_ID):
    """Send a prompt to Amazon Bedrock and return the generated text."""
    try:
        # The request body format depends on the model family.
        if model_id.startswith("anthropic"):
            # Anthropic text-completion models expect Human/Assistant framing.
            body = json.dumps({
                "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
                "max_tokens_to_sample": 200,  # adjust as needed
                "temperature": 0.5,
                "top_p": 0.9,
            })
        elif model_id.startswith("ai21"):
            body = json.dumps({
                "prompt": prompt,
                "maxTokens": 200,
                "temperature": 0.7,
                "topP": 1,
            })
        else:
            raise ValueError(f"Unsupported model ID: {model_id}")

        response = bedrock.invoke_model(
            modelId=model_id,
            contentType="application/json",
            accept="application/json",
            body=body,
        )
        response_body = json.loads(response["body"].read())

        # The response shape also depends on the model family.
        if model_id.startswith("anthropic"):
            return response_body["completion"]
        return response_body["completions"][0]["data"]["text"]
    except Exception as e:
        print(f"Error generating text: {e}")
        return None


def handle_client(client_socket):
    """Handle one client: read a JSON message, reply with generated text."""
    try:
        data = client_socket.recv(4096).decode("utf-8")
        if not data:
            return
        try:
            message = json.loads(data)
        except json.JSONDecodeError:
            client_socket.sendall(b"ERROR: Invalid JSON")
            return
        prompt = message.get("prompt")
        if not prompt:
            client_socket.sendall(b"ERROR: No prompt provided")
            return
        model_id = message.get("model_id", BEDROCK_MODEL_ID)
        generated = generate_text_with_bedrock(prompt, model_id)
        if generated:
            client_socket.sendall(json.dumps({"response": generated}).encode("utf-8"))
        else:
            client_socket.sendall(b"ERROR: Text generation failed")
    finally:
        client_socket.close()


def start_mcp_server():
    server_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server_socket.bind((MCP_HOST, MCP_PORT))
    server_socket.listen(5)  # allow up to 5 pending connections
    print(f"Server listening on {MCP_HOST}:{MCP_PORT}")
    try:
        while True:
            client_socket, _client_address = server_socket.accept()
            # One thread per client so the server handles clients concurrently.
            threading.Thread(target=handle_client, args=(client_socket,), daemon=True).start()
    except KeyboardInterrupt:
        print("Server shutting down.")
    finally:
        server_socket.close()


if __name__ == "__main__":
    start_mcp_server()
```

Key points and considerations:
- Error handling: the code includes only basic error handling; expand it and log errors to a file for production use.
- Protocol: this example assumes a very simple JSON-based exchange. Adapt `handle_client` to parse and format messages according to your actual protocol.
- Model parameters: `max_tokens_to_sample`, `temperature`, and `top_p` control generation; the available parameters and their meanings vary per model, so consult the Bedrock documentation for the model you use.
- Security: there is no authentication or authorization here. Use IAM roles and policies to control Bedrock access and protect the server from unauthorized use.
- Setup: install Boto3 (`pip install boto3`), configure AWS credentials (`aws configure`), and make sure `BEDROCK_MODEL_ID` and `AWS_REGION` match a model and region your account can access.

Example client:

```python
import json
import socket

MCP_HOST = "localhost"
MCP_PORT = 12345


def send_prompt(prompt):
    client_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    client_socket.connect((MCP_HOST, MCP_PORT))
    client_socket.sendall(json.dumps({"prompt": prompt}).encode("utf-8"))
    response = client_socket.recv(4096).decode("utf-8")
    client_socket.close()
    return response


if __name__ == "__main__":
    print(send_prompt("Write a short story about a cat who goes on an adventure."))
```

This is a simplified example; a production-ready system requires careful attention to security, scalability, and error handling. Always consult the official Amazon Bedrock documentation and the specification of the protocol you are using.
MCP Weather Server
A Model Context Protocol server that provides tools to fetch weather alerts for US states and forecasts based on latitude/longitude coordinates using the US National Weather Service API.
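The US National Weather Service API it uses is public; a minimal sketch of how such a server might construct its request URLs, assuming the documented `api.weather.gov` endpoints (helper names are illustrative, not this server's actual tools):

```python
# Hypothetical URL builders for a weather MCP tool, following the public
# US National Weather Service (api.weather.gov) endpoint conventions.

NWS_BASE = "https://api.weather.gov"


def alerts_url(state: str) -> str:
    """URL for active alerts in a two-letter US state code."""
    return f"{NWS_BASE}/alerts/active?area={state.upper()}"


def points_url(lat: float, lon: float) -> str:
    """URL for point metadata; its response carries the forecast URL."""
    return f"{NWS_BASE}/points/{lat:.4f},{lon:.4f}"


print(alerts_url("ca"))  # https://api.weather.gov/alerts/active?area=CA
print(points_url(38.8894, -77.0352))
```

Forecasts are a two-step lookup: the `/points` response contains the grid-specific forecast URL to fetch next.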
PDFSizeAnalyzer-MCP
Enables comprehensive PDF analysis and manipulation including page size analysis, chapter extraction, splitting, compression, merging, and conversion to images. Provides both MCP server interface for AI assistants and Streamlit web interface for direct user interaction.
Open States API MCP Server
This MCP server enables interaction with the Open States API, allowing users to access legislative data from US state governments through natural language commands.
Inoreader MCP Server
Enables intelligent RSS feed management and analysis through Inoreader integration. Supports reading articles, search, bulk operations, and AI-powered content analysis including summarization, trend analysis, and sentiment analysis.
database-updater MCP Server
Mirror of
mcp-workflowy
Berghain Events MCP Server
A server that allows AI agents to query and retrieve information about upcoming events at Berghain nightclub through a DynamoDB-backed FastAPI service.
GitHub Integration Hub
Enables AI agents to interact with GitHub through OAuth-authenticated operations including starting authorization flows, listing repositories, and creating issues using stored access tokens.
Shopify MCP Server by CData
Cloud Translation API MCP Server
An MCP (Model Context Protocol) server that enables AI agents to interact with Google's Cloud Translation API for translating text between languages.
Context7 MCP Server
MCP Unity Bridge Asset
An asset imported into Unity that hosts a WebSocket server for MCP communication with LLMs.
Crypto Trader MCP Tool
Provides cryptocurrency market data using the CoinGecko API.
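A minimal sketch of a query such a tool might build, assuming CoinGecko's public v3 "simple price" endpoint (the helper name is illustrative, not this server's actual API):

```python
# Hypothetical URL builder following CoinGecko's public v3 endpoint convention.

COINGECKO_BASE = "https://api.coingecko.com/api/v3"


def simple_price_url(coin_ids, vs_currencies=("usd",)):
    """Build a /simple/price query for one or more CoinGecko coin IDs."""
    ids = ",".join(coin_ids)
    vs = ",".join(vs_currencies)
    return f"{COINGECKO_BASE}/simple/price?ids={ids}&vs_currencies={vs}"


print(simple_price_url(["bitcoin", "ethereum"]))
```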
PDF Redaction MCP Server
Enables loading, reviewing, and redacting sensitive content in PDF documents through text-based or area-based redaction methods. Supports customizable redaction appearance and saves redacted PDFs with comprehensive error handling.
Mem0 MCP Server
Provides long-term memory capabilities for MCP clients by wrapping the Mem0 API, enabling semantic search, storage, retrieval, and management of conversation memories across users and agents.
MCP Servers
MCP servers for development work.
browser-mcp
An MCP server that allows AI assistants to interact with the browser, including getting page content as Markdown, modifying page styles, and searching browser history.
FigmaMind MCP Server
Extracts components from Figma designs and transforms them into standardized JSON format for easy consumption by AI models and tools for interface reconstruction.
Roam Research MCP Server
A server that enables AI assistants like Claude to interact with Roam Research graphs through a standardized interface, providing comprehensive tools for content creation, search, retrieval, and optional memory management.
Yes or No MCP
A simple MCP server implementation in TypeScript that communicates over stdio, allowing users to ask questions that end with 'yes or no' to trigger the MCP tool in Cursor.
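For context, stdio-based MCP servers like this one exchange JSON-RPC 2.0 messages over stdin/stdout. A toy Python sketch of that framing, where the dispatcher and the always-"yes" tool behavior are illustrative, not this server's actual code:

```python
import json


def make_request(req_id, method, params):
    """Build a JSON-RPC 2.0 request, serialized as one line for a stdio transport."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id, "method": method, "params": params})


def handle(line: str) -> str:
    """Toy dispatcher: answer any 'tools/call' for a yes/no question with 'yes'."""
    msg = json.loads(line)
    if msg.get("method") == "tools/call":
        result = {"content": [{"type": "text", "text": "yes"}]}
        return json.dumps({"jsonrpc": "2.0", "id": msg["id"], "result": result})
    # Unknown method: standard JSON-RPC "method not found" error.
    return json.dumps({"jsonrpc": "2.0", "id": msg.get("id"),
                       "error": {"code": -32601, "message": "method not found"}})


req = make_request(1, "tools/call",
                   {"name": "ask", "arguments": {"question": "Is it raining, yes or no"}})
print(handle(req))
```

A real server would additionally handle the MCP `initialize` and `tools/list` methods before any tool calls.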