Discover Awesome MCP Servers

Extend your agent with 19,640 capabilities via MCP servers.

Claude Jester MCP

Transforms Claude from a code generator into a programming partner capable of testing, debugging, and optimizing code automatically through a secure execution environment.

@depthark/css-first

This server integrates with Mozilla Developer Network (MDN) documentation to suggest CSS properties, check browser support, and provide implementation guidance with user consent mechanisms.

Model Context Protocol servers

MCP Cheat Engine Server

Provides safe, read-only access to memory analysis and debugging functionality through the Model Context Protocol, allowing users to examine computer memory for software development, security research, and educational purposes.

Openfort MCP Server

Enables AI assistants to interact with Openfort's wallet infrastructure, allowing them to create projects, manage configurations, generate wallets and users, and query documentation through 42 integrated tools.

面试鸭 MCP Server

A Spring AI-based MCP server for the 面试鸭 (Mianshiya) interview-question search service, enabling AI to quickly look up real company interview questions and answers.

Pearch

This project provides a tool for searching people using Pearch.ai, implemented as a FastMCP service.

MCPStudio: The Postman for Model Context Protocol

A Postman for MCP servers.

Aws Sample Gen Ai Mcp Server

Okay, here's a sample code snippet demonstrating how to use generative AI (specifically, Amazon Bedrock) with an MCP server. I'll provide a Python example, as it's commonly used for both Gen-AI and server-side scripting. I'll make some assumptions about your MCP server's API, but you'll need to adapt it to your specific setup.

**Conceptual Overview**

1. **Receive Message:** The MCP server receives a message (e.g., from a user).
2. **Extract Text:** The relevant text from the message is extracted.
3. **Send to Bedrock:** The text is sent to Amazon Bedrock for processing (e.g., text generation, summarization, translation).
4. **Receive Response:** Bedrock returns a response.
5. **Format Response:** The response is formatted for sending back to the user.
6. **Send to MCP Server:** The formatted response is sent back to the MCP server to be delivered to the user.

**Python Example (using Boto3 for Bedrock)**

```python
import boto3
import json
from flask import Flask, request, jsonify  # Example using Flask for the MCP server

app = Flask(__name__)

# Configure AWS Bedrock
bedrock = boto3.client(
    service_name='bedrock-runtime',
    region_name='your-aws-region',  # Replace with your AWS region (e.g., 'us-east-1')
    # endpoint_url='your-bedrock-endpoint-url'  # Only needed for a custom endpoint; otherwise omit
)

model_id = 'anthropic.claude-v2'  # Example model ID. Change as needed.


# MCP Server Endpoint (Example using Flask)
@app.route('/process_message', methods=['POST'])
def process_message():
    try:
        data = request.get_json()
        user_message = data.get('message')  # Assuming the message is in a 'message' field

        if not user_message:
            return jsonify({'error': 'No message provided'}), 400

        # Call Bedrock
        bedrock_response = invoke_bedrock(user_message)

        # Format the response (adapt to your MCP server's requirements)
        formatted_response = {
            'response': bedrock_response,
            'original_message': user_message
        }

        # Send the response back to the MCP server (in this example, we just return it)
        return jsonify(formatted_response), 200

    except Exception as e:
        print(f"Error processing message: {e}")
        return jsonify({'error': str(e)}), 500


def invoke_bedrock(prompt):
    """Invokes the Bedrock model with the given prompt."""
    body = json.dumps({
        # Anthropic Claude models on Bedrock expect the Human/Assistant prompt format
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": 200,  # Adjust as needed
        "temperature": 0.5,           # Adjust as needed
        "top_p": 0.9                  # Adjust as needed
    })

    try:
        response = bedrock.invoke_model(
            modelId=model_id,
            contentType="application/json",
            accept="application/json",
            body=body
        )
        response_body = json.loads(response['body'].read())
        completion = response_body['completion']
        return completion
    except Exception as e:
        print(f"Error invoking Bedrock: {e}")
        return f"Error: {e}"


if __name__ == '__main__':
    app.run(debug=True, port=5000)  # Adjust port as needed
```

**Explanation:**

1. **Imports:** Imports the necessary libraries (boto3 for Bedrock, Flask for the MCP server example, json for data handling).
2. **Bedrock Configuration:**
   * `boto3.client('bedrock-runtime', ...)`: Creates a Bedrock client. **Crucially, replace `'your-aws-region'` with your AWS region (e.g., `'us-east-1'`), and uncomment `endpoint_url` only if you need a custom Bedrock endpoint.** The endpoint URL is usually only needed if you're using a custom endpoint or a specific Bedrock feature.
   * `model_id`: Specifies the Bedrock model to use (e.g., `'anthropic.claude-v2'`). **Change this to the model you want to use.** See the Bedrock documentation for available models.
3. **MCP Server (Flask Example):**
   * `@app.route('/process_message', methods=['POST'])`: Defines a route that listens for POST requests at `/process_message`. This simulates your MCP server receiving a message.
   * `request.get_json()`: Parses the JSON data from the incoming request. **Adapt this to how your MCP server sends data.**
   * `user_message = data.get('message')`: Extracts the user's message from the JSON data. **Adjust the key (`'message'`) to match the structure of your MCP server's messages.**
4. **`invoke_bedrock(prompt)` Function:**
   * Constructs the JSON payload for the Bedrock API. The payload includes:
     * `prompt`: The user's message, wrapped in the `Human:`/`Assistant:` format that Anthropic Claude models on Bedrock require.
     * `max_tokens_to_sample`: The maximum number of tokens Bedrock should generate in its response. Adjust this based on the expected length of the response.
     * `temperature`: Controls the randomness of the output. Higher values (e.g., 0.7) produce more random and creative results; lower values (e.g., 0.2) produce more predictable results.
     * `top_p`: Another parameter that controls randomness. It's often used in conjunction with `temperature`.
   * `bedrock.invoke_model(...)`: Calls the Bedrock API.
   * Parses the JSON response from Bedrock and extracts the `completion` (the generated text).
   * Includes error handling.
5. **Response Formatting:**
   * `formatted_response`: Creates a dictionary containing the Bedrock response and the original message. **Adapt this to the format your MCP server expects.** You might need to include user IDs, timestamps, or other metadata.
6. **Sending the Response:**
   * `return jsonify(formatted_response), 200`: Returns the formatted response as JSON. **In a real MCP server, you would send this response to the appropriate API endpoint to deliver it to the user.** This might involve using a different library (e.g., `requests` to make an HTTP request to your MCP server).
7. **Error Handling:** The code includes `try...except` blocks to catch potential errors (e.g., network issues, invalid JSON).

**To Run This Example:**

1. **Install Dependencies:**
   ```bash
   pip install boto3 flask
   ```
2. **Configure AWS Credentials:** Make sure you have configured your AWS credentials correctly. The easiest way is usually to use the AWS CLI:
   ```bash
   aws configure
   ```
   You'll need an AWS account with access to Bedrock, and you might need to request access to specific models in the Bedrock console.
3. **Replace Placeholders:** Replace the placeholder values for `your-aws-region`, `your-bedrock-endpoint-url` (if used), and `model_id`.
4. **Run the Script:**
   ```bash
   python your_script_name.py
   ```
5. **Test the Endpoint:** You can test the endpoint using `curl` or a tool like Postman:
   ```bash
   curl -X POST -H "Content-Type: application/json" -d '{"message": "Tell me a short story about a cat."}' http://localhost:5000/process_message
   ```

**Important Considerations:**

* **Security:** If you're deploying this to a production environment, make sure to secure your MCP server and Bedrock API calls. Use appropriate authentication and authorization mechanisms.
* **Error Handling:** Implement robust error handling to gracefully handle failures and provide informative error messages.
* **Rate Limiting:** Be aware of Bedrock's rate limits and implement appropriate rate limiting in your code to avoid being throttled.
* **Cost:** Bedrock usage can incur costs. Monitor your usage and set budgets to avoid unexpected charges.
* **Asynchronous Processing:** For high-volume applications, consider using asynchronous processing (e.g., with Celery or AWS SQS) to avoid blocking the MCP server while waiting for Bedrock responses.
* **MCP Server Integration:** The most important part is adapting the code to your specific MCP server's API. You'll need to understand how to receive messages from the server, how to format responses, and how to send responses back to the server.

This comprehensive example should give you a solid starting point for integrating generative AI (Bedrock) with your MCP server. Remember to adapt the code to your specific requirements and environment. Good luck!

MCP Weather Server

A Model Context Protocol server that provides tools to fetch weather alerts for US states and forecasts based on latitude/longitude coordinates using the US National Weather Service API.
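
As a rough illustration of how a server like this might expose its tools, here is a minimal sketch using the FastMCP helper from the official Python SDK and the public NWS endpoints; the tool names, output formatting, and `User-Agent` string are assumptions, not the listed project's actual code.

```python
# Hypothetical sketch of a weather MCP server backed by the NWS API.
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")
NWS_BASE = "https://api.weather.gov"
HEADERS = {"User-Agent": "weather-mcp-demo"}  # NWS asks clients to identify themselves


@mcp.tool()
async def get_alerts(state: str) -> str:
    """Return active NWS alerts for a two-letter US state code."""
    async with httpx.AsyncClient() as client:
        resp = await client.get(f"{NWS_BASE}/alerts/active/area/{state}", headers=HEADERS)
        resp.raise_for_status()
        features = resp.json().get("features", [])
    return "\n".join(f["properties"]["headline"] for f in features) or "No active alerts."


@mcp.tool()
async def get_forecast(latitude: float, longitude: float) -> str:
    """Return the next few forecast periods for a latitude/longitude point."""
    async with httpx.AsyncClient() as client:
        point = await client.get(f"{NWS_BASE}/points/{latitude},{longitude}", headers=HEADERS)
        point.raise_for_status()
        forecast = await client.get(point.json()["properties"]["forecast"], headers=HEADERS)
        forecast.raise_for_status()
    periods = forecast.json()["properties"]["periods"][:3]
    return "\n".join(f'{p["name"]}: {p["shortForecast"]}' for p in periods)


if __name__ == "__main__":
    mcp.run(transport="stdio")
```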

PDFSizeAnalyzer-MCP

Enables comprehensive PDF analysis and manipulation including page size analysis, chapter extraction, splitting, compression, merging, and conversion to images. Provides both MCP server interface for AI assistants and Streamlit web interface for direct user interaction.

Open States API MCP Server

This MCP server enables interaction with the Open States API, allowing users to access legislative data from US state governments through natural language commands.

Inoreader MCP Server

Enables intelligent RSS feed management and analysis through Inoreader integration. Supports reading articles, search, bulk operations, and AI-powered content analysis including summarization, trend analysis, and sentiment analysis.

database-updater MCP Server

Mirror.

mcp-workflowy

Berghain Events MCP Server

A server that allows AI agents to query and retrieve information about upcoming events at Berghain nightclub through a DynamoDB-backed FastAPI service.
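
For orientation, a DynamoDB-backed lookup of this kind typically reduces to a filtered table scan or query behind the service endpoint. The sketch below is a generic illustration; the table name `berghain_events`, its attributes, and the region are assumptions, not the project's actual schema.

```python
# Generic sketch of querying a DynamoDB events table with boto3.
import boto3
from boto3.dynamodb.conditions import Attr

dynamodb = boto3.resource("dynamodb", region_name="eu-central-1")
table = dynamodb.Table("berghain_events")  # assumed table name


def upcoming_events(after_date: str) -> list[dict]:
    """Return events on or after the given ISO date (YYYY-MM-DD)."""
    response = table.scan(FilterExpression=Attr("date").gte(after_date))
    items = response["Items"]
    # DynamoDB paginates scans; follow LastEvaluatedKey until exhausted.
    while "LastEvaluatedKey" in response:
        response = table.scan(
            FilterExpression=Attr("date").gte(after_date),
            ExclusiveStartKey=response["LastEvaluatedKey"],
        )
        items.extend(response["Items"])
    return sorted(items, key=lambda event: event["date"])
```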

GitHub Integration Hub

Enables AI agents to interact with GitHub through OAuth-authenticated operations including starting authorization flows, listing repositories, and creating issues using stored access tokens.
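
Once an OAuth access token has been obtained and stored, the repository and issue operations described here map onto plain GitHub REST calls. A minimal sketch (assuming a token is already available; how it is obtained and stored is left out) might look like this:

```python
# Minimal sketch of GitHub REST calls using a stored OAuth access token.
import httpx

API = "https://api.github.com"


def _headers(token: str) -> dict:
    return {
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
    }


def list_repositories(token: str) -> list[str]:
    """Return full names of repositories visible to the authenticated user."""
    resp = httpx.get(f"{API}/user/repos", headers=_headers(token))
    resp.raise_for_status()
    return [repo["full_name"] for repo in resp.json()]


def create_issue(token: str, owner: str, repo: str, title: str, body: str = "") -> str:
    """Create an issue and return its HTML URL."""
    resp = httpx.post(
        f"{API}/repos/{owner}/{repo}/issues",
        headers=_headers(token),
        json={"title": title, "body": body},
    )
    resp.raise_for_status()
    return resp.json()["html_url"]
```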

Shopify MCP Server by CData

Cloud Translation API MCP Server

An MCP (Model Context Protocol) server that enables AI agents to interact with Google's Cloud Translation API for translating text between languages.
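
Under the hood, a tool like this mostly forwards text to the Cloud Translation REST API. A hedged sketch of that call is shown below, using the v2 endpoint with API-key authentication; the server's actual tool signatures and auth mechanism may differ.

```python
# Sketch of a translate call against the Cloud Translation v2 REST API.
# API-key auth is an assumption; the server may use OAuth or a service account.
import os
import httpx

TRANSLATE_URL = "https://translation.googleapis.com/language/translate/v2"


def translate_text(text: str, target_language: str, source_language: str | None = None) -> str:
    params = {"key": os.environ["GOOGLE_API_KEY"]}
    payload = {"q": text, "target": target_language, "format": "text"}
    if source_language:
        payload["source"] = source_language
    resp = httpx.post(TRANSLATE_URL, params=params, json=payload)
    resp.raise_for_status()
    return resp.json()["data"]["translations"][0]["translatedText"]


if __name__ == "__main__":
    print(translate_text("Hello, world", target_language="de"))
```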

Context7 MCP Server

MCP Unity Bridge Asset

An asset to be imported into Unity that hosts a WebSocket server for MCP communication with LLMs.

Crypto Trader MCP Tool

Provides cryptocurrency market data using the CoinGecko API.
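
CoinGecko's public REST API needs no key for basic price lookups, so a tool backing this kind of server can be as small as the sketch below; the endpoint and coin IDs follow CoinGecko's documented `simple/price` route, while the wrapper function itself is an assumption.

```python
# Sketch of the kind of CoinGecko price lookup such a tool might perform.
import httpx

COINGECKO = "https://api.coingecko.com/api/v3"


def get_price(coin_id: str = "bitcoin", vs_currency: str = "usd") -> float:
    """Return the current spot price of a coin via CoinGecko's simple/price endpoint."""
    resp = httpx.get(
        f"{COINGECKO}/simple/price",
        params={"ids": coin_id, "vs_currencies": vs_currency},
    )
    resp.raise_for_status()
    return resp.json()[coin_id][vs_currency]


if __name__ == "__main__":
    print(f"BTC/USD: {get_price('bitcoin', 'usd')}")
```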

Yes or No MCP

A simple MCP server implementation in TypeScript that communicates over stdio, allowing users to ask questions that end with 'yes or no' to trigger the MCP tool in Cursor.

MCP Server Examples

Built using the pure MCP Java SDK – no Spring Framework required.

Google Slides MCP Server

Enables interaction with Google Slides presentations through OAuth2 authentication. Supports creating new slides, adding rectangles, and managing presentation content through natural language commands.

MCP Product Management System

A comprehensive Model Context Protocol (MCP) server for product inventory management with PostgreSQL database backend, enabling natural language queries for product information across multiple AI platforms.
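
To make the "natural language to inventory lookup" idea concrete, the sketch below shows one way an MCP tool could expose a parameterized product query over PostgreSQL. The `products` table, its columns, and the connection string are assumptions for illustration, not the project's actual schema.

```python
# Hypothetical inventory-lookup tool over PostgreSQL.
import psycopg2
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("product-inventory")
DSN = "dbname=inventory user=app password=secret host=localhost"  # assumed


@mcp.tool()
def find_products(name_contains: str, max_price: float | None = None) -> list[dict]:
    """Return products whose name matches, optionally capped by price."""
    query = "SELECT sku, name, price, quantity FROM products WHERE name ILIKE %s"
    params: list = [f"%{name_contains}%"]
    if max_price is not None:
        query += " AND price <= %s"
        params.append(max_price)
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        cur.execute(query, params)  # parameterized to avoid SQL injection
        rows = cur.fetchall()
    return [
        {"sku": r[0], "name": r[1], "price": float(r[2]), "quantity": r[3]}
        for r in rows
    ]


if __name__ == "__main__":
    mcp.run(transport="stdio")
```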

macOS Tools MCP Server

Provides read-only access to native macOS system utilities including disk management, battery status, network configuration, and system profiling through terminal commands. Enables users to retrieve system information and diagnostics from macOS machines via standardized MCP tools.
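
Read-only system queries on macOS usually shell out to built-in utilities. The sketch below shows the general shape with `pmset`, `diskutil`, and `system_profiler`; the tool names and raw-text return format are assumptions rather than the listed server's actual implementation.

```python
# Sketch of read-only macOS diagnostics tools that shell out to built-in commands.
import subprocess
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("macos-tools")


def _run(args: list[str]) -> str:
    """Run a read-only command and return its stdout."""
    return subprocess.run(args, capture_output=True, text=True, check=True).stdout


@mcp.tool()
def battery_status() -> str:
    """Current battery charge and power source via pmset."""
    return _run(["pmset", "-g", "batt"])


@mcp.tool()
def list_disks() -> str:
    """Disks and partitions via diskutil."""
    return _run(["diskutil", "list"])


@mcp.tool()
def hardware_overview() -> str:
    """Hardware summary via system_profiler."""
    return _run(["system_profiler", "SPHardwareDataType"])


if __name__ == "__main__":
    mcp.run(transport="stdio")
```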

Datastream MCP Server

A Model Context Protocol server that enables interaction with the Google Cloud Datastream API for managing data replication services between various source and destination systems through natural language commands.

Metasploit MCP Server

Provides a bridge between large language models and the Metasploit Framework, enabling AI assistants to access and control penetration testing functionality through natural language.

Swiftcode MCP Server

A Model Context Protocol server that automates code generation for web development, specializing in creating TypeScript API clients from Swagger/OpenAPI specs and Vue.js components for frontend development.