Discover Awesome MCP Servers

Extend your agent with 16,916 capabilities via MCP servers.

Yandex Metrika MCP

A Model Context Protocol (MCP) server that provides access to Yandex Metrika analytics data through various tools and functions. This server allows AI assistants and applications to retrieve comprehensive analytics data from Yandex Metrika accounts.
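
For reference, a minimal sketch of the kind of Yandex Metrika Reporting API call such a server likely wraps; the counter ID, OAuth token, metrics, and the fetch_visits helper are placeholders for illustration, not the server's actual tool interface:

```python
import requests

# Standard Yandex Metrika Reporting API endpoint.
METRIKA_API = "https://api-metrika.yandex.net/stat/v1/data"

def fetch_visits(counter_id: str, oauth_token: str) -> dict:
    """Fetch daily visits/users for the last week (placeholder parameters)."""
    response = requests.get(
        METRIKA_API,
        headers={"Authorization": f"OAuth {oauth_token}"},
        params={
            "ids": counter_id,              # Metrika counter ID
            "metrics": "ym:s:visits,ym:s:users",
            "dimensions": "ym:s:date",
            "date1": "7daysAgo",
            "date2": "today",
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(fetch_visits("12345678", "YOUR_OAUTH_TOKEN"))
```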

Hono Remote Mcp Sample

A remote MCP server sample (Hono + Cloudflare Workers + Durable Objects)

MCP Memory

Enables AI assistants to remember user information and preferences across conversations using vector search technology. Built on Cloudflare infrastructure with isolated user namespaces for secure, persistent memory storage.

mcp-file-server

MCP File System Server for Claude Desktop

Jira Prompts MCP Server

An MCP server that provides several commands for generating prompts or context from Jira content.

Knowledge Graph Memory Server

An enhanced persistent-memory implementation that uses a local knowledge graph with a customizable --memory-path, allowing Claude to remember information about the user across chats.
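
As a hypothetical illustration only, this is roughly how such a server might be registered in Claude Desktop's claude_desktop_config.json, using the --memory-path flag mentioned above; the launch command, package name, and path are placeholders, so check the server's README for the real invocation:

```python
import json
from pathlib import Path

# Hypothetical Claude Desktop configuration entry for a knowledge-graph
# memory server. Command, package name, and memory path are placeholders.
config = {
    "mcpServers": {
        "memory": {
            "command": "npx",
            "args": [
                "-y",
                "@your-scope/knowledge-graph-memory-server",  # placeholder package
                "--memory-path",
                str(Path.home() / "claude-memory" / "memory.jsonl"),
            ],
        }
    }
}

# claude_desktop_config.json lives in Claude Desktop's config directory;
# the exact location differs per operating system.
print(json.dumps(config, indent=2))
```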

MCP Cheat Engine Server

Provides safe, read-only access to memory analysis and debugging functionality through the Model Context Protocol, allowing users to examine computer memory for software development, security research, and educational purposes.

Openfort MCP Server

Enables AI assistants to interact with Openfort's wallet infrastructure, allowing them to create projects, manage configurations, generate wallets and users, and query documentation through 42 integrated tools.

面试鸭 MCP Server

An MCP server service, built on Spring AI, for searching 面试鸭 (Mianshiya) interview questions, enabling AI to quickly look up real company interview questions and answers.

Pearch

This project provides a tool for searching people using Pearch.ai, implemented as a FastMCP service.

MCPStudio: The Postman for Model Context Protocol

A Postman for MCP servers

Aws Sample Gen Ai Mcp Server

A sample demonstrating how to use generative AI (specifically, Amazon Bedrock) behind an MCP (Model Context Protocol) server. The Python example below treats the "MCP server" generically as an HTTP message-processing endpoint and makes some assumptions about your server's API, so you'll need to adapt it to your specific setup.

**Conceptual Overview**

1. **Receive Message:** The server receives a message (e.g., from a user).
2. **Extract Text:** The relevant text from the message is extracted.
3. **Send to Bedrock:** The text is sent to Amazon Bedrock for processing (e.g., text generation, summarization, translation).
4. **Receive Response:** Bedrock returns a response.
5. **Format Response:** The response is formatted for sending back to the user.
6. **Send to Server:** The formatted response is sent back to the server to be delivered to the user.

**Python Example (using Boto3 for Bedrock)**

```python
import boto3
import json
from flask import Flask, request, jsonify  # Example using Flask for the server

app = Flask(__name__)

# Configure AWS Bedrock
bedrock = boto3.client(
    service_name='bedrock-runtime',
    region_name='your-aws-region',           # Replace with your AWS region (e.g., 'us-east-1')
    endpoint_url='your-bedrock-endpoint-url'  # Replace with your Bedrock endpoint URL if needed
)

model_id = 'anthropic.claude-v2'  # Example model ID. Change as needed.


# Server endpoint (example using Flask)
@app.route('/process_message', methods=['POST'])
def process_message():
    try:
        data = request.get_json()
        user_message = data.get('message')  # Assuming the message is in a 'message' field

        if not user_message:
            return jsonify({'error': 'No message provided'}), 400

        # Call Bedrock
        bedrock_response = invoke_bedrock(user_message)

        # Format the response (adapt to your server's requirements)
        formatted_response = {
            'response': bedrock_response,
            'original_message': user_message
        }

        # Send the response back to the caller (in this example, we just return it)
        return jsonify(formatted_response), 200

    except Exception as e:
        print(f"Error processing message: {e}")
        return jsonify({'error': str(e)}), 500


def invoke_bedrock(prompt):
    """Invokes the Bedrock model with the given prompt."""
    body = json.dumps({
        # Anthropic Claude models on Bedrock expect the Human/Assistant prompt format.
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": 200,  # Adjust as needed
        "temperature": 0.5,           # Adjust as needed
        "top_p": 0.9                  # Adjust as needed
    })

    try:
        response = bedrock.invoke_model(
            modelId=model_id,
            contentType="application/json",
            accept="application/json",
            body=body
        )
        response_body = json.loads(response['body'].read())
        completion = response_body['completion']
        return completion
    except Exception as e:
        print(f"Error invoking Bedrock: {e}")
        return f"Error: {e}"


if __name__ == '__main__':
    app.run(debug=True, port=5000)  # Adjust port as needed
```

**Explanation:**

1. **Imports:** Imports the necessary libraries (boto3 for Bedrock, Flask for the example server, json for data handling).
2. **Bedrock configuration:**
   * `boto3.client('bedrock-runtime', ...)`: Creates a Bedrock client. Replace `'your-aws-region'` with your AWS region (e.g., `'us-east-1'`) and `'your-bedrock-endpoint-url'` with your Bedrock endpoint URL if needed; the endpoint URL is usually only required for a custom endpoint or a specific Bedrock feature.
   * `model_id`: Specifies the Bedrock model to use (e.g., `'anthropic.claude-v2'`). Change this to the model you want; see the Bedrock documentation for available models.
3. **Server (Flask example):**
   * `@app.route('/process_message', methods=['POST'])`: Defines a route that listens for POST requests at `/process_message`, simulating your server receiving a message.
   * `request.get_json()`: Parses the JSON data from the incoming request. Adapt this to how your server receives data.
   * `user_message = data.get('message')`: Extracts the user's message from the JSON data. Adjust the key (`'message'`) to match the structure of your server's messages.
4. **`invoke_bedrock(prompt)` function:**
   * Constructs the JSON payload for the Bedrock API:
     * `prompt`: The user's message (the text you want Bedrock to process).
     * `max_tokens_to_sample`: The maximum number of tokens Bedrock should generate; adjust based on the expected length of the response.
     * `temperature`: Controls the randomness of the output. Higher values (e.g., 0.7) produce more random and creative results; lower values (e.g., 0.2) produce more predictable results.
     * `top_p`: Another randomness parameter, often used in conjunction with `temperature`.
   * `bedrock.invoke_model(...)`: Calls the Bedrock API.
   * Parses the JSON response from Bedrock and extracts the `completion` (the generated text).
   * Includes error handling.
5. **Response formatting:** `formatted_response` is a dictionary containing the Bedrock response and the original message. Adapt this to the format your server expects; you might need to include user IDs, timestamps, or other metadata.
6. **Sending the response:** `return jsonify(formatted_response), 200` returns the formatted response as JSON. In a real deployment you would send this response to the appropriate endpoint for delivery to the user, which might involve a different library (e.g., `requests` to make an HTTP request to your server).
7. **Error handling:** The code includes `try...except` blocks to catch potential errors (e.g., network issues, invalid JSON).

**To Run This Example:**

1. **Install dependencies:**
   ```bash
   pip install boto3 flask
   ```
2. **Configure AWS credentials:** Make sure your AWS credentials are configured correctly; the easiest way is usually the AWS CLI:
   ```bash
   aws configure
   ```
   You'll need an AWS account with access to Bedrock, and you might need to request access to specific models in the Bedrock console.
3. **Replace placeholders:** Replace the placeholder values for `your-aws-region`, `your-bedrock-endpoint-url`, and `model_id`.
4. **Run the script:**
   ```bash
   python your_script_name.py
   ```
5. **Test the endpoint** using `curl` or a tool like Postman:
   ```bash
   curl -X POST -H "Content-Type: application/json" -d '{"message": "Tell me a short story about a cat."}' http://localhost:5000/process_message
   ```

**Important Considerations:**

* **Security:** If you deploy this to production, secure your server and Bedrock API calls with appropriate authentication and authorization mechanisms.
* **Error handling:** Implement robust error handling to gracefully handle failures and provide informative error messages.
* **Rate limiting:** Be aware of Bedrock's rate limits and implement rate limiting in your code to avoid being throttled.
* **Cost:** Bedrock usage can incur costs. Monitor your usage and set budgets to avoid unexpected charges.
* **Asynchronous processing:** For high-volume applications, consider asynchronous processing (e.g., with Celery or AWS SQS) so the server isn't blocked while waiting for Bedrock responses.
* **MCP server integration:** The most important part is adapting the code to your actual MCP server's API: how messages are received, how responses are formatted, and how responses are sent back.

This example should give you a solid starting point for integrating Bedrock with your MCP server; adapt it to your specific requirements and environment.

MCP Weather Server

A Model Context Protocol server that provides tools to fetch weather alerts for US states and forecasts based on latitude/longitude coordinates using the US National Weather Service API.
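
For context, a rough sketch of the underlying National Weather Service API calls a server like this makes; the helper names and the contact address in the User-Agent header are illustrative only:

```python
import requests

NWS_BASE = "https://api.weather.gov"
HEADERS = {"User-Agent": "mcp-weather-demo (contact@example.com)"}  # NWS asks for an identifying UA

def get_alerts(state: str) -> list[str]:
    """Return headlines of active alerts for a two-letter US state code."""
    resp = requests.get(f"{NWS_BASE}/alerts/active", params={"area": state},
                        headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return [f["properties"]["headline"] for f in resp.json()["features"]]

def get_forecast(lat: float, lon: float) -> list[dict]:
    """Resolve a lat/lon to its gridpoint, then fetch the forecast periods."""
    point = requests.get(f"{NWS_BASE}/points/{lat},{lon}", headers=HEADERS, timeout=30)
    point.raise_for_status()
    forecast_url = point.json()["properties"]["forecast"]
    forecast = requests.get(forecast_url, headers=HEADERS, timeout=30)
    forecast.raise_for_status()
    return forecast.json()["properties"]["periods"]

if __name__ == "__main__":
    print(get_alerts("CA")[:3])
    print(get_forecast(38.8894, -77.0352)[0])
```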

PDFSizeAnalyzer-MCP

Enables comprehensive PDF analysis and manipulation including page size analysis, chapter extraction, splitting, compression, merging, and conversion to images. Provides both MCP server interface for AI assistants and Streamlit web interface for direct user interaction.

Open States API MCP Server

This MCP server enables interaction with the Open States API, allowing users to access legislative data from US state governments through natural language commands.

Inoreader MCP Server

Enables intelligent RSS feed management and analysis through Inoreader integration. Supports reading articles, search, bulk operations, and AI-powered content analysis including summarization, trend analysis, and sentiment analysis.

database-updater MCP Server

Mirror

mcp-workflowy

Berghain Events MCP Server

A server that allows AI agents to query and retrieve information about upcoming events at Berghain nightclub through a DynamoDB-backed FastAPI service.
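
A minimal sketch of what the DynamoDB-backed FastAPI side of such a setup might look like; the table name, region, and attribute names are hypothetical:

```python
import boto3
from boto3.dynamodb.conditions import Attr
from fastapi import FastAPI

app = FastAPI()

# Hypothetical table name, region, and attributes; the real service will differ.
events_table = boto3.resource("dynamodb", region_name="eu-central-1").Table("berghain-events")

@app.get("/events")
def list_events(date: str | None = None):
    """Return stored events, optionally filtered by an ISO date string."""
    if date:
        result = events_table.scan(FilterExpression=Attr("date").eq(date))
    else:
        result = events_table.scan()
    return {"events": result.get("Items", [])}
```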

GitHub Integration Hub

Enables AI agents to interact with GitHub through OAuth-authenticated operations including starting authorization flows, listing repositories, and creating issues using stored access tokens.

Shopify MCP Server by CData

Cloud Translation API MCP Server

An MCP (Model Context Protocol) server that enables AI agents to interact with Google's Cloud Translation API for translating text between languages.
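
For illustration, a minimal sketch of the Cloud Translation (v2) client call such a server might delegate to, assuming service-account credentials are already configured; the translate_text helper is hypothetical:

```python
from google.cloud import translate_v2 as translate

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key
# with Cloud Translation access. This is the basic (v2) client, shown only
# as an example of the underlying API.
client = translate.Client()

def translate_text(text: str, target: str = "en") -> str:
    """Translate text into the target language and return the translated string."""
    result = client.translate(text, target_language=target)
    return result["translatedText"]

if __name__ == "__main__":
    print(translate_text("Bonjour tout le monde", target="en"))
```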

Context7 MCP Server

MCP Unity Bridge Asset

An asset to be imported into Unity that hosts a WebSocket server for MCP communication with LLMs.

Crypto Trader MCP Tool

Provides cryptocurrency market data using the CoinGecko API.
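
For context, the kind of CoinGecko request such a tool presumably wraps; the get_price helper is illustrative, not the tool's actual interface:

```python
import requests

COINGECKO = "https://api.coingecko.com/api/v3"

def get_price(coin_id: str = "bitcoin", vs_currency: str = "usd") -> float:
    """Fetch the current spot price for a coin from CoinGecko's public API."""
    resp = requests.get(
        f"{COINGECKO}/simple/price",
        params={"ids": coin_id, "vs_currencies": vs_currency},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()[coin_id][vs_currency]

if __name__ == "__main__":
    print(get_price("bitcoin", "usd"))
```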

PDF Redaction MCP Server

Enables loading, reviewing, and redacting sensitive content in PDF documents through text-based or area-based redaction methods. Supports customizable redaction appearance and saves redacted PDFs with comprehensive error handling.
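
The listing doesn't say which PDF library the server uses; as one plausible approach, here is a minimal text-based redaction sketch using PyMuPDF, with hypothetical file names and search string:

```python
import fitz  # PyMuPDF

def redact_text(src: str, dst: str, needle: str) -> None:
    """Black out every occurrence of `needle` and save a new PDF."""
    doc = fitz.open(src)
    for page in doc:
        for rect in page.search_for(needle):
            page.add_redact_annot(rect, fill=(0, 0, 0))  # draw a black box over the match
        page.apply_redactions()  # permanently remove the underlying text
    doc.save(dst)

if __name__ == "__main__":
    redact_text("input.pdf", "redacted.pdf", "555-12-3456")
```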

MCP Fly Deployer

An MCP server that provides a Dockerfile so that stdio-based MCP servers can be deployed to platforms such as Fly.io.

Spotify MCP Server

Enables interaction with Spotify through natural language for music discovery, playback control, library management, and playlist creation. Supports searching for music, controlling playback, managing saved tracks, and getting personalized recommendations based on mood and preferences.
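
For illustration, a minimal sketch of the Spotify Web API usage that natural-language commands like these would map onto, using the spotipy library as an assumption; the credentials, redirect URI, and scopes are placeholders:

```python
import spotipy
from spotipy.oauth2 import SpotifyOAuth

# Placeholder credentials and scopes; the real server handles OAuth on the user's behalf.
sp = spotipy.Spotify(auth_manager=SpotifyOAuth(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    redirect_uri="http://localhost:8888/callback",
    scope="user-modify-playback-state user-read-playback-state",
))

# Search for a track and start playback on the active device.
results = sp.search(q="Daft Punk Around the World", type="track", limit=1)
track_uri = results["tracks"]["items"][0]["uri"]
sp.start_playback(uris=[track_uri])
```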

MCP Create Server

An MCP server under testing