Discover Awesome MCP Servers
Extend your agent with 23,553 capabilities via MCP servers.
- All (23,553)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
Openfort MCP Server
Enables AI assistants to interact with Openfort's wallet infrastructure, allowing them to create projects, manage configurations, generate wallets and users, and query documentation through 42 integrated tools.
面试鸭 MCP Server
An MCP server for Interview Duck (面试鸭) question search, built on Spring AI, enabling AI to quickly look up real corporate interview questions and answers.
Zen MCP Server
Orchestrates multiple AI models (Gemini, OpenAI, Claude, local models) within a single conversation context, enabling collaborative workflows like multi-model code reviews, consensus building, and CLI-to-CLI bridging for specialized tasks.
Somnia MCP Server
Enables interaction with Somnia blockchain data, providing tools to retrieve block information, token balances, transaction history, and NFT metadata through the ORMI API.
AWS Sample Gen AI MCP Server
A sample showing how to invoke an Amazon Bedrock model through an MCP server endpoint using `boto3`:

```python
import boto3
import json
import os

# --- Configuration ---
MODEL_ID = "anthropic.claude-v2"  # Or another supported model
ACCEPT = "application/json"
CONTENT_TYPE = "application/json"
# Replace with your MCP server endpoint if needed
MCP_SERVER_ENDPOINT = os.environ.get("MCP_SERVER_ENDPOINT", "http://localhost:8080")


# --- Helper Functions ---
def invoke_model(prompt, max_tokens=200, temperature=0.5, top_p=0.9):
    """
    Invokes the Bedrock model through the MCP server.

    Args:
        prompt (str): The prompt to send to the model.
        max_tokens (int): The maximum number of tokens to generate.
        temperature (float): Controls the randomness of the output.
        top_p (float): Controls the diversity of the output.

    Returns:
        str: The generated text from the model, or None if an error occurred.
    """
    try:
        # Construct the request body
        body = json.dumps({
            "prompt": prompt,
            "max_tokens_to_sample": max_tokens,
            "temperature": temperature,
            "top_p": top_p,
            "modelId": MODEL_ID,  # Include modelId for MCP server routing
            "accept": ACCEPT,
            "contentType": CONTENT_TYPE
        })

        # Use boto3 to invoke the MCP server (assuming it is running as an endpoint)
        bedrock = boto3.client(
            'bedrock-runtime',
            endpoint_url=MCP_SERVER_ENDPOINT,
            region_name="us-east-1"  # A region is required, but its value does not matter for a custom endpoint
        )

        response = bedrock.invoke_model(
            modelId=MODEL_ID,
            contentType=CONTENT_TYPE,
            accept=ACCEPT,
            body=body
        )

        response_body = json.loads(response.get('body').read())
        return response_body.get('completion')  # Adjust based on the model's response format

    except Exception as e:
        print(f"Error invoking model: {e}")
        return None


# --- Main Execution ---
if __name__ == "__main__":
    prompt = "Write a short poem about the ocean."
    generated_text = invoke_model(prompt)

    if generated_text:
        print("Generated Text:")
        print(generated_text)
    else:
        print("Failed to generate text.")
```

**Key points:**

* **MCP server endpoint:** The code reads `MCP_SERVER_ENDPOINT` from an environment variable rather than hardcoding it, defaulting to `http://localhost:8080`. You must change this to the address where your MCP server is actually running.
* **`modelId` in the request body:** The `modelId` is included in the JSON body so the MCP server can route the request to the appropriate model.
* **`boto3.client('bedrock-runtime', endpoint_url=...)`:** Passing `endpoint_url` tells `boto3` to send requests to your MCP server instead of the real AWS Bedrock service. `region_name` is still required, but its value does not matter when using a custom endpoint; `us-east-1` is used as a common default.
* **Error handling:** A `try...except` block catches errors during model invocation; add more robust, exception-specific handling for production use.
* **Response parsing:** `response_body.get('completion')` assumes the response is a JSON object with a `completion` field. Adjust this based on the actual response format of the model you are using (e.g., Claude v2); check that model's documentation.

**How to run:**

1. Install boto3: `pip install boto3`
2. Set the `MCP_SERVER_ENDPOINT` environment variable to the address of your MCP server. On Linux/macOS:
   ```bash
   export MCP_SERVER_ENDPOINT="http://your-mcp-server:8080"
   ```
   On Windows:
   ```powershell
   $env:MCP_SERVER_ENDPOINT="http://your-mcp-server:8080"
   ```
   Replace `http://your-mcp-server:8080` with the actual address.
3. Run the script: `python your_script_name.py`

**Important considerations:**

* **MCP server setup:** The code assumes you already have a working MCP server configured to route requests to Bedrock; setting up that server is a separate process.
* **Authentication:** If your MCP server requires authentication, add the appropriate authentication headers or AWS credentials to the `invoke_model` call; the exact details depend on how your server is configured.
* **Model-specific parameters:** `max_tokens`, `temperature`, and `top_p` are common to many language models, but the available parameters and their meanings vary; consult the documentation for the specific model you are using.
* **Security:** Do not expose the MCP server to the public internet without appropriate authentication and authorization mechanisms.
PDFSizeAnalyzer-MCP
Enables comprehensive PDF analysis and manipulation including page size analysis, chapter extraction, splitting, compression, merging, and conversion to images. Provides both MCP server interface for AI assistants and Streamlit web interface for direct user interaction.
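As a rough illustration of the page-size analysis and splitting it describes, here is a minimal sketch assuming the pypdf library (the server's actual implementation and dependencies are not documented here):

```python
# Hypothetical sketch of page-size analysis and splitting with pypdf;
# PDFSizeAnalyzer-MCP's actual implementation may differ.
from pypdf import PdfReader, PdfWriter

def analyze_page_sizes(path: str) -> list[tuple[float, float]]:
    """Return (width, height) in PDF points for every page."""
    reader = PdfReader(path)
    return [(float(p.mediabox.width), float(p.mediabox.height)) for p in reader.pages]

def split_pdf(path: str, out_prefix: str) -> None:
    """Write each page of the input PDF to its own file."""
    reader = PdfReader(path)
    for i, page in enumerate(reader.pages):
        writer = PdfWriter()
        writer.add_page(page)
        with open(f"{out_prefix}_{i + 1}.pdf", "wb") as f:
            writer.write(f)

if __name__ == "__main__":
    for i, (w, h) in enumerate(analyze_page_sizes("input.pdf"), start=1):
        print(f"Page {i}: {w:.0f} x {h:.0f} pt")
```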
database-updater MCP Server
Mirror of
mcp-workflowy
mcp-workflowy
GitHub Integration Hub
Enables AI agents to interact with GitHub through OAuth-authenticated operations including starting authorization flows, listing repositories, and creating issues using stored access tokens.
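For reference, creating an issue with a stored access token reduces to a single call against GitHub's REST API; a hedged sketch using the requests library (standard GitHub API usage, not taken from this server's code, with placeholder values):

```python
import requests

def create_issue(owner: str, repo: str, token: str, title: str, body: str = "") -> dict:
    """Create a GitHub issue using a stored OAuth access token."""
    resp = requests.post(
        f"https://api.github.com/repos/{owner}/{repo}/issues",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        json={"title": title, "body": body},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Example with hypothetical values:
# issue = create_issue("octocat", "hello-world", "gho_xxx", "Bug: crash on start")
# print(issue["html_url"])
```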
MCP Knowledge Base Server
Provides semantic search and data retrieval capabilities over a knowledge base with multiple tools including keyword search, category filtering, and ID-based lookup with in-memory caching.
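A rough sketch of what keyword search, category filtering, and cached ID lookup over such a knowledge base can look like (illustrative only; the server's actual data model and tools are not shown here):

```python
from functools import lru_cache

# Hypothetical in-memory knowledge base; the real server's data source may differ.
DOCUMENTS = {
    "kb-1": {"category": "protocol", "text": "MCP servers expose tools over a standard interface."},
    "kb-2": {"category": "storage", "text": "Lookups are cached in memory to avoid repeated reads."},
}

@lru_cache(maxsize=1024)
def get_by_id(doc_id: str) -> str | None:
    """ID-based lookup with in-memory caching."""
    doc = DOCUMENTS.get(doc_id)
    return doc["text"] if doc else None

def keyword_search(query: str, category: str | None = None) -> list[str]:
    """Return IDs of documents containing the keyword, optionally filtered by category."""
    q = query.lower()
    return [
        doc_id
        for doc_id, doc in DOCUMENTS.items()
        if q in doc["text"].lower() and (category is None or doc["category"] == category)
    ]

print(keyword_search("cached"))  # ['kb-2']
print(get_by_id("kb-1"))         # served from cache after the first call
```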
Shopify MCP Server by CData
Shopify MCP Server by CData
AWS Amplify Gen 2 Documentation MCP Server
This MCP server provides tools to access AWS Amplify Gen 2 documentation and search for content. (not official)
Advanced MCP Server
A comprehensive Model Context Protocol server providing capabilities for web scraping, data analysis, system monitoring, file operations, API integrations, and report generation.
Sally MCP Server
Enables MCP clients to interact with Sally AI assistant using x402 blockchain-based micropayments for secure, transparent transactions.
MCP Trino Server
A Model Context Protocol server that provides seamless integration with Trino and Iceberg, enabling data exploration, querying, and table maintenance through a standard interface.
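Under the hood, querying Trino from Python is typically a few lines with the trino client; a hedged sketch under that assumption (connection parameters are placeholders, and the MCP server's own interface may differ):

```python
import trino  # assumes the `trino` Python client is installed

# Hypothetical connection details; adjust host, port, user, catalog, and schema.
conn = trino.dbapi.connect(
    host="localhost",
    port=8080,
    user="analyst",
    catalog="iceberg",
    schema="default",
)

cur = conn.cursor()
cur.execute("SHOW TABLES")  # simple data-exploration query
for (table_name,) in cur.fetchall():
    print(table_name)
```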
browser-mcp
An MCP server that lets AI assistants interact with the browser, including fetching page content as Markdown, modifying page styles, and searching the browser history.
CourtListener Legal Research MCP Server
Enables legal research across 3,352 U.S. courts using the CourtListener API, providing access to case search, precedent analysis, judge patterns, citation validation, and federal PACER dockets through natural language queries.
MCP with RAG Demo
This demo project shows how to implement a Model Context Protocol (MCP) server with Retrieval-Augmented Generation (RAG) capabilities. The demo lets AI models interact with a knowledge base, search for information, and add new documents.
Poke-MCP
A Model Context Protocol server that provides comprehensive Pokemon data and battle simulation capabilities to AI assistants. It enables users to access detailed stats, types, and moves while simulating battles with realistic mechanics like type effectiveness and status effects.
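To illustrate the kind of mechanic the battle simulator models, here is a toy type-effectiveness lookup (a tiny illustrative subset, not the server's actual data or logic):

```python
# Tiny illustrative subset of a type-effectiveness chart; not the server's actual data.
EFFECTIVENESS = {
    ("water", "fire"): 2.0,
    ("fire", "water"): 0.5,
    ("electric", "ground"): 0.0,
}

def damage_multiplier(move_type: str, defender_type: str) -> float:
    """Return the damage multiplier for a move against a defender (1.0 if neutral)."""
    return EFFECTIVENESS.get((move_type, defender_type), 1.0)

print(damage_multiplier("water", "fire"))       # 2.0 (super effective)
print(damage_multiplier("electric", "ground"))  # 0.0 (no effect)
```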
macOS Tools MCP Server
Provides read-only access to native macOS system utilities including disk management, battery status, network configuration, and system profiling through terminal commands. Enables users to retrieve system information and diagnostics from macOS machines via standardized MCP tools.
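The read-only pattern described here is essentially a wrapper around standard macOS command-line utilities; a minimal sketch (the commands shown are standard macOS tools, but the server's actual tool set may differ):

```python
import subprocess

def run(cmd: list[str]) -> str:
    """Run a read-only macOS command and return its output."""
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

if __name__ == "__main__":
    print(run(["pmset", "-g", "batt"]))                    # battery status
    print(run(["diskutil", "list"]))                       # disk layout
    print(run(["system_profiler", "SPHardwareDataType"]))  # hardware profile
```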
Datastream MCP Server
A Multi-Agent Conversation Protocol server that enables interaction with Google Cloud Datastream API for managing data replication services between various source and destination systems through natural language commands.
MCP Sample Server
A simple Model Context Protocol server providing basic utility tools including timezone-aware time retrieval and basic arithmetic calculations (add, subtract, multiply, divide).
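A server like this is only a few lines with the official Python MCP SDK's FastMCP helper; a hedged sketch of the two tool types described (assuming the `mcp` package's FastMCP interface; the sample server's own code may be organized differently):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

from mcp.server.fastmcp import FastMCP  # assumes the official `mcp` Python SDK

mcp = FastMCP("sample-server")

@mcp.tool()
def current_time(timezone: str = "UTC") -> str:
    """Return the current time in the given IANA timezone."""
    return datetime.now(ZoneInfo(timezone)).isoformat()

@mcp.tool()
def divide(a: float, b: float) -> float:
    """Divide a by b, raising a clear error on division by zero."""
    if b == 0:
        raise ValueError("Cannot divide by zero")
    return a / b

if __name__ == "__main__":
    mcp.run()  # serves the tools over stdio by default
```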
WordPress MCP Server
Enables AI models like Claude to manage WordPress sites through the WordPress REST API, supporting operations like post creation, taxonomy management, and site configuration. It features secure authentication via Application Passwords and provides tools for comprehensive content administration.
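For context, the Application Password flow it relies on is plain HTTP basic auth against the WordPress REST API; a hedged example of creating a post with requests (the site URL and credentials are placeholders, and the server's own tooling goes well beyond this):

```python
import requests
from requests.auth import HTTPBasicAuth

# Placeholder site URL, username, and Application Password.
SITE = "https://example.com"
AUTH = HTTPBasicAuth("editor", "abcd efgh ijkl mnop qrst uvwx")

resp = requests.post(
    f"{SITE}/wp-json/wp/v2/posts",
    auth=AUTH,
    json={"title": "Hello from MCP", "content": "Posted via the REST API.", "status": "draft"},
    timeout=30,
)
resp.raise_for_status()
post = resp.json()
print(post["id"], post["link"])
```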
Web Crawler MCP Server
An intelligent web crawling server that uses Cloudflare's headless browser to render dynamic pages and Workers AI to extract relevant links based on natural language queries. It enables AI assistants to search and filter website content while providing secure access through GitHub OAuth authentication.
Universal Crypto MCP
Enables AI agents to interact with any EVM-compatible blockchain through natural language, supporting token swaps, cross-chain bridges, staking, lending, governance, gas optimization, and portfolio tracking across networks like Ethereum, BSC, Polygon, Arbitrum, and more.
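As a point of reference, a basic EVM read such as a native-token balance check looks like this with web3.py (the RPC URL and address are placeholders; the server's tooling covers far more than this):

```python
from web3 import Web3  # assumes the web3.py library

# Placeholder RPC endpoint and address; the same call works on any EVM-compatible chain.
w3 = Web3(Web3.HTTPProvider("https://your-rpc-endpoint.example"))
address = Web3.to_checksum_address("0x0000000000000000000000000000000000000000")

balance_wei = w3.eth.get_balance(address)
print(f"Balance: {Web3.from_wei(balance_wei, 'ether')} ETH")
```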
MCP-Pushover Bridge
Enables AI assistants to send push notifications to mobile devices via Pushover, allowing users to receive instant alerts for task completions, errors, reminders, and custom messages through their AI conversations.
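The underlying Pushover call is a single POST to its messages endpoint; a hedged sketch with requests (the application token and user key are placeholders):

```python
import requests

def send_push(token: str, user_key: str, message: str, title: str = "MCP alert") -> None:
    """Send a push notification via the Pushover API."""
    resp = requests.post(
        "https://api.pushover.net/1/messages.json",
        data={"token": token, "user": user_key, "title": title, "message": message},
        timeout=30,
    )
    resp.raise_for_status()

# Example with hypothetical credentials:
# send_push("app-token", "user-key", "Long-running task finished.")
```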
MCP Unity Bridge Asset
An asset to be imported into Unity that hosts a WebSocket server for MCP communication with LLMs.
url-download-mcp
A Model Context Protocol (MCP) server that enables AI assistants to download files from URLs to the local filesystem.
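A download tool of this kind usually reduces to a streamed HTTP GET; a minimal sketch with requests (file naming and destination handling are simplified, and not taken from this project's code):

```python
import os
import requests

def download(url: str, dest_dir: str = ".") -> str:
    """Stream a file from a URL to the local filesystem and return its path."""
    filename = os.path.basename(url.split("?")[0]) or "download.bin"
    path = os.path.join(dest_dir, filename)
    with requests.get(url, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(path, "wb") as f:
            for chunk in resp.iter_content(chunk_size=8192):
                f.write(chunk)
    return path

# print(download("https://example.com/file.zip", "/tmp"))
```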
Black Orchid
A hot-reloadable MCP proxy server that enables users to create and manage custom Python tools through dynamic module loading. Users can build their own utilities, wrap APIs, and extend functionality by simply adding Python files to designated folders.
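The dynamic-loading pattern it describes can be sketched with importlib: scan a folder for Python files and load each one as a module (a simplified illustration, not the project's actual loader):

```python
import importlib.util
import pathlib

def load_tools(folder: str) -> dict[str, object]:
    """Load every .py file in a folder as a module, keyed by module name."""
    modules = {}
    for path in pathlib.Path(folder).glob("*.py"):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)  # executes the file's current code
        modules[path.stem] = module
    return modules

# Calling load_tools again re-reads the files, picking up edits (a crude "hot reload").
# tools = load_tools("./my_tools")
# tools["greeter"].hello()  # hypothetical function defined in my_tools/greeter.py
```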
AutoDev MCP Examples
Here are some examples of MCPs (here meaning "Minimum Complete Product") for AutoDev, where "AutoDev" refers to a system or tool that automates aspects of software development.

**Focusing on Code Generation:**

* **MCP 1: Basic CRUD Generation:** AutoDev can generate basic Create, Read, Update, and Delete (CRUD) operations for a single, simple data model (e.g., a "Task" object with a title and description), including the database schema, API endpoints, and basic UI forms. A minimal sketch of such generated code follows this list.
* **MCP 2: CRUD with Validation:** Builds on MCP 1 by adding basic input validation to the generated CRUD operations (e.g., required fields, data type validation).
* **MCP 3: Code Generation from UML:** AutoDev can generate code (e.g., Python classes) from a simple UML class diagram.

**Focusing on Automated Testing:**

* **MCP 1: Unit Test Generation:** AutoDev can generate basic unit tests for a given function or class, covering basic positive and negative cases.
* **MCP 2: Test Execution and Reporting:** AutoDev can automatically execute the generated unit tests and provide a simple pass/fail report of the results.

**Focusing on Deployment:**

* **MCP 1: Simple Deployment to a Single Environment:** AutoDev can automatically deploy a simple application (e.g., a "Hello World" web app) to a single environment (e.g., a local Docker container).
* **MCP 2: Deployment with Basic Configuration:** Builds on MCP 1 by allowing basic configuration of the deployment (e.g., setting environment variables).

**General AutoDev Features:**

* **MCP 1: Project Setup:** AutoDev can automatically set up a basic project structure for a given programming language and framework (e.g., creating directories, setting up a virtual environment).
* **MCP 2: Dependency Management:** AutoDev can automatically manage project dependencies (e.g., installing packages from a package manager).

**Key Considerations for Defining MCPs:**

* **Focus on a specific, valuable task:** Each MCP should address a clear need in the software development process.
* **Keep it simple:** The MCP should be the *minimum* functionality required to achieve the task.
* **Measurable success:** It should be easy to determine whether the MCP is working correctly.
* **Iterative development:** Each MCP should build on previous ones.

Tailor these examples to the specific goals and capabilities of your AutoDev system.
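As a concrete illustration of "MCP 1: Basic CRUD Generation" above, the generated output could be as small as an in-memory CRUD module for the Task model (a hypothetical sketch, not output from any real AutoDev run):

```python
from dataclasses import dataclass
from itertools import count

@dataclass
class Task:
    id: int
    title: str
    description: str = ""

_tasks: dict[int, Task] = {}   # in-memory store standing in for a database table
_ids = count(1)

def create_task(title: str, description: str = "") -> Task:
    task = Task(next(_ids), title, description)
    _tasks[task.id] = task
    return task

def read_task(task_id: int) -> Task | None:
    return _tasks.get(task_id)

def update_task(task_id: int, **fields) -> Task | None:
    task = _tasks.get(task_id)
    if task:
        for key, value in fields.items():
            setattr(task, key, value)
    return task

def delete_task(task_id: int) -> bool:
    return _tasks.pop(task_id, None) is not None

t = create_task("Write docs", "Cover the CRUD endpoints")
update_task(t.id, description="Cover all endpoints")
print(read_task(t.id))
delete_task(t.id)
```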