Discover Awesome MCP Servers
Extend your agent with 26,519 capabilities via MCP servers.
- All (26,519)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
Database MCP Server
Provides universal database operations for AI assistants through MCP, supporting 40+ databases including PostgreSQL, MySQL, MongoDB, Redis, and SQLite with built-in introspection tools for schema exploration.
Screen Agent
A Windows desktop automation MCP server that enables UI recognition through OCR, UIA controls, and multi-point color matching. It allows agents to interact with desktop applications via actions like clicking and typing while using a learning system to track and improve operation success.
browser-mcp
An MCP server that lets AI assistants interact with the browser, including fetching page content as Markdown, modifying page styles, and searching the browser history.
agentfolio-mcp-server
MCP server for AgentFolio — the identity and reputation layer for AI agents. Query agent profiles, trust scores, verification status, and marketplace listings through 8 MCP tools.
Medicare MCP Server
Provides comprehensive access to CMS Medicare data including physician services, prescriber information, hospital quality metrics, drug spending, formulary coverage, and ASP pricing for healthcare analysis and decision-making.
Google Slides MCP Server
Enables interaction with Google Slides presentations through OAuth2 authentication. Supports creating new slides, adding rectangles, and managing presentation content through natural language commands.
macOS Tools MCP Server
Provides read-only access to native macOS system utilities including disk management, battery status, network configuration, and system profiling through terminal commands. Enables users to retrieve system information and diagnostics from macOS machines via standardized MCP tools.
Docker MCP Server
Enables AI assistants to interact with Docker containers through safe, permission-controlled access to inspect, manage, and diagnose containers, images, and compose services with built-in timeouts and AI-powered analysis.
Datastream MCP Server
A Model Context Protocol server that enables interaction with the Google Cloud Datastream API for managing data replication services between various source and destination systems through natural language commands.
AutoSOC Agent
An automated security operations center MCP server that uses LLMs and network analysis tools like Tshark to detect threats in traffic data. It enables users to automatically ingest PCAP files, query specific packets, and generate intelligent security analysis reports.
MCP Sample Server
A simple Model Context Protocol server providing basic utility tools including timezone-aware time retrieval and basic arithmetic calculations (add, subtract, multiply, divide).
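The tool logic of a sample server like the one above can be sketched as plain Python functions; the names `current_time` and `calculate` are illustrative assumptions, and registering them with an MCP SDK is omitted here:

```python
from datetime import datetime
from zoneinfo import ZoneInfo


def current_time(tz: str = "UTC") -> str:
    """Timezone-aware time retrieval: ISO 8601 timestamp in the given IANA zone."""
    return datetime.now(ZoneInfo(tz)).isoformat()


def calculate(op: str, a: float, b: float) -> float:
    """Basic arithmetic tool supporting add, subtract, multiply, and divide."""
    if op == "add":
        return a + b
    if op == "subtract":
        return a - b
    if op == "multiply":
        return a * b
    if op == "divide":
        if b == 0:
            raise ValueError("division by zero")
        return a / b
    raise ValueError(f"unknown operation: {op}")
```

In a real server these functions would be exposed as MCP tools, with the docstrings serving as the tool descriptions shown to the agent.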
Openfort MCP Server
Enables AI assistants to interact with Openfort's wallet infrastructure, allowing them to create projects, manage configurations, generate wallets and users, and query documentation through 42 integrated tools.
Pagila MCP
A read-only Model Context Protocol server developed with FastMCP for querying the Pagila PostgreSQL database. It enables secure access to movie rental data including films, actors, and customer information through natural language queries.
Aws Sample Gen Ai Mcp Server
```python
import boto3
import json
import os

# --- Configuration ---
MODEL_ID = "anthropic.claude-v2"  # Or another supported model
ACCEPT = "application/json"
CONTENT_TYPE = "application/json"
# Replace with your MCP server endpoint if needed
MCP_SERVER_ENDPOINT = os.environ.get("MCP_SERVER_ENDPOINT", "http://localhost:8080")

# --- Helper Functions ---
def invoke_model(prompt, max_tokens=200, temperature=0.5, top_p=0.9):
    """
    Invokes the Bedrock model through the MCP server.

    Args:
        prompt (str): The prompt to send to the model.
        max_tokens (int): The maximum number of tokens to generate.
        temperature (float): Controls the randomness of the output.
        top_p (float): Controls the diversity of the output.

    Returns:
        str: The generated text from the model, or None if an error occurred.
    """
    try:
        # Construct the request body
        body = json.dumps({
            "prompt": prompt,
            "max_tokens_to_sample": max_tokens,
            "temperature": temperature,
            "top_p": top_p,
            "modelId": MODEL_ID,  # Include modelId for MCP server routing
            "accept": ACCEPT,
            "contentType": CONTENT_TYPE
        })

        # Point boto3 at the MCP server instead of the real Bedrock endpoint
        bedrock = boto3.client(
            'bedrock-runtime',
            endpoint_url=MCP_SERVER_ENDPOINT,
            region_name="us-east-1"  # Required by boto3; ignored by the custom endpoint
        )
        response = bedrock.invoke_model(
            modelId=MODEL_ID,
            contentType=CONTENT_TYPE,
            accept=ACCEPT,
            body=body
        )
        response_body = json.loads(response.get('body').read())
        return response_body.get('completion')  # Adjust for your model's response format
    except Exception as e:
        print(f"Error invoking model: {e}")
        return None

# --- Main Execution ---
if __name__ == "__main__":
    prompt = "Write a short poem about the ocean."
    generated_text = invoke_model(prompt)
    if generated_text:
        print("Generated Text:")
        print(generated_text)
    else:
        print("Failed to generate text.")
```

Key points:

* **MCP server endpoint:** The code reads `MCP_SERVER_ENDPOINT` from an environment variable, defaulting to `http://localhost:8080`. You must change this to the address where your MCP server is actually running; an environment variable keeps the endpoint out of the code.
* **`modelId` in the request body:** Included so the MCP server can route the request to the appropriate model.
* **`endpoint_url` on the `boto3` client:** This directs requests to your MCP server instead of the real AWS Bedrock service. `region_name` is still required when creating the client, but its value does not matter with a custom endpoint; `"us-east-1"` is used as a common default.
* **Response parsing:** `response_body.get('completion')` assumes a Claude-style response with a `completion` field. Check the documentation for the specific model you're using and adjust accordingly.
* **Error handling:** The `try...except` block catches errors during model invocation, which helps with debugging.

How to run this code:

1. Install boto3: `pip install boto3`
2. Set the MCP server endpoint. On Linux/macOS:
   ```bash
   export MCP_SERVER_ENDPOINT="http://your-mcp-server:8080"
   ```
   On Windows:
   ```powershell
   $env:MCP_SERVER_ENDPOINT="http://your-mcp-server:8080"
   ```
   Replace `http://your-mcp-server:8080` with the actual address.
3. Run the script: `python your_script_name.py`

Important considerations:

* **MCP server setup:** This code assumes you already have a working MCP server configured to route requests to Bedrock; setting that up is a separate process.
* **Authentication:** If your MCP server requires authentication, you'll need to supply credentials or custom headers with the `invoke_model` call. The details depend on how your MCP server is configured.
* **Model-specific parameters:** `max_tokens`, `temperature`, and `top_p` are common to many language models, but the available parameters and their meanings vary; consult your model's documentation.
* **Error handling:** The error handling here is basic. Production code should catch specific exceptions and produce more informative messages.
* **Security:** Be careful about exposing your MCP server to the public internet. In a production environment, secure it with appropriate authentication and authorization mechanisms.
PDFSizeAnalyzer-MCP
Enables comprehensive PDF analysis and manipulation including page size analysis, chapter extraction, splitting, compression, merging, and conversion to images. Provides both MCP server interface for AI assistants and Streamlit web interface for direct user interaction.
mcp-altegio
MCP server for Altegio API — appointments, clients, services, staff schedules
database-updater MCP Server
Mirror of
Berghain Events MCP Server
A server that allows AI agents to query and retrieve information about upcoming events at Berghain nightclub through a DynamoDB-backed FastAPI service.
GitHub Integration Hub
Enables AI agents to interact with GitHub through OAuth-authenticated operations including starting authorization flows, listing repositories, and creating issues using stored access tokens.
Universal Crypto MCP
Enables AI agents to interact with any EVM-compatible blockchain through natural language, supporting token swaps, cross-chain bridges, staking, lending, governance, gas optimization, and portfolio tracking across networks like Ethereum, BSC, Polygon, Arbitrum, and more.
MCP Unity Bridge Asset
A Unity asset that hosts a WebSocket server for MCP communication with LLMs.
Obsidian Todos MCP Server
Enables AI assistants to manage tasks within an Obsidian vault by listing, adding, and updating todos via the Local REST API. It allows users to create new todos in daily notes and retrieve task statistics through natural language.
DOMShell
MCP server that turns your browser into a filesystem. 38 tools let AI agents ls, cd, grep, click, and type through Chrome via the DOMShell extension.
ncbi-mcp
MCP server for the NIH's National Center for Biotechnology Information (NCBI).
url-download-mcp
A Model Context Protocol (MCP) server that enables AI assistants to download files from URLs to the local filesystem.
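The core operation of a server like the one above can be sketched with only the Python standard library; the helper names below are illustrative assumptions, and a real server would expose `download` as an MCP tool:

```python
import urllib.request
from pathlib import Path
from urllib.parse import urlparse


def filename_from_url(url: str) -> str:
    """Derive a local filename from the URL path, with a fallback for bare hosts."""
    name = Path(urlparse(url).path).name
    return name or "download.bin"


def download(url: str, dest_dir: str = ".") -> Path:
    """Fetch the URL and write its bytes under dest_dir, returning the saved path."""
    dest = Path(dest_dir) / filename_from_url(url)
    dest.parent.mkdir(parents=True, exist_ok=True)
    with urllib.request.urlopen(url) as resp:
        dest.write_bytes(resp.read())
    return dest
```

Keeping filename derivation separate from the network fetch makes the path logic easy to validate before any bytes hit the local filesystem.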
Delphi Build Server
Enables building and cleaning Delphi projects (.dproj/.groupproj) on Windows using MSBuild with RAD Studio environment initialization. Supports both individual projects and group projects with configurable build configurations and platforms.
Black Orchid
A hot-reloadable MCP proxy server that enables users to create and manage custom Python tools through dynamic module loading. Users can build their own utilities, wrap APIs, and extend functionality by simply adding Python files to designated folders.
Web Crawler MCP Server
An intelligent web crawling server that uses Cloudflare's headless browser to render dynamic pages and Workers AI to extract relevant links based on natural language queries. It enables AI assistants to search and filter website content while providing secure access through GitHub OAuth authentication.
Google Tag Manager MCP Server
Integrates Google Tag Manager with Claude to automate the creation and management of tags, triggers, and variables using natural language prompts. It provides specialized tools for GA4 and Facebook Pixel setup, along with automated tracking workflows for ecommerce and lead generation sites.
Advanced MCP Server
A comprehensive Model Context Protocol server providing capabilities for web scraping, data analysis, system monitoring, file operations, API integrations, and report generation.