Discover Awesome MCP Servers

Extend your agent with 27,188 capabilities via MCP servers.

Bilibili MCP

Enables searching for Bilibili videos through a standardized MCP interface, returning video information including title, author, view count, and duration with pagination support.

GitHub MCP Server

Exposes GitHub repository actions (listing PRs/issues, creating issues, merging PRs) as OpenAPI endpoints using FastAPI, designed for LLM agent orchestration frameworks.

Amazon MCP Server

Enables scraping Amazon product details and searching for products on Amazon through natural language queries. No API keys required as it scrapes publicly available Amazon pages.

Hedera MCP Server

A Model Context Protocol server that enables interactions with the Hedera network, providing tools for wallet creation, balance checking, transaction building, and sending signed transactions.

Cloudflare Playwright MCP

Enables AI assistants to perform web automation tasks such as navigation, typing, clicking, and taking screenshots using Playwright on Cloudflare Workers. This server allows LLMs to interact with and control a browser across platforms like Claude Desktop and GitHub Copilot.

Security MCP Server

Enables security scanning of codebases through integrated tools for secret detection, SCA, SAST, and DAST vulnerabilities, with AI-powered remediation suggestions based on findings.
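
The secret-detection piece of a scanner like this can be sketched with nothing but regular expressions. A minimal, hypothetical sketch (the rule names and patterns below are illustrative; a real scanner ships far larger curated rule sets plus entropy checks):

```python
import re

# Illustrative rules only; not the server's actual rule set.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_api_key": re.compile(r"(?i)api[_-]?key\s*[:=]\s*['\"][A-Za-z0-9]{16,}['\"]"),
}

def scan_text(text: str) -> list[dict]:
    """Return one finding per rule match, tagged with its line number."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for rule, pattern in SECRET_PATTERNS.items():
            if pattern.search(line):
                findings.append({"rule": rule, "line": lineno})
    return findings
```

An MCP tool wrapping this would walk the codebase, run the scan per file, and hand the findings to the model for remediation suggestions.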

macOS Tools MCP Server

Provides read-only access to native macOS system utilities including disk management, battery status, network configuration, and system profiling through terminal commands. Enables users to retrieve system information and diagnostics from macOS machines via standardized MCP tools.
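
A server like this reduces to a whitelist of read-only shell commands plus a guard that refuses everything else. A minimal sketch under that assumption (the tool table below is hypothetical and macOS-specific; the real server's tool set may differ):

```python
import subprocess

# Hypothetical read-only diagnostics mapped to fixed argument lists.
READ_ONLY_COMMANDS = {
    "battery": ["pmset", "-g", "batt"],
    "disks": ["diskutil", "list"],
    "network": ["networksetup", "-listallhardwareports"],
}

def run_diagnostic(name: str) -> str:
    """Run a whitelisted read-only command; reject anything else."""
    if name not in READ_ONLY_COMMANDS:
        raise ValueError(f"unknown or non-read-only tool: {name}")
    result = subprocess.run(READ_ONLY_COMMANDS[name],
                            capture_output=True, text=True, timeout=10)
    return result.stdout
```

Because every tool maps to a fixed argument list and no shell is involved, a caller cannot smuggle in a write operation through the tool name.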

Docker MCP Server

Enables AI assistants to interact with Docker containers through safe, permission-controlled access to inspect, manage, and diagnose containers, images, and compose services with built-in timeouts and AI-powered analysis.

Datastream MCP Server

A Model Context Protocol server that enables interaction with the Google Cloud Datastream API for managing data replication between various source and destination systems through natural language commands.

AutoSOC Agent

An automated security operations center MCP server that uses LLMs and network analysis tools like Tshark to detect threats in traffic data. It enables users to automatically ingest PCAP files, query specific packets, and generate intelligent security analysis reports.

MCP Sample Server

A simple Model Context Protocol server providing basic utility tools including timezone-aware time retrieval and basic arithmetic calculations (add, subtract, multiply, divide).
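
The tool bodies of a sample server like this fit in a few lines of standard-library Python. A sketch of the two tools as plain functions (the MCP registration layer that an SDK would provide is omitted, and the function names are illustrative):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def current_time(timezone: str = "UTC") -> str:
    """Timezone-aware time retrieval, e.g. current_time("Asia/Tokyo")."""
    return datetime.now(ZoneInfo(timezone)).isoformat()

def calculate(op: str, a: float, b: float) -> float:
    """Basic arithmetic: add, subtract, multiply, divide."""
    ops = {
        "add": lambda: a + b,
        "subtract": lambda: a - b,
        "multiply": lambda: a * b,
        "divide": lambda: a / b,
    }
    if op not in ops:
        raise ValueError(f"unsupported operation: {op}")
    return ops[op]()
```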

Web Crawler MCP Server

An intelligent web crawling server that uses Cloudflare's headless browser to render dynamic pages and Workers AI to extract relevant links based on natural language queries. It enables AI assistants to search and filter website content while providing secure access through GitHub OAuth authentication.

Google Tag Manager MCP Server

Integrates Google Tag Manager with Claude to automate the creation and management of tags, triggers, and variables using natural language prompts. It provides specialized tools for GA4 and Facebook Pixel setup, along with automated tracking workflows for ecommerce and lead generation sites.

Advanced MCP Server

A comprehensive Model Context Protocol server providing capabilities for web scraping, data analysis, system monitoring, file operations, API integrations, and report generation.

Openfort MCP Server

Enables AI assistants to interact with Openfort's wallet infrastructure, allowing them to create projects, manage configurations, generate wallets and users, and query documentation through 42 integrated tools.

面试鸭 MCP Server

An MCP server for 面试鸭 (Interview Duck) question search, built on Spring AI, enabling AI to quickly look up real corporate interview questions and answers.

Pagila MCP

A read-only Model Context Protocol server developed with FastMCP for querying the Pagila PostgreSQL database. It enables secure access to movie rental data including films, actors, and customer information through natural language queries.

literature-agent-mcp

Exposes a local biomedical literature pipeline as MCP tools for automated research workflows. Enables literature search, open-access paper retrieval, and draft generation for biomedical and pathology domains through standard MCP clients.

Aws Sample Gen Ai Mcp Server

A sample project showing how to invoke an Amazon Bedrock model (here `anthropic.claude-v2`) through an MCP server endpoint using `boto3`. The client script below assumes an MCP server is already running and configured to route requests to Bedrock; set `MCP_SERVER_ENDPOINT` to your server's actual address before running it.

```python
import boto3
import json
import os

# --- Configuration ---
MODEL_ID = "anthropic.claude-v2"  # or another supported model
ACCEPT = "application/json"
CONTENT_TYPE = "application/json"
# Read the MCP server endpoint from the environment; replace the
# localhost default with your actual server address.
MCP_SERVER_ENDPOINT = os.environ.get("MCP_SERVER_ENDPOINT", "http://localhost:8080")

def invoke_model(prompt, max_tokens=200, temperature=0.5, top_p=0.9):
    """Invoke the Bedrock model through the MCP server.

    Returns the generated text, or None if an error occurred.
    """
    try:
        body = json.dumps({
            "prompt": prompt,
            "max_tokens_to_sample": max_tokens,
            "temperature": temperature,
            "top_p": top_p,
            "modelId": MODEL_ID,  # lets the MCP server route the request
            "accept": ACCEPT,
            "contentType": CONTENT_TYPE,
        })
        # Point boto3 at the MCP server instead of the real Bedrock
        # service. region_name is required by boto3 but its value is
        # irrelevant when a custom endpoint_url is used.
        bedrock = boto3.client("bedrock-runtime",
                               endpoint_url=MCP_SERVER_ENDPOINT,
                               region_name="us-east-1")
        response = bedrock.invoke_model(
            modelId=MODEL_ID,
            contentType=CONTENT_TYPE,
            accept=ACCEPT,
            body=body,
        )
        response_body = json.loads(response.get("body").read())
        # Adjust the key below to match your model's response format.
        return response_body.get("completion")
    except Exception as e:
        print(f"Error invoking model: {e}")
        return None

if __name__ == "__main__":
    generated_text = invoke_model("Write a short poem about the ocean.")
    if generated_text:
        print("Generated Text:")
        print(generated_text)
    else:
        print("Failed to generate text.")
```

To run it: install the dependency with `pip install boto3`, set the endpoint with `export MCP_SERVER_ENDPOINT="http://your-mcp-server:8080"` (PowerShell: `$env:MCP_SERVER_ENDPOINT="http://your-mcp-server:8080"`), then run `python your_script_name.py`.

A few caveats: the script assumes the generated text arrives in a `completion` field, which varies by model; if the MCP server requires authentication you must supply credentials or headers yourself; the `max_tokens`, `temperature`, and `top_p` parameters are common but model-specific, so consult your model's documentation; and a server exposed beyond localhost should be protected with proper authentication and authorization.

OfficeRnD MCP Server

A read-only MCP server that connects AI assistants to the OfficeRnD coworking and flex-space management platform. It enables natural language queries for community members, space bookings, billing records, and office resources.

PDFSizeAnalyzer-MCP

Enables comprehensive PDF analysis and manipulation including page size analysis, chapter extraction, splitting, compression, merging, and conversion to images. Provides both MCP server interface for AI assistants and Streamlit web interface for direct user interaction.

mcp-altegio

MCP server for the Altegio API: appointments, clients, services, and staff schedules.

database-updater MCP Server

Mirror of

Spotify MCP Server

Enables interaction with Spotify through natural language for music discovery, playback control, library management, and playlist creation. Supports searching for music, controlling playback, managing saved tracks, and getting personalized recommendations based on mood and preferences.

PitchLink MCP

An MCP server that reads startup pitch drafts from Notion to provide comprehensive investor-style analysis and scoring. It evaluates key areas like market opportunity and team strength, delivering feedback through a visual dashboard.

Zen MCP Server

Orchestrates multiple AI models (Gemini, OpenAI, Claude, local models) within a single conversation context, enabling collaborative workflows like multi-model code reviews, consensus building, and CLI-to-CLI bridging for specialized tasks.

Somnia MCP Server

Enables interaction with Somnia blockchain data, providing tools to retrieve block information, token balances, transaction history, and NFT metadata through the ORMI API.

nix-mcp-servers

Nix package repository for MCP (Model Context Protocol) servers.

imagic-mcp

MCP server for image conversion, resizing, and merging; runs locally with no uploads.

Grist MCP Server

Enables interaction with Grist documents, workspaces, and records via the Model Context Protocol. It supports comprehensive operations including SQL querying, schema management, and record CRUD functionality.