Discover Awesome MCP Servers

Extend your agent with 27,150 capabilities via MCP servers.

literature-agent-mcp

Exposes a local biomedical literature pipeline as MCP tools for automated research workflows. Enables literature search, open-access paper retrieval, and draft generation for biomedical and pathology domains through standard MCP clients.

Somnia MCP Server

Enables interaction with Somnia blockchain data, providing tools to retrieve block information, token balances, transaction history, and NFT metadata through the ORMI API.

Aws Sample Gen Ai Mcp Server

```python
import json

import boto3
import requests

# Configuration
REGION_NAME = 'your-aws-region'  # e.g., 'us-east-1'
MODEL_ID = 'anthropic.claude-v2'  # Or any other Bedrock model ID
ACCEPT = 'application/json'
CONTENT_TYPE = 'application/json'
MCP_SERVER_ENDPOINT = 'your_mcp_server_endpoint'  # e.g., 'http://localhost:8080/predictions/bedrock'

# Initialize the Bedrock client (only needed for direct comparison)
bedrock = boto3.client(service_name='bedrock-runtime', region_name=REGION_NAME)


def invoke_model_mcp(prompt, max_tokens=200, temperature=0.5, top_p=0.9):
    """Invoke the Bedrock model through the MCP server.

    Args:
        prompt (str): The prompt to send to the model.
        max_tokens (int): The maximum number of tokens to generate.
        temperature (float): The sampling temperature.
        top_p (float): The top_p value for nucleus sampling.

    Returns:
        str: The generated text, or None if an error occurred.
    """
    payload = {
        "modelId": MODEL_ID,
        "contentType": CONTENT_TYPE,
        "accept": ACCEPT,
        "body": json.dumps({
            "prompt": prompt,
            "max_tokens_to_sample": max_tokens,
            "temperature": temperature,
            "top_p": top_p,
        }),
    }
    response_body = None
    try:
        response = requests.post(MCP_SERVER_ENDPOINT, json=payload)
        response.raise_for_status()  # Raise HTTPError for 4xx/5xx responses
        response_body = response.json()
        # Extract the generated text from the response
        return response_body['completion']
    except requests.exceptions.RequestException as e:
        print(f"Error invoking MCP server: {e}")
        return None
    except KeyError as e:
        print(f"Error parsing MCP server response: missing key {e}")
        print(f"Response body: {response_body}")  # Full response for debugging
        return None
    except Exception as e:
        print(f"An unexpected error occurred: {e}")
        return None


def invoke_model_bedrock(prompt, max_tokens=200, temperature=0.5, top_p=0.9):
    """Invoke the Bedrock model directly (for comparison only).

    Requires IAM permissions to access Bedrock.
    """
    body = json.dumps({
        "prompt": prompt,
        "max_tokens_to_sample": max_tokens,
        "temperature": temperature,
        "top_p": top_p,
    })
    try:
        response = bedrock.invoke_model(
            modelId=MODEL_ID, contentType=CONTENT_TYPE, accept=ACCEPT, body=body
        )
        response_body = json.loads(response['body'].read().decode('utf-8'))
        return response_body['completion']
    except Exception as e:
        print(f"Error invoking Bedrock directly: {e}")
        return None


if __name__ == "__main__":
    prompt = "Write a short story about a cat who goes on an adventure."
    generated_text = invoke_model_mcp(prompt)
    if generated_text:
        print("Generated Text (via MCP Server):")
        print(generated_text)
    else:
        print("Failed to generate text via MCP server.")
```

Key points:

* **Separation of concerns:** `invoke_model_mcp` calls the model through the MCP server; the optional `invoke_model_bedrock` calls Bedrock directly, for comparison only.
* **MCP server invocation:** `invoke_model_mcp` sends a POST request with the `requests` library, building the payload the MCP server expects: model ID, content type, accept type, and a JSON body with the prompt and sampling parameters.
* **Error handling:** `try...except` blocks catch `requests.exceptions.RequestException` for network errors, `KeyError` for a missing `completion` key in the response, and a general `Exception` for anything unexpected. `response.raise_for_status()` surfaces HTTP 4xx/5xx errors, and error messages (including the full response body) are printed to aid debugging.
* **JSON handling:** the `json` library serializes the request body and deserializes the response, so data is properly formatted for both the MCP server and Bedrock.
* **Configuration:** the AWS region, model ID, content/accept types, and MCP server endpoint are variables at the top. **You MUST replace the placeholder values with your actual values.**
* **Response parsing:** the MCP server's response is assumed to be a JSON object whose `completion` key holds the generated text; a missing key is caught and reported.
* **Dependencies:** the `requests` library is required for HTTP calls to the MCP server (`pip install requests`).
* **Security:** this example assumes the MCP server runs in a trusted environment; in production, use HTTPS and proper authentication and authorization.

To use this code:

1. **Install `requests`:** `pip install requests`
2. **Configure AWS credentials:** use one of the methods in the AWS documentation (environment variables, IAM roles, etc.).
3. **Replace placeholders:** set `REGION_NAME`, `MODEL_ID`, and `MCP_SERVER_ENDPOINT` to your actual values.
4. **Run the script:** it sends the prompt to the MCP server, receives the generated text, and prints it to the console.

Important considerations:

* **MCP server setup:** the code assumes you already have an MCP server configured to forward requests to Bedrock; its exact configuration depends on your requirements, so consult your server implementation's documentation.
* **IAM permissions:** direct invocation via `invoke_model_bedrock` requires permission to access Bedrock and the specific model; the MCP server itself also needs appropriate IAM permissions to reach Bedrock.
* **Model ID:** make sure `MODEL_ID` is a valid Bedrock model ID; the Bedrock documentation lists the available models.
* **Error handling:** for production, consider more sophisticated handling, such as retrying failed requests or logging errors to a file.
* **Cost:** Bedrock usage incurs charges; monitor usage and set up cost alerts to avoid surprises.

Adapt the code to your specific environment and requirements.
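The request/response contract the example relies on (POST a JSON payload whose `body` is a Bedrock-style JSON string, get back an object with a `completion` key) can be exercised locally without a real MCP server or AWS credentials. The stand-in handler below is hypothetical, it only echoes the prompt back, but it makes concrete the shapes both sides must agree on.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical stand-in for the MCP endpoint: it accepts the payload shape
# built by invoke_model_mcp and replies with a {"completion": ...} object.
class FakeMCPHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        inner = json.loads(payload["body"])  # Bedrock-style inner body
        reply = json.dumps({"completion": "echo: " + inner["prompt"]})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply.encode())

    def log_message(self, *args):  # keep the demo output quiet
        pass

# Bind to an ephemeral port and serve in the background
server = HTTPServer(("127.0.0.1", 0), FakeMCPHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
endpoint = f"http://127.0.0.1:{server.server_port}"

# Same payload shape that invoke_model_mcp builds
payload = {
    "modelId": "anthropic.claude-v2",
    "contentType": "application/json",
    "accept": "application/json",
    "body": json.dumps({"prompt": "hello", "max_tokens_to_sample": 10}),
}
req = urllib.request.Request(
    endpoint,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    completion = json.loads(resp.read())["completion"]
print(completion)  # prints "echo: hello"
server.shutdown()
```

On the wire this is what `requests.post(MCP_SERVER_ENDPOINT, json=payload)` in the example sends, so pointing `endpoint` at a real MCP server URL is the only change needed to go from this smoke test to a live call.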

imagic-mcp

MCP server for image conversion, resizing, and merging — runs locally, no uploads

mcp-altegio

MCP server for Altegio API — appointments, clients, services, staff schedules

database-updater MCP Server

Mirror of

Remote MCP Server

A template for deploying MCP servers on Cloudflare Workers with OAuth authentication and Server-Sent Events transport. Enables remote access to MCP tools from Claude Desktop and other clients over HTTP.

MCP Skeleton

A starter template for building Model Context Protocol (MCP) servers with Node.js. It provides a foundational structure and an example tool to help developers quickly scaffold and deploy new MCP capabilities.

Main

ShopOracle

E-Commerce Intelligence MCP Server — 11 tools for product search, price comparison, competitor pricing across Amazon, eBay, Google Shopping. 18 countries. Part of ToolOracle (tooloracle.io).

Hono Remote Mcp Sample

Remote MCP server sample (Hono + Cloudflare Workers + Durable Objects)

MCP Memory

Enables AI assistants to remember user information and preferences across conversations using vector search technology. Built on Cloudflare infrastructure with isolated user namespaces for secure, persistent memory storage.

n8n - Secure Workflow Automation for Technical Teams

Workflow automation platform with fair-code licensing and native AI capabilities. Combine visual building with custom code, self-hosting or cloud, 400+ integrations.

Kali Pentest MCP Server

Provides secure access to penetration testing tools from Kali Linux including nmap, nikto, dirb, wpscan, and sqlmap for educational vulnerability assessment. Operates in a controlled Docker environment with target whitelisting to ensure ethical testing practices.

OptionsFlow

A Model Context Protocol server that enables LLMs to analyze option chains, calculate the Greeks, and evaluate basic options strategies using Yahoo Finance data.

RWA Pipe MCP Server

Connects AI agents to Real World Asset (RWA) data, enabling queries about tokenized assets, market trends, TVL analytics, token holders, and portfolio tracking across multiple blockchains.

Jira Prompts MCP Server

An MCP server that offers several commands for generating prompts or context from Jira content.

NotePlan MCP Server

A Model Context Protocol server that enables Claude Desktop to interact with NotePlan.co, allowing users to query, search, create, and update notes directly from Claude conversations.

TermPipe MCP

Provides AI assistants with direct terminal access to execute commands, manage files, and run persistent REPL sessions. It features automated installation scripts that educate AI assistants on its capabilities for seamless integration.

BolideAI MCP

A comprehensive Model Context Protocol server that provides AI-powered tools for marketing automation, content generation, research, and project management, integrating with various AI services to streamline workflows for developers and marketers.

Business Central MCP Server

A lightweight MCP server for seamless integration with Microsoft Dynamics 365 Business Central.

PolarDB-X MCP Server

A Model Context Protocol server that enables AI agents to interact with Alibaba Cloud PolarDB-X databases through SQL queries, database inspection, and schema exploration.

Giphy MCP Server

This is an auto-generated Model Context Protocol server that enables interaction with the Giphy API, allowing users to access Giphy's GIF services through natural language commands.

Uptime Kuma MCP Server

ImgMCP

Connecting Models to Your Creativity. We aspire to bring the unique capabilities of AI models to every creative individual, delivering better experiences, lower costs, and higher efficiency. This is the meaning behind our creation of ImgMCP.

Grafana UI MCP Server

Provides AI assistants with comprehensive access to Grafana's React component library, including TypeScript source code, MDX documentation, Storybook examples, test files, and design system tokens for building Grafana-compatible interfaces.

MCP YouTube Extract

Enables extraction of YouTube video information including metadata (title, description, channel, views) and transcripts without requiring an API key, using yt-info-extract and yt-ts-extract libraries.

MIDI MCP Server

MIDI MCP Server is a Model Context Protocol (MCP) server that lets AI models generate MIDI files from text-based musical data. The tool enables programmatic creation of musical compositions through a standardized interface.

Relay MCP Server

Provides cross-chain bridge and swap tools for AI agents using the Relay Protocol to interact with multiple blockchain networks. It enables agents to query supported chains, obtain transaction quotes, and generate unsigned transaction data for token transfers and swaps.

mcp_Shield

The security runtime for MCP servers. Every tool call inspected. Every attack blocked. Every decision logged.