Discover Awesome MCP Servers

Extend your agent with 20,526 capabilities via MCP servers.

MCP Vaultwarden Server

Enables AI agents and automation scripts to securely interact with self-hosted Vaultwarden instances through the Bitwarden CLI, automatically managing vault sessions and providing tools to read, create, update, and delete secrets programmatically.

Agent MCP

A Multi-Agent Collaboration Protocol server that enables coordinated AI collaboration through task management, context sharing, and agent interaction visualization.

MCP Weather Server

A containerized server that provides weather tools for AI assistants, allowing them to access US weather alerts and forecasts through the National Weather Service API.

ChatRPG

A lightweight ChatGPT app that converts your LLM into a Dungeon Master!

FastMCP Demo Server

A production-ready MCP server that provides hackathon resources and reusable starter prompts. Built with FastMCP framework and includes comprehensive deployment options for development and production environments.

Outlook MCP Server

Enables interaction with Outlook email through Microsoft Graph API. Supports email management operations like reading, searching, marking as read/unread, and deleting messages through natural language.

Fetch-Save MCP Server

A Model Context Protocol server that enables LLMs to retrieve web content and save it to local files for permanent storage and later access.
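The fetch-then-save pattern this server exposes can be illustrated with a minimal stdlib sketch. The function names (`fetch`, `save`) are hypothetical illustrations, not the server's actual tool API:

```python
from pathlib import Path
from urllib.request import urlopen

def fetch(url: str) -> str:
    # Retrieve the raw body of a web page as text.
    with urlopen(url) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        return resp.read().decode(charset)

def save(content: str, path: str) -> Path:
    # Persist fetched content to a local file for permanent storage.
    p = Path(path)
    p.write_text(content, encoding="utf-8")
    return p
```

In the actual server, an LLM would invoke equivalent steps through MCP tool calls rather than calling these functions directly.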

TypeScript MCP Server Boilerplate

A boilerplate project for quickly developing Model Context Protocol (MCP) servers using TypeScript SDK, with examples of tools (calculator, greeting) and resources (server info).

WorkItems DevOps MCP Server

Enables LLMs and AI applications to interact with Azure DevOps Work Items, supporting queries, filtering, status updates, date management, effort tracking, descriptions, and comments through natural language.

X MCP Server

Enables users to interact with X (Twitter) through the X API. Supports posting tweets, retrieving user timelines, searching tweets, and replying to tweets with comprehensive error handling.

LINE Bot MCP Server

Model Context Protocol server implementation that integrates the LINE Messaging API to connect AI agents with LINE Official Accounts, enabling agents to send messages to users.

Genesis MCP Server

A template for deploying remote MCP servers on Cloudflare Workers without authentication. Provides a foundation for building custom MCP tools that can be accessed from Claude Desktop or the Cloudflare AI Playground.

Display & Video 360 API MCP Server

An MCP server that enables interaction with Google's Display & Video 360 advertising platform API, allowing management of digital advertising campaigns through natural language commands.

Maiga API MCP Server

Provides comprehensive integration with the Maiga API for cryptocurrency analysis, including token technicals, social sentiment tracking, and KOL insights. It enables AI assistants to retrieve market reports, trending token data, and detailed on-chain information.

MCP MySQL Server

Enables interaction with MySQL databases (including AWS RDS and cloud instances) through natural language. Supports database connections, query execution, schema inspection, and comprehensive database management operations.

Meraki Magic MCP

A Python-based MCP server that enables querying Cisco's Meraki Dashboard API to discover, monitor, and manage Meraki environments.

Cursor Rust Tools

An MCP server that lets the LLM in Cursor access Rust Analyzer, crate documentation, and Cargo commands.

MCP Server for MySQL

Provides access to MySQL databases with fine-grained access control, supporting multiple databases simultaneously with configurable access modes (readonly, readwrite, full) and table-level permissions using whitelists, blacklists, wildcards, and regex patterns.
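The table-level permission model described above (whitelists, blacklists, wildcards, regex) can be sketched in a few lines of Python. This is an illustrative sketch of the general technique, not the server's actual implementation; the `/.../ `-delimited regex convention is an assumption for the example:

```python
import fnmatch
import re

def table_allowed(table: str, allow: list[str], deny: list[str]) -> bool:
    """Check a table name against allow/deny pattern lists; deny wins.

    Patterns may be exact names, shell-style wildcards ('orders_*'),
    or regexes written as '/pattern/' (a convention assumed for this sketch).
    """
    def matches(pattern: str) -> bool:
        if pattern.startswith("/") and pattern.endswith("/") and len(pattern) > 1:
            return re.search(pattern[1:-1], table) is not None
        return fnmatch.fnmatch(table, pattern)

    if any(matches(p) for p in deny):
        return False  # blacklist entries override any whitelist match
    return any(matches(p) for p in allow)
```

A readonly mode on top of this would additionally restrict which SQL statement types (SELECT vs. INSERT/UPDATE/DELETE) may reach an allowed table.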

Markdown MCP Server

An MCP (Model Context Protocol) server for efficiently managing Markdown documents in Cursor AI IDE, supporting CRUD operations, search, and metadata management.

MCP with Langchain Sample Setup

Here's a sample setup for an MCP (Modular Component Protocol) server and client, designed to be compatible with LangChain. This example focuses on a simple summarization task, but you can adapt it to other LangChain functionalities.

**Important Considerations:**

* **MCP (Modular Component Protocol):** MCP isn't a widely standardized protocol. This example uses a simplified, custom implementation based on JSON over HTTP for demonstration purposes. In a real-world scenario, you might consider more robust solutions like gRPC, Thrift, or well-defined REST APIs.
* **LangChain Integration:** The key is to use LangChain components (e.g., LLMs, chains, document loaders) *within* the MCP server to process requests. The client sends data, the server uses LangChain to process it, and the server returns the result.
* **Error Handling:** This is a simplified example. Robust error handling (try-except blocks, logging, proper HTTP status codes) is crucial in a production environment.
* **Security:** This example lacks security measures (authentication, authorization). Implement appropriate security based on your needs.
* **Asynchronous Operations:** For more complex tasks, consider using asynchronous operations (e.g., `asyncio` in Python) to improve performance and prevent blocking.

**1. MCP Server (using Flask):**

```python
from flask import Flask, request, jsonify
from langchain.llms import OpenAI
from langchain.chains.summarize import load_summarize_chain
from langchain.text_splitter import CharacterTextSplitter
from langchain.docstore.document import Document
import os

# Set your OpenAI API key (or use environment variables)
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"  # Replace with your actual key

app = Flask(__name__)

@app.route('/summarize', methods=['POST'])
def summarize_text():
    try:
        data = request.get_json()
        text = data.get('text')
        if not text:
            return jsonify({'error': 'Missing "text" parameter'}), 400

        # LangChain components
        llm = OpenAI(temperature=0)  # Adjust temperature as needed
        summarize_chain = load_summarize_chain(llm, chain_type="map_reduce")  # or "stuff", "refine"

        # For large texts, split into chunks
        text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)
        texts = text_splitter.split_text(text)

        # Create LangChain documents from the text chunks
        docs = [Document(page_content=t) for t in texts]

        # Run the summarization chain
        summary = summarize_chain.run(docs)
        return jsonify({'summary': summary})
    except Exception as e:
        print(f"Error: {e}")  # Log the error
        return jsonify({'error': str(e)}), 500  # Return error with status code

if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0', port=5000)  # Make accessible on network
```

**2. MCP Client (using `requests`):**

```python
import requests
import json

def summarize_with_mcp(text, server_url="http://localhost:5000/summarize"):
    """
    Sends text to the MCP server for summarization.

    Args:
        text: The text to summarize.
        server_url: The URL of the MCP server's summarize endpoint.

    Returns:
        The summary from the server, or None if there was an error.
    """
    try:
        payload = {'text': text}
        headers = {'Content-Type': 'application/json'}
        response = requests.post(server_url, data=json.dumps(payload), headers=headers)
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
        data = response.json()
        return data.get('summary')
    except requests.exceptions.RequestException as e:
        print(f"Error connecting to server: {e}")
        return None
    except json.JSONDecodeError as e:
        print(f"Error decoding JSON response: {e}")
        return None
    except Exception as e:
        print(f"An unexpected error occurred: {e}")
        return None

if __name__ == '__main__':
    sample_text = """
    This is a long piece of text that needs to be summarized. It contains
    important information about LangChain and MCP. LangChain is a powerful
    framework for building applications using large language models. MCP, in
    this context, is a simple protocol for communication between a client and
    a server. The server uses LangChain to process requests from the client.
    This example demonstrates a basic summarization task. More complex tasks
    can be implemented using different LangChain components and chains. Error
    handling and security are important considerations for production
    deployments.
    """
    summary = summarize_with_mcp(sample_text)
    if summary:
        print("Summary:", summary)
    else:
        print("Failed to get summary.")
```

**Explanation:**

* **Server (Flask):**
  * Uses Flask to create a simple HTTP server.
  * The `/summarize` endpoint receives POST requests with a JSON payload containing the `text` to summarize.
  * It initializes LangChain components: `OpenAI` (the LLM) and `load_summarize_chain` (the summarization chain). You'll need an OpenAI API key.
  * For larger texts, it splits the text into chunks using `CharacterTextSplitter` and wraps each chunk in a LangChain `Document`.
  * It runs the summarization chain and returns the summary in a JSON response.
  * Includes basic error handling.
* **Client:**
  * Uses the `requests` library to send a POST request to the server's `/summarize` endpoint.
  * It packages the text to be summarized in a JSON payload.
  * It handles potential errors during the request (e.g., connection errors, bad responses).
  * It prints the summary received from the server.

**How to Run:**

1. **Install dependencies:**

   ```bash
   pip install flask langchain openai requests tiktoken
   ```

2. **Set your OpenAI API key:** Replace `"YOUR_OPENAI_API_KEY"` in the server code with your actual key. Consider using environment variables for security.

3. **Run the server:**

   ```bash
   python your_server_file.py  # e.g., python mcp_server.py
   ```

4. **Run the client:**

   ```bash
   python your_client_file.py  # e.g., python mcp_client.py
   ```

**Key Adaptations for Different LangChain Tasks:**

* **Different chains:** Instead of `load_summarize_chain`, use other LangChain chains (e.g., `LLMChain`, `ConversationalRetrievalChain`) based on the task you want to perform.
* **Different LLMs:** Use other LLMs besides `OpenAI` (e.g., `HuggingFaceHub`, `Cohere`). You'll need to install the appropriate LangChain integration and configure the LLM.
* **Data loading:** Use different LangChain document loaders (e.g., `WebBaseLoader`, `CSVLoader`, `PDFMinerLoader`) to load data from various sources.
* **Input/output:** Adjust the input and output data formats in the server and client to match the requirements of your task. For example, you might send a question and receive an answer, or send a list of documents and receive a ranked list of relevant documents.
* **Prompt engineering:** Carefully design the prompts used in your LangChain chains to achieve the desired results.

This provides a basic framework. Remember to adapt it to your specific use case and add proper error handling, security, and performance optimizations.

Hurricane Tracker MCP Server

Provides real-time hurricane tracking, 5-day forecast cones, location-based alerts, and historical storm data from NOAA/NHC through MCP tools for AI assistants.

Another Planka MCP

An MCP server that enables AI assistants to interact with self-hosted Planka boards through the Planka REST API. It allows users to list, search, create, and update projects, boards, and cards using natural language.

TripNow (航班管家)

Provides real-time flight and train ticket queries, live status tracking, and a professional knowledge base for air and rail travel. It enables users to monitor departures, check ticketing policies, and manage travel information through a unified interface.

Sequential Questioning MCP Server

A specialized server that enables LLMs to gather specific information through sequential questioning, implementing the MCP standard for seamless integration with LLM clients.
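The ask-one-question-at-a-time flow such a server coordinates maps naturally onto a generator: each answer is fed back before the next question is posed. This is a conceptual sketch of the pattern, not the server's actual protocol:

```python
def sequential_interview(questions):
    """Yield questions one at a time; the caller sends back each answer.

    Returns the collected answers once every question has been asked.
    (Illustrative sketch only -- real MCP servers exchange tool calls,
    not Python generator values.)
    """
    answers = {}
    for question in questions:
        answer = yield question   # pause until the caller supplies a reply
        answers[question] = answer
    return answers                # delivered via StopIteration.value
```

Driving it with `next()` and `.send()` mirrors the back-and-forth between an LLM client and the server: no question is revealed until the previous answer has arrived.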

arc-mcp

An MCP server for the Arc browser that enables programmatic management of spaces and tabs. It supports actions like listing, creating, and deleting spaces and tabs, as well as focusing spaces and opening URLs via AppleScript.

MCP Server for Veryfi Document Processing

A Model Context Protocol server with access to the Veryfi API.

MCP Finder Server

Remote MCP Server on Cloudflare

Ticket Tailor API Integration

mcp_sdk_petstore_api_44

A standalone MCP server generated from an OpenAPI specification that exposes Petstore API endpoints as tools for AI assistants. It utilizes SSE transport to enable models to interact with pet store management functionalities through natural language.