Discover Awesome MCP Servers

Extend your agent with 23,729 capabilities via MCP servers.

prometheus-mcp-server

A TypeScript-based MCP server that enables users to interact with Prometheus metrics using PromQL queries and discovery tools. It allows LLMs to retrieve time-series data, metadata, alerts, and system status directly from a Prometheus instance.

Graphiti MCP Server 🧠

Weather MCP Server

Provides current weather information for any city worldwide using the free Open-Meteo API, enabling users to query temperature, wind speed, humidity, and weather conditions through natural language.
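
As a rough sketch of what such a server likely wraps, the snippet below geocodes a city name and fetches current conditions from Open-Meteo's public geocoding and forecast endpoints; the exact parameters and response fields the MCP server exposes are an assumption.

```python
# Minimal sketch of an Open-Meteo lookup (the endpoints are Open-Meteo's public
# API; how the MCP server itself wraps them is assumed, not documented here).
import json
import urllib.parse
import urllib.request

def current_weather(city: str) -> dict:
    # Resolve the city name to coordinates via the geocoding endpoint.
    geo_url = ("https://geocoding-api.open-meteo.com/v1/search?"
               + urllib.parse.urlencode({"name": city, "count": 1}))
    with urllib.request.urlopen(geo_url) as resp:
        place = json.load(resp)["results"][0]

    # Fetch current conditions (temperature, wind speed, weather code).
    wx_url = ("https://api.open-meteo.com/v1/forecast?"
              + urllib.parse.urlencode({
                  "latitude": place["latitude"],
                  "longitude": place["longitude"],
                  "current_weather": "true",
              }))
    with urllib.request.urlopen(wx_url) as resp:
        return json.load(resp)["current_weather"]

if __name__ == "__main__":
    print(current_weather("Singapore"))
```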

Blackbaud FE NXT MCP Server by CData

This project builds a read-only MCP server. For full read, write, update, delete, and action capabilities and a simplified setup, check out our free CData MCP Server for Blackbaud FE NXT (beta): https://www.cdata.com/download/download.aspx?sku=OZZK-V&type=beta

Wikipedia Summarizer MCP Server

An MCP (Model Context Protocol) server that searches and summarizes Wikipedia articles using Ollama LLMs, accessible both from the command line and through Streamlit interfaces. Perfect for quickly extracting key information from Wikipedia without reading full articles.

Europe PMC Literature Search MCP Server

A professional literature search tool built on FastMCP framework that enables AI assistants to search academic literature from Europe PMC, retrieve article details, and analyze journal quality with seamless integration into Claude Desktop and Cherry Studio.

DateTime MCP Server

Provides timezone-aware date and time information with configurable time formats and timezone support. Enables users to get current date and time in their preferred timezone and format through simple MCP tools.
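
A minimal sketch of that behavior using Python's standard zoneinfo module; the function and parameter names below are illustrative and not the server's actual MCP tool schema.

```python
# Illustrative only: the tool name, parameters, and defaults are assumptions.
from datetime import datetime
from zoneinfo import ZoneInfo

def get_current_time(timezone: str = "UTC", fmt: str = "%Y-%m-%d %H:%M:%S %Z") -> str:
    """Return the current time in the given IANA timezone, formatted with fmt."""
    return datetime.now(ZoneInfo(timezone)).strftime(fmt)

print(get_current_time("Asia/Singapore"))
print(get_current_time("America/New_York", "%I:%M %p on %B %d"))
```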

MCP DeepSeek Demo Project

A demonstration project that pairs a DeepSeek model with a simple MCP-style client/server setup: a Python server accepts JSON prompt messages over a socket, forwards them to the DeepSeek model, and returns the generated text to a command-line client.
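
A minimal sketch of that prompt/response exchange, condensed from the demo's own example code; the host, port, and JSON message shape ({"prompt": ...} in, {"response": ...} out) come from that example and may differ from the actual project.

```python
# Client-side sketch of the demo's JSON-over-socket exchange.
import json
import socket

HOST, PORT = "127.0.0.1", 65432  # matches the demo's example server address

def send_prompt(prompt: str) -> dict:
    """Send a prompt to the demo server and return its decoded JSON reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.connect((HOST, PORT))
        s.sendall(json.dumps({"prompt": prompt}).encode("utf-8"))
        return json.loads(s.recv(4096).decode("utf-8"))

if __name__ == "__main__":
    print(send_prompt('Translate "Hello world!" to Spanish'))
    # Expected shape: {'response': '...generated text...'}
```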

Civic Data MCP Server

Provides access to 7 free government and open data APIs including NOAA weather, US Census demographics, NASA imagery, World Bank economics, Data.gov, and EU Open Data through 22 specialized tools, with most requiring no API keys.

Agent Identity Protocol (AIP)

Provides cryptographic identity and signing capabilities for AI agents, enabling them to create persistent identities, sign actions with private keys, and allow external systems to verify the authenticity and provenance of agent-initiated operations.
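
As a rough illustration of the sign-and-verify pattern described above (not AIP's actual API), the sketch below uses Ed25519 keys from the `cryptography` package: the agent signs an action payload with its private key, and any external system holding the public key can check the signature.

```python
# Illustrative only: key handling, payload format, and variable names are
# assumptions, not part of the Agent Identity Protocol itself.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The agent's persistent identity is a keypair; the public key is shared.
agent_key = Ed25519PrivateKey.generate()
agent_public_key = agent_key.public_key()

# The agent signs each action it initiates.
action = b'{"tool": "send_email", "to": "ops@example.com"}'
signature = agent_key.sign(action)

# An external system verifies authenticity and provenance.
try:
    agent_public_key.verify(signature, action)
    print("action verified: signed by this agent")
except InvalidSignature:
    print("rejected: signature does not match")
```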

mcp-diagram

Model Context Protocol Multi-Agent Server

Demonstrates custom MCP servers for math and weather operations, enabling multi-agent orchestration using LangChain, Groq, and MCP adapters for both local and remote tool integration.

Clear Thought MCP Server

Provides systematic thinking tools including mental models, design patterns, debugging approaches, and collaborative reasoning frameworks to enhance problem-solving and decision-making capabilities.

Moonshot MCP Server Gateway

A lightweight gateway server that provides a unified connection entry point for accessing multiple MCP servers, supporting various protocols including Network and Local Transports.

n8n-MCP

Provides AI assistants with comprehensive access to n8n's 525+ workflow automation nodes, including documentation, properties, operations, and 2,500+ templates. Enables creating, validating, and managing n8n workflows through natural language.

Artur's Model Context Protocol servers

MCP servers.

Mcp Use

lemon-squeezy-mcp

Universal Semantic Bridge for Lemon Squeezy: A high-performance Model Context Protocol (MCP) server that empowers AI assistants (Cursor, Claude, VS Code) to query payments, manage subscriptions, and sync customers to Salesforce directly from your editor. 🍋✨

MCP Memory

Enables AI assistants to remember user information across conversations using vector search technology. Built on Cloudflare infrastructure with isolated user namespaces for secure, persistent memory storage and retrieval.

Elasticsearch MCP Server Solution

Enables comprehensive interaction with Elasticsearch APIs through natural language queries, specifically optimized for security analysis, threat detection, incident investigation, and compliance monitoring with advanced machine learning capabilities for anomaly detection.

OpenEnded Philosophy MCP Server

Enables philosophical reasoning and concept analysis through NARS non-axiomatic logic integration, supporting multi-perspective synthesis, epistemic uncertainty tracking, and contextual semantic exploration with built-in truth maintenance.

vet-mcp

MCP Server/Client Sample

Mingli MCP Server

Enables AI tools to perform Chinese fortune-telling analysis including Ziwei Doushu (Purple Star Astrology) and Bazi (Four Pillars) chart generation, fortune reading, and element analysis. Supports multiple calendar systems and output formats for comprehensive divination services.

@mcp/openverse

An MCP server that enables searching and fetching openly-licensed images from Openverse with features like filtering by license type, getting image details, and finding essay-specific illustrations.

OCI Core Services FastMCP Server

A dedicated server for Oracle Cloud Infrastructure (OCI) Core Services that enables management of compute instances and network operations with LLM-friendly structured responses.

mcp-server-email MCP server

Singapore News MCP Server

Provides real-time news feeds from major Singapore news sources including The Straits Times, Business Times, and Channel News Asia. Delivers live news updates through Server-Sent Events for up-to-date information access.

Consult LLM MCP

An MCP server that lets Claude Code consult stronger AI models (o3, Gemini 2.5 Pro, DeepSeek Reasoner) when you need deeper analysis on complex problems.

BloodHound MCP

An extension that enables Large Language Models to interact with and analyze Active Directory environments through natural-language queries instead of manual Cypher queries.