Discover Awesome MCP Servers
Extend your agent with 20,552 capabilities via MCP servers.
- All (20,552)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
XMI MCP Server
An MCP server for querying and exploring SysML XMI models, specifically designed for MTConnect model exports. It allows users to search for packages, classes, and enumerations while providing tools for analyzing documentation and inheritance hierarchies.
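As an illustration of the kind of lookup such a server performs, here is a minimal sketch that scans an XMI document for named model elements. The `xmi:type` attribute convention is typical of UML/XMI exports, but the exact namespace URI varies between exporters and is assumed here:

```python
import xml.etree.ElementTree as ET

# Assumed XMI namespace; real exports use versioned URIs
# such as http://www.omg.org/spec/XMI/20131001.
XMI_NS = "http://www.omg.org/XMI"

def find_named_elements(root: ET.Element, xmi_type: str):
    """Yield the name of every element whose xmi:type matches."""
    for el in root.iter():
        if el.get(f"{{{XMI_NS}}}type") == xmi_type:
            yield el.get("name")
```

For example, `find_named_elements(root, "uml:Class")` lists class names anywhere in the model tree.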
Wikipedia Summarizer MCP Server
An MCP (Model Context Protocol) server that searches and summarizes Wikipedia articles using Ollama LLMs, accessible both from the command line and through Streamlit interfaces. Perfect for quickly extracting key information from Wikipedia without reading full articles.
DateTime MCP Server
Provides timezone-aware date and time information with configurable time formats and timezone support. Enables users to get current date and time in their preferred timezone and format through simple MCP tools.
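The core of such a tool reduces to a few lines with Python's standard `zoneinfo` module; the function name and default format below are illustrative, not the server's actual API:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def current_time(tz: str = "UTC", fmt: str = "%Y-%m-%d %H:%M:%S %Z") -> str:
    """Return the current date and time in the given IANA timezone."""
    return datetime.now(ZoneInfo(tz)).strftime(fmt)
```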
MCP DeepSeek Demo Project
A minimal example of wiring a DeepSeek model into a Model Context Protocol-style client-server setup. This is a simplified illustration and would need adaptation to your specific DeepSeek model and MCP implementation.

**Conceptual Overview:**

* **Protocol:** A lightweight JSON-over-socket message format stands in for a full MCP transport. The client sends a prompt as a key-value message and receives the generated text back the same way.
* **DeepSeek Server:** Hosts the DeepSeek LLM. It receives prompts, generates text, and sends the response back.
* **Client:** Sends a prompt to the server and displays the response.

**1. DeepSeek Server (server.py):**

```python
import socket
import json

def generate_text_with_deepseek(prompt):
    """
    Placeholder for DeepSeek model interaction.
    Replace with your actual DeepSeek API call or model inference.
    """
    # Simulate a DeepSeek response
    if "translate" in prompt.lower():
        return "Hola mundo!"  # Example Spanish translation
    return f"DeepSeek says: {prompt}"

def handle_client(conn, addr):
    print(f"Connected by {addr}")
    try:
        while True:
            data = conn.recv(1024)  # Receive up to 1024 bytes
            if not data:
                break
            try:
                # Messages are JSON-encoded key-value pairs
                message = json.loads(data.decode('utf-8'))
                prompt = message.get("prompt")
                if prompt:
                    reply = {"response": generate_text_with_deepseek(prompt)}
                else:
                    reply = {"error": "No prompt provided"}
            except json.JSONDecodeError:
                reply = {"error": "Invalid JSON format"}
            except Exception as e:
                reply = {"error": f"Server error: {e}"}
            conn.sendall(json.dumps(reply).encode('utf-8'))
    except ConnectionResetError:
        print(f"Connection reset by {addr}")
    finally:
        conn.close()
        print(f"Connection closed with {addr}")

def start_server():
    HOST = "127.0.0.1"  # Loopback interface (localhost)
    PORT = 65432        # Non-privileged port (> 1023)
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind((HOST, PORT))
        s.listen()
        print(f"Listening on {HOST}:{PORT}")
        while True:
            conn, addr = s.accept()
            handle_client(conn, addr)

if __name__ == "__main__":
    start_server()
```

**2. Client (client.py):**

```python
import socket
import json

HOST = "127.0.0.1"  # The server's hostname or IP address
PORT = 65432        # The port used by the server

def send_prompt(prompt):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.connect((HOST, PORT))
            message = {"prompt": prompt}
            s.sendall(json.dumps(message).encode('utf-8'))
            data = s.recv(1024)
            response = json.loads(data.decode('utf-8'))
            print(f"Received: {response}")
        except ConnectionRefusedError:
            print("Connection refused. Is the server running?")
        except Exception as e:
            print(f"Error: {e}")

if __name__ == "__main__":
    user_prompt = input("Enter a prompt for DeepSeek: ")
    send_prompt(user_prompt)
```

**How to Run:**

1. **Install dependencies:** Python is all this minimal example needs; install the DeepSeek API client library only if you are calling a remote DeepSeek service.
2. **Replace the placeholder:** In `server.py`, replace `generate_text_with_deepseek` with your actual DeepSeek model interaction code. This is the crucial integration point and will likely involve an API key, model parameters, and response handling.
3. **Start the server:** Run `python server.py` in a terminal.
4. **Run the client:** Run `python client.py` in another terminal, type a prompt, and press Enter.

**Important Considerations:**

* **DeepSeek API/model:** You need access to a DeepSeek LLM, either through an API or a local model, with the corresponding credentials or environment set up.
* **Security:** No authentication, authorization, or encryption is included; all are required before production use.
* **Scalability:** A single-threaded socket loop will not scale; production use needs a more robust server architecture and load balancing.
* **Protocol:** The JSON-over-socket exchange is a stand-in; a real Model Context Protocol implementation defines a richer message schema with versioning, error handling, and validation.
* **Error handling:** The handling here is basic; add logging and more robust recovery for production.

**Example Interaction:**

1. Run `server.py`, then `client.py`.
2. Client prompt: `Translate "Hello world!" to Spanish`
3. Client output: `Received: {'response': 'Hola mundo!'}`
LPDP MCP Server
Enables users to query information about LPDP scholarship financial disbursement using RAG with Pinecone vector search and Gemini 2.0 Flash, answering questions about funding components, deadlines, living allowances, and required documents.
Test MCP Feb4 MCP Server
An MCP server that provides standardized tools for AI agents to interact with the Test MCP Feb4 API. It enables LLMs to access API endpoints through asynchronous operations and standardized Model Context Protocol tools.
ConsignCloud MCP Server
Enables AI assistants to manage consignment and retail business operations through the ConsignCloud API, including inventory management, sales tracking, vendor accounts, and analytics.
Gmail MCP Server
Provides comprehensive Gmail integration with 25+ tools for intelligent email management, including AI-powered categorization, advanced search and filtering, automated archiving and cleanup, analytics, and secure OAuth2 authentication.
ralph-wiggum-mcp
An enhanced Model Context Protocol (MCP) server implementing the Ralph Wiggum technique for iterative, self-referential AI development loops.
GIT MCP Server
A Model Context Protocol (MCP) server that provides git tools for LLM agents, with fixes for the amend parameter caching issue.
LLDB MCP Server
Provides structured debugging capabilities through LLDB, enabling AI assistants to set breakpoints, inspect variables, analyze crashes, disassemble code, and evaluate expressions in C/C++ programs.
Daniel LightRAG MCP Server
A comprehensive MCP server that provides full integration with LightRAG API, offering 22 tools across document management, querying, knowledge graph operations, and system management.
Code Graph Knowledge System
Transforms code repositories and development documentation into a queryable Neo4j knowledge graph, enabling AI assistants to perform intelligent code analysis, dependency mapping, impact assessment, and automated documentation generation across 15+ programming languages.
Email-MCP
An MCP server that enables AI assistants to send emails through SMTP protocol by providing server credentials, recipient information, subject, and message content.
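The essence of such a server is a thin wrapper over Python's standard `smtplib` and `email` modules; a minimal sketch (function names are illustrative, not the server's actual API):

```python
import smtplib
from email.message import EmailMessage

def build_message(sender: str, to: str, subject: str, body: str) -> EmailMessage:
    """Assemble a plain-text email message."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = to
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

def send_email(host: str, port: int, user: str, password: str,
               msg: EmailMessage) -> None:
    """Deliver the message over SMTP with STARTTLS."""
    with smtplib.SMTP(host, port) as smtp:
        smtp.starttls()
        smtp.login(user, password)
        smtp.send_message(msg)
```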
Strava MCP Server
A TypeScript server that acts as a bridge to the Strava API, allowing LLMs (Large Language Models) to access users' activities, routes, segments, and athlete data through natural-language interaction.
CereBro
A model-agnostic MCP client-server for .NET.
Math & Calculator MCP Server
Provides advanced mathematical utilities including basic arithmetic, statistical analysis, unit conversions, quadratic equation solving, percentage calculations, and trigonometric functions for AI assistants.
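The quadratic-equation tool, for example, likely boils down to the standard discriminant formula; a minimal sketch (the function name is illustrative):

```python
import cmath
import math

def solve_quadratic(a: float, b: float, c: float):
    """Return both roots of a*x**2 + b*x + c = 0 (complex when needed)."""
    if a == 0:
        raise ValueError("coefficient a must be nonzero")
    d = b * b - 4 * a * c                              # discriminant
    root = math.sqrt(d) if d >= 0 else cmath.sqrt(d)   # real or complex branch
    return (-b + root) / (2 * a), (-b - root) / (2 * a)
```

For instance, `solve_quadratic(1, -3, 2)` returns `(2.0, 1.0)`.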
🤖 mcp-ollama-beeai
A minimal agentic application for interacting with Ollama models, leveraging multiple MCP server tools using the BeeAI framework.
Google AdWords MCP Server by CData
Google Forms MCP Server with CamelAIOrg Agents Integration
DuckDuckGo Search MCP Server
Mirror of
Typesense MCP Server
A server that enables vector and keyword search capabilities in Typesense databases through the Model Context Protocol, providing tools for collection management, document operations, and search functionality.
Domain Checker
Enables checking domain name availability using WHOIS lookups and DNS resolution. Supports both single and batch domain checking with detailed availability analysis.
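The DNS half of such a check can be approximated with the standard library alone; a name that resolves is a weak signal the domain is taken, while a full availability verdict still needs a WHOIS lookup (the function name below is illustrative):

```python
import socket

def resolves(domain: str) -> bool:
    """Return True if the domain currently resolves in DNS."""
    try:
        socket.getaddrinfo(domain, None)
        return True
    except socket.gaierror:
        return False
```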
Multi-Provider MCP Server
A custom MCP server for integrating with tools and running inside an n8n instance.
GigAPI MCP Server
An MCP server that provides seamless integration with Claude Desktop for querying and managing timeseries data in GigAPI Timeseries Lake.
mcp-strategy-research-db
An MCP server that provides access to a SQLite database for analyzing trading strategy backtest results and performance metrics. It enables AI assistants to identify robust strategies across different market regimes and compare them against benchmarks using risk-adjusted metrics.
MCP Audio Tweaker
Enables batch audio processing and optimization using FFmpeg with preset configurations for game audio, voice processing, and music mastering, including specialized optimization for ElevenLabs AI voice output.
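Under the hood, preset-driven batch processing typically amounts to constructing FFmpeg command lines; a minimal sketch of such a preset expander (the preset names and values are hypothetical, not the server's actual configuration):

```python
# Hypothetical presets; the real server's values will differ.
PRESETS = {
    "voice": {"ar": 22050, "ba": "64k", "ac": 1},
    "music": {"ar": 44100, "ba": "192k", "ac": 2},
}

def ffmpeg_cmd(src: str, dst: str, preset: str) -> list[str]:
    """Build an ffmpeg invocation for the given preset."""
    p = PRESETS[preset]
    return ["ffmpeg", "-y", "-i", src,
            "-ar", str(p["ar"]),   # output sample rate
            "-b:a", p["ba"],       # audio bitrate
            "-ac", str(p["ac"]),   # channel count
            dst]
```

The resulting list can be handed to `subprocess.run` once per file in the batch.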
Artur's Model Context Protocol servers
MCP servers
Memory MCP Server
Enables agents to maintain persistent memory through three-tiered architecture: short-term session context with TTL, long-term user profiles and preferences, and searchable episodic event history with sentiment analysis. Provides comprehensive memory management for personalized AI interactions.
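The short-term tier described above can be sketched as a TTL-expiring key-value store (a minimal illustration, not the server's actual implementation):

```python
import time

class ShortTermMemory:
    """Session-scoped store whose entries expire after ttl seconds."""

    def __init__(self, ttl: float):
        self.ttl = ttl
        self._store = {}  # key -> (value, expiry deadline)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, deadline = entry
        if time.monotonic() >= deadline:
            del self._store[key]  # lazily evict expired entries
            return default
        return value
```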
WooCommerce Enterprise MCP Suite
Provides 115+ MCP tools for comprehensive WooCommerce store management including multi-store operations, bulk processing, inventory sync, order management, and customer analytics. Features enterprise-level safety controls with dry-run mode, automatic backups, and rollback capabilities.