Discover Awesome MCP Servers

Extend your agent with 26,843 capabilities via MCP servers.

Taskmaster MCP Server

Provides AI agents with simplified task management through a 4-step workflow (create session, define tasks, execute, complete) that works with any LLM without requiring complex thinking patterns.
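The 4-step workflow can be sketched as a tiny state machine. This is a minimal sketch under assumptions: the class and method names below are illustrative, not Taskmaster's actual tool names.

```python
# Hypothetical sketch of a create-session -> define-tasks -> execute -> complete
# workflow. Names are illustrative, not the real server's API.
class TaskSession:
    def __init__(self, name):
        self.name = name
        self.tasks = []          # list of {"description": ..., "done": ...}
        self.state = "created"   # created -> defined -> executing -> complete

    def define_tasks(self, descriptions):
        self.tasks = [{"description": d, "done": False} for d in descriptions]
        self.state = "defined"

    def execute(self):
        self.state = "executing"
        for task in self.tasks:
            task["done"] = True  # an agent would do real work here

    def complete(self):
        # Only transition to complete once every task is done
        if all(t["done"] for t in self.tasks):
            self.state = "complete"
        return self.state

session = TaskSession("demo")
session.define_tasks(["write docs", "run tests"])
session.execute()
session.complete()
```

The appeal of this shape is that each step is a plain tool call, so any LLM can drive it without extended reasoning.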

microCMS MCP Server

Enables use of the microCMS API from an MCP server.

🧠 Browserbase SSE Server

Miro MCP Server

Enables MCP-compatible LLMs to interact with Miro whiteboards to manage boards, create and manipulate shapes, and organize content through grouping tools.

AgentBroker MCP Server

AI-native cryptocurrency exchange built for autonomous agents. Register, deposit USDC, select a strategy, and trade 8 crypto pairs (BTC, ETH, SOL + more) programmatically — no KYC required. Includes sandbox with 10,000 virtual USDC for testing.

MCP Nautobot Server

A Model Context Protocol server that integrates with Nautobot to provide network automation and infrastructure data to AI assistants like Claude, allowing them to query and interact with network Source of Truth systems.

Promokit Prestashop MCP Server

A Model Context Protocol server designed to integrate with Claude Desktop, allowing users to interact with Prestashop e-commerce platforms through natural language interfaces.

Jira MCP Server

A simple MCP server that provides access to Jira issues from Cursor AI, allowing users to reference and query Jira tickets directly in the chat panel.

apisetu-mcp-server

An MCP server for API Setu.

Octocode MCP

AI-powered code assistant that provides advanced search and discovery capabilities across GitHub and NPM ecosystems, helping users understand code patterns, implementations, and connections between repositories.

Sloot MCP Server

A TypeScript MCP server implementation using Express.js that provides basic tools like echo, time retrieval, and calculator functionality. Features session management, RESTful API endpoints, and Server-Sent Events for streamable communication.
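The actual server is TypeScript on Express.js; as a dependency-free sketch of the tool dispatch it describes, here is a Python analogue. The tool names and argument shapes are assumptions, not the server's documented interface.

```python
import time

def calculator(args):
    # Tiny two-op calculator; the real tool's operations are assumed.
    if args["op"] == "add":
        return args["a"] + args["b"]
    return args["a"] - args["b"]

# Minimal registry mirroring the echo / time / calculator tools above
TOOLS = {
    "echo": lambda args: args.get("text", ""),
    "time": lambda args: time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    "calculator": calculator,
}

def call_tool(name, args):
    # Dispatch a named tool, as an MCP server would on a tools/call request
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](args)

print(call_tool("echo", {"text": "hello"}))  # hello
```

The real server would expose these over RESTful endpoints plus Server-Sent Events for streaming, with per-session state on top.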

Google Calendar MCP Server

XMCP

A comprehensive MCP server for X/Twitter featuring over 70 tools for research, engagement, and publishing with granular permission-based access control. It includes specialized Playwright-powered tools for fetching X articles and supports extensive account management and thread operations.

zotero-assistant-mcp

A Zotero library management MCP server designed for Cloudflare Workers that enables searching, reading, and writing library items. It allows users to manage metadata, full-text content, and attachments through natural language interactions.

Cortex Context MCP Adapter

Enables integration with Cortex Context services through the Model Context Protocol. Provides authenticated access to CortexGuardAI's context management capabilities for registered users.

SmartconversionAPI

Converts images to WebP/AVIF behind an HTTP 402 payment-required flow: 0.001 per request, dropping to 0.0005 after 1,000 requests.
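Reading the listed pricing as 0.001 per request for the first 1,000 requests and 0.0005 thereafter (one plausible interpretation of the listing, with the currency unit unspecified), the cost function would be:

```python
def conversion_cost(n_requests, first_rate=0.001, later_rate=0.0005, tier=1000):
    """Tiered cost: first `tier` requests at `first_rate`, the rest at
    `later_rate`. Rates and tier boundary are assumptions from the listing,
    not confirmed pricing."""
    first = min(n_requests, tier)
    rest = max(n_requests - tier, 0)
    return first * first_rate + rest * later_rate

half_tier = conversion_cost(500)     # 500 requests, all at the higher rate
two_tiers = conversion_cost(2000)    # 1,000 at 0.001 plus 1,000 at 0.0005
```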

Remote MCP Server (Authless)

A template for deploying an authentication-free MCP server on Cloudflare Workers. Allows users to create and customize remote MCP tools accessible from Claude Desktop or AI playgrounds via SSE endpoint.

BloodHound MCP Server

Enables security professionals to query and analyze Active Directory attack paths from BloodHound Community Edition data using natural language through Claude Desktop's Model Context Protocol interface.

MCP Todo List Manager

Enables natural language todo list management through Claude Desktop with YAML-based persistence. Supports creating, completing, deleting, and listing todo items with automatic timestamp tracking and secure file permissions.
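The persistence pattern described can be sketched as follows; this uses stdlib JSON in place of the server's YAML (to stay dependency-free), a hypothetical file path, and `os.chmod` as one way to get owner-only file permissions.

```python
import json, os, time, tempfile

# Hypothetical storage path; the real server's location and format differ
TODO_PATH = os.path.join(tempfile.gettempdir(), "todos_demo.json")
if os.path.exists(TODO_PATH):
    os.remove(TODO_PATH)  # start fresh for the demo

def load_todos():
    if not os.path.exists(TODO_PATH):
        return []
    with open(TODO_PATH) as f:
        return json.load(f)

def save_todos(todos):
    with open(TODO_PATH, "w") as f:
        json.dump(todos, f, indent=2)
    os.chmod(TODO_PATH, 0o600)  # owner-only, mirroring "secure file permissions"

def _now():
    return time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())

def add_todo(text):
    todos = load_todos()
    # Automatic timestamp tracking on creation
    todos.append({"text": text, "done": False, "created_at": _now()})
    save_todos(todos)
    return todos

def complete_todo(index):
    todos = load_todos()
    todos[index]["done"] = True
    todos[index]["completed_at"] = _now()
    save_todos(todos)
    return todos

add_todo("write release notes")
complete_todo(0)
```

Each MCP tool (create, complete, delete, list) would simply be a thin wrapper over functions like these.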

K8s MCP Server

K8s-mcp-server is a Model Context Protocol (MCP) server that lets AI assistants such as Claude execute Kubernetes commands safely. It provides a bridge between language models and essential Kubernetes CLI tools, including kubectl, helm, istioctl, and argocd, enabling AI systems to assist with cluster management, troubleshooting, and deployments.

qmcp

An MCP server that enables AI assistants to interact with q/kdb+ databases for development and debugging workflows. It supports executing queries, persistent connection management, and includes a Qython translator for converting Python-like syntax to q.

OSRS-STAT

A Model Context Protocol (MCP) server that provides real-time player statistics and ranking data for Old School RuneScape, with support for multiple game modes and player comparisons.

Agent Interviews

Semantic Scholar MCP Server

Wraps the Semantic Scholar API, providing comprehensive access to academic paper data, author information, and citation networks.

Remote MCP Server (Authless)

A template for deploying MCP servers without authentication on Cloudflare Workers. Enables custom tool creation and integration with Claude Desktop and AI Playground through Server-Sent Events.

MCP demo (DeepSeek as Client's LLM)

Here's how to run a minimal client-server demo using the DeepSeek API, with plain HTTP standing in for a real MCP (Model Context Protocol) transport. This is a simplified example to illustrate the basic concepts; a real-world application would be more complex.

**Important Considerations:**

* **DeepSeek API Key:** You'll need a DeepSeek API key to access their models. Make sure it's configured in your environment; refer to the DeepSeek documentation for how to obtain and use it.
* **Python:** This example uses Python 3.6 or later.
* **Libraries:** Install `requests` for HTTP requests to the DeepSeek API and `Flask` for the simple server:

```bash
pip install requests Flask
```

**Conceptual Overview:**

1. **Client:** The client sends a request (e.g., a text prompt) to the server.
2. **Server:** The server receives the request, calls the DeepSeek API with the prompt, gets the response from DeepSeek, and sends it back to the client.
3. **Transport:** This example uses HTTP between client and server as a simple stand-in. A real MCP implementation would use a proper MCP transport (e.g., stdio or Server-Sent Events) or another messaging mechanism.

**1. Server (server.py):**

```python
from flask import Flask, request, jsonify
import requests
import os

app = Flask(__name__)

# Read the DeepSeek API key from an environment variable
DEEPSEEK_API_KEY = os.environ.get("DEEPSEEK_API_KEY")
DEEPSEEK_API_URL = "https://api.deepseek.com/v1/chat/completions"

@app.route('/deepseek', methods=['POST'])
def deepseek_request():
    try:
        data = request.get_json()
        prompt = data.get('prompt')
        if not prompt:
            return jsonify({'error': 'No prompt provided'}), 400

        # Construct the DeepSeek API request
        headers = {
            "Content-Type": "application/json",
            "Authorization": f"Bearer {DEEPSEEK_API_KEY}"
        }
        payload = {
            "model": "deepseek-chat",  # Or the specific model you want to use
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 150  # Adjust as needed
        }

        response = requests.post(DEEPSEEK_API_URL, headers=headers, json=payload)
        response.raise_for_status()  # Raise HTTPError for 4xx/5xx responses
        deepseek_data = response.json()

        # Extract the generated text (adjust if the response format differs)
        try:
            answer = deepseek_data['choices'][0]['message']['content']
        except (KeyError, IndexError) as e:
            print(f"Error extracting content from DeepSeek response: {e}")
            print(f"DeepSeek Response: {deepseek_data}")
            return jsonify({'error': 'Error processing DeepSeek response'}), 500

        return jsonify({'response': answer})

    except requests.exceptions.RequestException as e:
        print(f"Error communicating with DeepSeek API: {e}")
        return jsonify({'error': f'Error communicating with DeepSeek API: {e}'}), 500
    except Exception as e:
        print(f"An unexpected error occurred: {e}")
        return jsonify({'error': f'An unexpected error occurred: {e}'}), 500

if __name__ == '__main__':
    app.run(debug=True, port=5000)  # Run the server on port 5000
```

**2. Client (client.py):**

```python
import requests
import json

SERVER_URL = "http://localhost:5000/deepseek"  # Adjust if your server runs elsewhere

def send_request(prompt):
    try:
        payload = {'prompt': prompt}
        headers = {'Content-Type': 'application/json'}
        response = requests.post(SERVER_URL, data=json.dumps(payload), headers=headers)
        response.raise_for_status()  # Raise HTTPError for bad responses
        data = response.json()
        return data.get('response')
    except requests.exceptions.RequestException as e:
        print(f"Error connecting to the server: {e}")
        return None
    except Exception as e:
        print(f"An unexpected error occurred: {e}")
        return None

if __name__ == '__main__':
    prompt = "What is the capital of France?"
    response = send_request(prompt)
    if response:
        print(f"DeepSeek's Response: {response}")
    else:
        print("Failed to get a response from the server.")
```

**How to Run:**

1. **Set your API key.** For example, on Linux/macOS:

```bash
export DEEPSEEK_API_KEY="YOUR_DEEPSEEK_API_KEY"
```

Or on Windows:

```bash
set DEEPSEEK_API_KEY=YOUR_DEEPSEEK_API_KEY
```

2. **Start the server** in one terminal:

```bash
python server.py
```

The server listens on port 5000.

3. **Run the client** in another terminal:

```bash
python client.py
```

The client sends the prompt to the server, the server calls the DeepSeek API, and the client prints the response.

**Explanation:**

* **Server (server.py):** Uses Flask to create a simple web server. The `/deepseek` route handles POST requests: it extracts the prompt from the request body, constructs a DeepSeek API request including your API key, sends it with the `requests` library, parses the response, and returns the generated text to the client as JSON. The exact structure of the DeepSeek API response may vary, so adjust the extraction code accordingly; refer to the DeepSeek API documentation. Error handling covers API request failures and other potential issues.
* **Client (client.py):** Sends a POST request to the server's `/deepseek` endpoint with the prompt in the request body, receives the response, and prints the generated text. Includes basic error handling.

**Important Notes and Improvements:**

* **Error Handling:** The error handling in this example is basic. Add more robust handling for network errors, API errors, and invalid responses.
* **API Key Security:** Storing your API key directly in the code is not recommended. The example reads it via `os.environ.get("DEEPSEEK_API_KEY")`, which is better practice; in production, prefer a secrets manager.
* **Asynchronous Communication:** For more complex applications, consider asynchronous communication (e.g., `asyncio` in Python) to improve performance and responsiveness.
* **MCP Implementation:** This example uses HTTP as a simplified stand-in. A real MCP server would implement the Model Context Protocol; for decoupled messaging you could also use a message queue (e.g., RabbitMQ, Kafka) or a dedicated messaging library.
* **DeepSeek API Documentation:** Always refer to the official documentation for up-to-date endpoints, request parameters, and response formats.
* **Model Selection:** The `model` parameter specifies which model to use; choose one appropriate for your needs.
* **Token Limits:** `max_tokens` caps the length of the generated response; be aware of the API's token limits.
* **Rate Limiting:** The DeepSeek API may enforce rate limits; implement appropriate throttling to avoid exceeding them.
* **Flask Debug Mode:** `debug=True` in `app.run()` is useful for development but should be disabled in production.

This provides a starting point for a client-server application using the DeepSeek API. Adapt the code to your requirements and consult the DeepSeek API documentation for accurate details.

FluentLab Funding Assistant

Provides access to FluentLab's funding database, enabling users to search for funding opportunities and retrieve document checklists required for specific funding programme applications.

Banxico MCP Server

Enables access to Bank of Mexico (Banxico) economic data including real-time and historical USD/MXN exchange rates, inflation data, interest rates, and other financial indicators. Supports querying current rates, historical data with date ranges, and economic metadata through natural language.
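As a sketch of how such a tool might address Banxico's SIE REST API: series `SF43718` is commonly used for the USD/MXN FIX rate, and the URL shapes below follow Banxico's published API, but both are assumptions here and should be verified against the official SIE documentation.

```python
# Assumed base URL of Banxico's SIE REST API; verify against official docs
BASE = "https://www.banxico.org.mx/SIEAPIRest/service/v1"

def sie_url(series_id, start=None, end=None):
    """Build a SIE API URL. With no dates, fetch the latest ('oportuno')
    observation; with a date range, fetch historical data. Endpoint shape
    is an assumption based on Banxico's public documentation."""
    if start and end:
        return f"{BASE}/series/{series_id}/datos/{start}/{end}"
    return f"{BASE}/series/{series_id}/datos/oportuno"

# SF43718: USD/MXN FIX exchange rate series (assumed)
latest = sie_url("SF43718")
historic = sie_url("SF43718", "2024-01-01", "2024-06-30")
```

An actual request would also need a SIE API token (typically sent as a `Bmx-Token` header), obtainable free from Banxico.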

IAM Policy Autopilot

Analyzes application code locally to automatically generate baseline AWS IAM identity-based policies by detecting AWS SDK calls in Python, Go, and TypeScript applications. Helps AI coding assistants quickly create IAM permissions that can be refined as applications evolve.
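The detection step can be approximated with a naive scan for boto3-style method calls; the call-to-action mapping below is a tiny illustrative stand-in, not the real tool's analysis, which inspects actual SDK usage far more carefully.

```python
import re

# Toy mapping from (client variable, SDK method) to IAM action.
# Real analysis resolves clients and covers the full SDK surface.
CALL_TO_ACTION = {
    ("s3", "get_object"): "s3:GetObject",
    ("s3", "put_object"): "s3:PutObject",
    ("dynamodb", "get_item"): "dynamodb:GetItem",
}

def scan_actions(source):
    """Find method calls like `s3.get_object(` and map them to IAM actions."""
    actions = set()
    for service, method in re.findall(r"(\w+)\.(\w+)\(", source):
        action = CALL_TO_ACTION.get((service, method))
        if action:
            actions.add(action)
    return sorted(actions)

def baseline_policy(source):
    # Emit a baseline identity-based policy; Resource would be narrowed later
    return {"Version": "2012-10-17",
            "Statement": [{"Effect": "Allow",
                           "Action": scan_actions(source),
                           "Resource": "*"}]}

code = "s3.put_object(Bucket=b, Key=k, Body=data)\ns3.get_object(Bucket=b, Key=k)"
policy = baseline_policy(code)
```

The resulting baseline intentionally starts broad (`Resource: "*"`) so it can be refined as the application evolves, matching the workflow the entry describes.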

Docker MCP Server

Enables AI assistants like Claude to manage Docker containers, images, and Docker Compose deployments through the Model Context Protocol. Provides secure container lifecycle management, image operations, and multi-host Docker server connections.