Discover Awesome MCP Servers

Extend your agent with 26,560 capabilities via MCP servers.

MCP MySQL Server

Enables AI assistants to safely query MySQL databases with read-only access by default, supporting table listing, structure inspection, and SQL queries with optional write operation control.

Kaggle NodeJS MCP Server

OSRS-STAT

A Model Context Protocol (MCP) server that provides real-time player statistics and ranking data for Old School RuneScape, supporting multiple game modes and player comparisons.

Agent Interviews

Semantic Scholar MCP Server

Provides comprehensive access to the Semantic Scholar API, including academic paper data, author information, and citation networks.

Todoist Meeting MCP

Connects Claude to Todoist for transforming meeting notes into actionable tasks with inferred due dates and priorities. It enables full task lifecycle management, including creating subtasks, listing projects, and completing tasks through natural language.

MCP demo (DeepSeek as Client's LLM)

Here's how to run a minimal client-server demo that uses the Model Context Protocol (MCP) pattern with the DeepSeek API. This is a simplified example to illustrate the basic concepts; a real-world application would be more complex.

**Important Considerations:**

* **DeepSeek API Key:** You'll need a DeepSeek API key to access their models. Make sure you have one and that it's properly configured in your environment. Refer to the DeepSeek documentation for how to obtain and use your API key.
* **Python:** This example uses Python. Make sure you have Python 3.6 or later installed.
* **Libraries:** You'll need the `requests` library for making HTTP requests to the DeepSeek API, and `Flask` for a simple server.

```bash
pip install requests Flask
```

**Conceptual Overview:**

1. **Client:** The client sends a request (e.g., a text prompt) to the server.
2. **Server:** The server receives the request, calls the DeepSeek API with the prompt, gets the response from DeepSeek, and sends the response back to the client.
3. **Transport (simplified):** This example uses HTTP as a simple stand-in for the MCP transport. The client sends an HTTP request, and the server sends an HTTP response. A more robust implementation might use a dedicated messaging queue or another communication mechanism.

**Code Example (Python):**

**1. Server (server.py):**

```python
from flask import Flask, request, jsonify
import requests
import os

app = Flask(__name__)

# Get the API key from an environment variable
DEEPSEEK_API_KEY = os.environ.get("DEEPSEEK_API_KEY")
DEEPSEEK_API_URL = "https://api.deepseek.com/v1/chat/completions"  # Replace with the correct DeepSeek API endpoint

@app.route('/deepseek', methods=['POST'])
def deepseek_request():
    try:
        data = request.get_json()
        prompt = data.get('prompt')
        if not prompt:
            return jsonify({'error': 'No prompt provided'}), 400

        # Construct the DeepSeek API request
        headers = {
            "Content-Type": "application/json",
            "Authorization": f"Bearer {DEEPSEEK_API_KEY}"
        }
        payload = {
            "model": "deepseek-chat",  # Or the specific model you want to use
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 150  # Adjust as needed
        }

        # Make the request to the DeepSeek API
        response = requests.post(DEEPSEEK_API_URL, headers=headers, json=payload)
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
        deepseek_data = response.json()

        # Extract the response from DeepSeek (adjust based on the API's response format)
        try:
            answer = deepseek_data['choices'][0]['message']['content']
        except (KeyError, IndexError) as e:
            print(f"Error extracting content from DeepSeek response: {e}")
            print(f"DeepSeek Response: {deepseek_data}")
            return jsonify({'error': 'Error processing DeepSeek response'}), 500

        return jsonify({'response': answer})

    except requests.exceptions.RequestException as e:
        print(f"Error communicating with DeepSeek API: {e}")
        return jsonify({'error': f'Error communicating with DeepSeek API: {e}'}), 500
    except Exception as e:
        print(f"An unexpected error occurred: {e}")
        return jsonify({'error': f'An unexpected error occurred: {e}'}), 500

if __name__ == '__main__':
    app.run(debug=True, port=5000)  # Run the server on port 5000
```

**2. Client (client.py):**

```python
import requests
import json

SERVER_URL = "http://localhost:5000/deepseek"  # Adjust if your server is running elsewhere

def send_request(prompt):
    try:
        payload = {'prompt': prompt}
        headers = {'Content-Type': 'application/json'}
        response = requests.post(SERVER_URL, data=json.dumps(payload), headers=headers)
        response.raise_for_status()  # Raise HTTPError for bad responses
        data = response.json()
        return data.get('response')
    except requests.exceptions.RequestException as e:
        print(f"Error connecting to the server: {e}")
        return None
    except Exception as e:
        print(f"An unexpected error occurred: {e}")
        return None

if __name__ == '__main__':
    prompt = "What is the capital of France?"
    response = send_request(prompt)
    if response:
        print(f"DeepSeek's Response: {response}")
    else:
        print("Failed to get a response from the server.")
```

**How to Run:**

1. **Set your API key.** Make sure the `DEEPSEEK_API_KEY` environment variable is set. On Linux/macOS:

   ```bash
   export DEEPSEEK_API_KEY="YOUR_DEEPSEEK_API_KEY"
   ```

   On Windows:

   ```bash
   set DEEPSEEK_API_KEY=YOUR_DEEPSEEK_API_KEY
   ```

2. **Start the server.** Open a terminal and run:

   ```bash
   python server.py
   ```

   The server starts and listens on port 5000.

3. **Run the client.** Open another terminal and run:

   ```bash
   python client.py
   ```

   The client sends the prompt to the server, the server calls the DeepSeek API, and the client prints the response.

**Explanation:**

* **Server (server.py):**
  * Uses Flask to create a simple web server; the `/deepseek` route handles POST requests.
  * Extracts the prompt from the request body.
  * Constructs a request to the DeepSeek API, including your API key and the prompt, and sends it using the `requests` library.
  * Parses the DeepSeek response and extracts the generated text. **Important:** the exact structure of the DeepSeek API response may vary, so adjust the code accordingly; refer to the DeepSeek API documentation.
  * Sends the response back to the client as a JSON object.
  * Includes error handling for API request failures and other potential issues.
* **Client (client.py):**
  * Sends a POST request to the server's `/deepseek` endpoint with the prompt in the request body.
  * Receives the response from the server and prints the generated text.
  * Includes basic error handling.

**Important Notes and Improvements:**

* **Error Handling:** The error handling in this example is basic. Add more robust handling for network errors, API errors, and invalid responses.
* **API Key Security:** Storing your API key directly in the code is not recommended for production. Use environment variables or a more secure secret-management method; the example reads the key via `os.environ.get("DEEPSEEK_API_KEY")`, which is better practice.
* **Asynchronous Communication:** For more complex applications, consider asynchronous communication (e.g., `asyncio` in Python) to improve performance and responsiveness.
* **MCP Implementation:** This example uses HTTP as a simplified transport. For a more robust implementation, you could use a message queue (e.g., RabbitMQ, Kafka) or a dedicated messaging library.
* **DeepSeek API Documentation:** Always refer to the official DeepSeek API documentation for the most up-to-date endpoints, request parameters, and response formats.
* **Model Selection:** The `model` parameter in the request specifies which DeepSeek model to use; choose the one appropriate to your needs.
* **Token Limits:** Be aware of the DeepSeek API's token limits; the `max_tokens` parameter caps the length of the generated response.
* **Rate Limiting:** The DeepSeek API may enforce rate limits. Implement appropriate rate limiting in your code to avoid exceeding them.
* **Flask Debug Mode:** `debug=True` in `app.run()` is useful for development but should be disabled in production.

This provides a starting point for building a client-server application on top of the DeepSeek API. Adapt the code to your specific requirements and consult the DeepSeek API documentation for the most accurate information.
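The response-extraction step in server.py can be exercised in isolation, without a running server or an API key. The snippet below is a minimal sketch that assumes the `choices[0].message.content` shape shown above actually matches what DeepSeek returns; verify the real format against the official documentation.

```python
import json

def extract_answer(raw: str) -> str:
    """Pull the generated text out of a chat-completion-style JSON body,
    the same way server.py does."""
    data = json.loads(raw)
    return data["choices"][0]["message"]["content"]

# A hypothetical response body in the shape server.py expects.
sample = json.dumps({
    "choices": [{"message": {"role": "assistant", "content": "Paris"}}]
})

print(extract_answer(sample))  # Paris
```

Factoring the extraction into a small pure function like this makes it easy to adapt when the real response format differs from the assumed one.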

Synology Download Station MCP Server

A Model Context Protocol server that enables AI assistants to manage downloads, search for torrents, and monitor download statistics on a Synology NAS.

Oracle HCM Cloud MCP Server by CData

heap-seance

MCP server that provides 8 tools for Java memory leak investigation:

- Class histograms, GC pressure snapshots, JFR recordings, heap dumps, MAT leak suspects analysis, and async-profiler allocation profiles
- Structured confidence-based verdicts (none/low/medium/high) requiring independent signal corroboration
- Designed for use inside Claude Code with two slash commands

ServiceDesk Plus MCP Server

A Model Context Protocol server for integrating with ServiceDesk Plus On-Premise that provides comprehensive CMDB functionality, allowing users to manage tickets, assets, software licenses, contracts, vendors, and administrative settings through natural language.

Harvest MCP Server

Provides MCP integration for Harvest's time tracking, project management, and invoicing functionality, enabling natural language interaction with Harvest API through tools for managing clients, time entries, projects, tasks, and users.

Swagger to MCP

Automatically converts Swagger/OpenAPI specifications into dynamic MCP tools, enabling interaction with any REST API through natural language by loading specs from local files or URLs.

claude-peers

Enables discovery and instant communication between multiple local Claude Code instances running across different projects. It allows agents to list active peers, share work summaries, and send messages through a local broker daemon.

SkyeNet-MCP-ACE

Enables AI agents to execute server-side JavaScript and perform CRUD operations directly on ServiceNow instances with context bloat reduction features for efficient token usage.

mcp-shell

Give hands to AI. MCP server to run shell commands securely, auditably, and on demand.

Layout Detector MCP

Analyzes webpage screenshots to extract precise layout information by locating image assets and calculating spatial relationships, enabling AI assistants to accurately recreate layouts with proper semantic structure using computer vision.

rxjs-mcp-server

Enables executing, debugging, and visualizing RxJS streams directly from AI assistants like Claude.

Agent Progress Tracker MCP Server

Enables AI agents to track, search, and retrieve their progress across projects with persistent memory using SQLite storage and LLM-powered summarization. Supports logging completed work, searching previous entries, and retrieving context for multi-step or multi-agent workflows.

YouTube MCP Server

Enables YouTube content browsing, video searching, and metadata retrieval via the YouTube Data API v3. It also facilitates fetching video transcripts for summarization and analysis within MCP-compatible AI clients.

Access Context Manager API MCP Server

An MCP server that provides access to Google's Access Context Manager API, enabling management of service perimeters and access levels through natural language.

OneTech MCP Server

Enables AI assistants to extract and document Mendix Studio Pro modules by interrogating local .mpr files. Generates comprehensive JSON documentation of domain models, pages, microflows, and enumerations without sending data to the cloud.

Reddit MCP Server

Provides access to Reddit's API for retrieving posts, comments, user information, and search functionality. Supports multiple authentication methods and comprehensive Reddit data operations including subreddit browsing, post retrieval, and user profile access.

Data Visualization MCP Server

KDB MCP Service

Enables AI agents to interact with KDB+ databases through standardized MCP tools, supporting full CRUD operations, schema introspection, and multi-database connections with connection pooling for efficient time-series and financial data management.

Godot MCP

Provides a comprehensive integration between LLMs and the Godot Engine, enabling AI assistants to intelligently manipulate project files, scripts, and the live editor. It supports advanced workflows including version-aware documentation querying, automated E2E game testing, and real-time visual context capture.

GitPilot MCP

A lightweight MCP server that enables AI assistants to manage local Git repositories by executing commands like status, add, and commit. It streamlines development workflows by providing repository context and diffs directly to the assistant.

go-mcp-server-mds

A Go implementation of a Model Context Protocol (MCP) server that serves Markdown files with frontmatter support from a file system.

MCP Mediator

A Java implementation of a Model Context Protocol (MCP) mediator, providing seamless integration between MCP clients and servers.

Yahoo Finance MCP Server

Provides real-time stock quotes, historical price data, financial news, and multi-stock comparisons using Yahoo Finance data. Enables users to access comprehensive financial market information through natural language queries.