Discover Awesome MCP Servers
Extend your agent with 26,882 capabilities via MCP servers.
- All (26,882)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
Time MCP Server
Provides current time information and timezone conversion capabilities using IANA timezone names and automatic system detection. It enables LLMs to fetch current times across different regions and convert specific times between timezones.
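As a rough illustration of the kind of conversion such a server performs, here is a minimal sketch using Python's standard `zoneinfo` module with IANA timezone names (generic stdlib usage, not this server's actual API):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

# Interpret a wall-clock time in one IANA timezone...
ny = datetime(2024, 6, 1, 9, 30, tzinfo=ZoneInfo("America/New_York"))
# ...and convert it to another region's local time.
tokyo = ny.astimezone(ZoneInfo("Asia/Tokyo"))

print(tokyo.isoformat())  # 2024-06-01T22:30:00+09:00
```

On that date New York observes EDT (UTC-4), so 09:30 there is 22:30 in Tokyo (UTC+9); `zoneinfo` handles the DST arithmetic from the IANA database.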
ZIP MCP Server
Provides tools for AI assistants to compress, decompress, and manage ZIP archives including metadata retrieval. It supports directory compression, password protection, and configurable extraction options.
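For a sense of the underlying operations such a server wraps, here is a minimal sketch using Python's standard `zipfile` module (generic stdlib usage, not this server's API; note the stdlib can read but not create password-protected archives):

```python
import os
import tempfile
import zipfile

# Work in a throwaway directory so the example is self-contained.
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "notes.txt")
    with open(src, "w") as f:
        f.write("hello zip")

    # Compress a file into a ZIP archive.
    archive = os.path.join(tmp, "out.zip")
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(src, arcname="notes.txt")

    # Retrieve metadata, then extract with a configurable destination.
    with zipfile.ZipFile(archive) as zf:
        names = zf.namelist()  # archive metadata: member names
        zf.extractall(os.path.join(tmp, "unpacked"))
        with open(os.path.join(tmp, "unpacked", "notes.txt")) as f:
            restored = f.read()

print(names, restored)
```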
s-GitHubTestRepo-HJA3
Created from an MCP server demo.
MCP Server Tester
A web app to test MCP servers using an installation code from Smithery.
Open Brain MCP Server
A personal semantic knowledge base that enables storing, searching, and retrieving memories and work history using natural language. It features vector-based search, tool discovery via a registry, and indexing of Cursor agent transcripts using Supabase or Postgres.
Multi-Capability Proxy Server
A Flask-based server that hosts multiple tools, each exposing functionalities by calling external REST APIs through a unified interface.
SEQ MCP Server
Enables LLMs to query and analyze logs from SEQ structured logging server with capabilities for searching events, retrieving event details, analyzing log patterns, and accessing saved searches.
Synology Download Station MCP Server
A Model Context Protocol server that enables AI assistants to manage downloads, search for torrents, and monitor download statistics on a Synology NAS.
Oracle HCM Cloud MCP Server by CData
Vertica MCP Server
Enables AI assistants to query and explore Vertica databases through natural language with readonly protection by default. Supports SQL execution, schema discovery, large dataset streaming, and Vertica-specific optimizations like projection awareness.
orchex
Autopilot AI orchestration across six LLMs: describe what you want, and Orchex auto-plans, parallelizes, self-heals, and routes execution safely.
NIX MCP Server
Enables AI-powered blockchain data queries and analysis through the Native Indexer (NIX) system. Supports querying blocks, transactions, account information, and network status across various blockchain networks.
ServiceDesk Plus MCP Server
A Model Context Protocol server for integrating with ServiceDesk Plus On-Premise that provides comprehensive CMDB functionality, allowing users to manage tickets, assets, software licenses, contracts, vendors, and administrative settings through natural language.
Swagger to MCP
Automatically converts Swagger/OpenAPI specifications into dynamic MCP tools, enabling interaction with any REST API through natural language by loading specs from local files or URLs.
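As a rough sketch of the idea, each path/method pair in an OpenAPI document can be enumerated as a candidate tool. The example below uses a hypothetical minimal spec and is illustrative only, not this project's actual conversion logic:

```python
# Hypothetical minimal OpenAPI spec; real specs are loaded from a file or URL.
spec = {
    "paths": {
        "/pets": {
            "get": {"operationId": "listPets", "summary": "List all pets"},
            "post": {"operationId": "createPet", "summary": "Create a pet"},
        },
        "/pets/{petId}": {
            "get": {"operationId": "getPet", "summary": "Get a pet by ID"},
        },
    }
}

def spec_to_tools(spec):
    """Flatten path/method pairs into (name, description) tool candidates."""
    tools = []
    for path, methods in spec["paths"].items():
        for method, op in methods.items():
            name = op.get("operationId", f"{method}_{path}")
            tools.append((name, f"{method.upper()} {path}: {op.get('summary', '')}"))
    return tools

for name, desc in spec_to_tools(spec):
    print(name, "->", desc)
```

A real converter would also map each operation's parameters and request body schema onto the tool's input schema; this sketch only shows the enumeration step.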
claude-peers
Enables discovery and instant communication between multiple local Claude Code instances running across different projects. It allows agents to list active peers, share work summaries, and send messages through a local broker daemon.
Lotus MCP
Enables creation of reusable browser automation skills through demonstration by recording user actions in a browser while narrating, then converting those workflows into executable skills that can be invoked through natural language.
e代驾 MCP Server
A service that provides complete driver-for-hire functionality based on the e代驾 open APIs, enabling users to order drivers, calculate pricing, and create and track orders.
Apple Doc MCP
A Model Context Protocol server that provides AI coding assistants with direct access to Apple's Developer Documentation, enabling seamless lookup of frameworks, symbols, and detailed API references.
Dooray MCP Server
Enables interaction with Dooray's task and calendar management system, allowing users to filter and list tasks, retrieve details, and manage task comments. It provides a set of tools for seamless integration with MCP-compatible clients like Claude Desktop and Cursor.
YouTube Transcript MCP
Enables AI models to extract transcripts from YouTube videos in multiple languages with zero local setup. It supports all YouTube URL formats and features smart caching via Cloudflare Workers for fast responses.
Tanda Workforce MCP Server
Integrates Tanda Workforce API with AI assistants to manage employee schedules, timesheets, leave requests, clock in/out operations, and workforce analytics through natural language with OAuth2 authentication.
BloodHound MCP Server
Enables security professionals to query and analyze Active Directory attack paths from BloodHound Community Edition data using natural language through Claude Desktop's Model Context Protocol interface.
Kaggle NodeJS MCP Server
MCP Todo List Manager
Enables natural language todo list management through Claude Desktop with YAML-based persistence. Supports creating, completing, deleting, and listing todo items with automatic timestamp tracking and secure file permissions.
K8s MCP Server
K8s-mcp-server is a Model Context Protocol (MCP) server that lets AI assistants such as Claude execute Kubernetes commands safely. It bridges language models and essential Kubernetes CLI tools, including kubectl, helm, istioctl, and argocd, enabling AI systems to assist with cluster management, troubleshooting, and deployments.
qmcp
An MCP server that enables AI assistants to interact with q/kdb+ databases for development and debugging workflows. It supports executing queries, persistent connection management, and includes a Qython translator for converting Python-like syntax to q.
OSRS-STAT
A Model Context Protocol (MCP) server that provides real-time Old School RuneScape player statistics and rankings, supporting multiple game modes and player comparisons.
Agent Interviews
Semantic Scholar MCP Server
Provides comprehensive access to the Semantic Scholar API, including academic paper data, author information, and citation networks.
MCP demo (DeepSeek as Client's LLM)
Here's how to run a minimal client-server demo illustrating MCP (Model Context Protocol) ideas with the DeepSeek API. This is a simplified example to show the basic concepts; a real-world application would be more complex.

**Important Considerations:**

* **DeepSeek API Key:** You'll need a DeepSeek API key to access their models. Make sure you have one and that it's properly configured in your environment; refer to the DeepSeek documentation for how to obtain and use it.
* **Python:** This example uses Python. Make sure you have Python 3.6 or later installed.
* **Libraries:** You'll need the `requests` library for making HTTP requests to the DeepSeek API, and `Flask` for a simple server:

```bash
pip install requests Flask
```

**Conceptual Overview:**

1. **Client:** The client sends a request (e.g., a text prompt) to the server.
2. **Server:** The server receives the request, calls the DeepSeek API with the prompt, gets the response from DeepSeek, and sends it back to the client.
3. **MCP (simplified):** In this example, plain HTTP stands in for the protocol: the client sends an HTTP request and the server sends an HTTP response. A more robust implementation would use a real MCP transport or a dedicated messaging mechanism.

**Code Example (Python):**

**1. Server (server.py):**

```python
from flask import Flask, request, jsonify
import requests
import os

app = Flask(__name__)

# Read the DeepSeek API key from an environment variable
DEEPSEEK_API_KEY = os.environ.get("DEEPSEEK_API_KEY")
DEEPSEEK_API_URL = "https://api.deepseek.com/v1/chat/completions"  # verify against the DeepSeek docs

@app.route('/deepseek', methods=['POST'])
def deepseek_request():
    try:
        data = request.get_json()
        prompt = data.get('prompt')
        if not prompt:
            return jsonify({'error': 'No prompt provided'}), 400

        # Construct the DeepSeek API request
        headers = {
            "Content-Type": "application/json",
            "Authorization": f"Bearer {DEEPSEEK_API_KEY}"
        }
        payload = {
            "model": "deepseek-chat",  # or the specific model you want to use
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 150  # adjust as needed
        }

        # Make the request to the DeepSeek API
        response = requests.post(DEEPSEEK_API_URL, headers=headers, json=payload)
        response.raise_for_status()  # raise HTTPError for bad responses (4xx or 5xx)
        deepseek_data = response.json()

        # Extract the generated text (adjust based on the API's response format)
        try:
            answer = deepseek_data['choices'][0]['message']['content']
        except (KeyError, IndexError) as e:
            print(f"Error extracting content from DeepSeek response: {e}")
            print(f"DeepSeek Response: {deepseek_data}")
            return jsonify({'error': 'Error processing DeepSeek response'}), 500

        return jsonify({'response': answer})

    except requests.exceptions.RequestException as e:
        print(f"Error communicating with DeepSeek API: {e}")
        return jsonify({'error': f'Error communicating with DeepSeek API: {e}'}), 500
    except Exception as e:
        print(f"An unexpected error occurred: {e}")
        return jsonify({'error': f'An unexpected error occurred: {e}'}), 500

if __name__ == '__main__':
    app.run(debug=True, port=5000)  # run the server on port 5000
```

**2. Client (client.py):**

```python
import requests
import json

SERVER_URL = "http://localhost:5000/deepseek"  # adjust if your server runs elsewhere

def send_request(prompt):
    try:
        payload = {'prompt': prompt}
        headers = {'Content-Type': 'application/json'}
        response = requests.post(SERVER_URL, data=json.dumps(payload), headers=headers)
        response.raise_for_status()  # raise HTTPError for bad responses
        data = response.json()
        return data.get('response')
    except requests.exceptions.RequestException as e:
        print(f"Error connecting to the server: {e}")
        return None
    except Exception as e:
        print(f"An unexpected error occurred: {e}")
        return None

if __name__ == '__main__':
    prompt = "What is the capital of France?"
    response = send_request(prompt)
    if response:
        print(f"DeepSeek's Response: {response}")
    else:
        print("Failed to get a response from the server.")
```

**How to Run:**

1. **Set your API key.** Make sure the `DEEPSEEK_API_KEY` environment variable is set. On Linux/macOS:

```bash
export DEEPSEEK_API_KEY="YOUR_DEEPSEEK_API_KEY"
```

On Windows:

```bash
set DEEPSEEK_API_KEY=YOUR_DEEPSEEK_API_KEY
```

2. **Start the server.** In one terminal:

```bash
python server.py
```

The server starts and listens on port 5000.

3. **Run the client.** In another terminal:

```bash
python client.py
```

The client sends the prompt to the server, the server calls the DeepSeek API, and the client prints the response.

**Explanation:**

* **Server (server.py):**
  * Uses Flask to create a simple web server; the `/deepseek` route handles POST requests.
  * Extracts the prompt from the request body and constructs a DeepSeek API request that includes your API key and the prompt.
  * Sends the request with the `requests` library, then parses the response and extracts the generated text. **Important:** the exact structure of the DeepSeek API response may vary, so adjust the code accordingly; refer to the DeepSeek API documentation.
  * Returns the result to the client as a JSON object, with error handling for API request failures and other issues.
* **Client (client.py):**
  * Sends a POST request to the server's `/deepseek` endpoint with the prompt in the request body.
  * Receives the response from the server and prints the generated text, with basic error handling.

**Important Notes and Improvements:**

* **Error Handling:** The error handling here is basic. Add more robust handling for network errors, API errors, and invalid responses.
* **API Key Security:** Never hard-code your API key in production; the example's use of `os.environ.get("DEEPSEEK_API_KEY")` is the better practice.
* **Asynchronous Communication:** For more complex applications, consider asynchronous communication (e.g., `asyncio`) to improve performance and responsiveness.
* **MCP Implementation:** This example uses HTTP as a simplified stand-in. A more robust implementation could use a message queue (e.g., RabbitMQ, Kafka) or a dedicated messaging library.
* **DeepSeek API Documentation:** Always refer to the official documentation for up-to-date endpoints, request parameters, and response formats.
* **Model Selection:** The `model` parameter specifies which model to use; choose one appropriate to your needs.
* **Token Limits:** Be aware of the API's token limits; `max_tokens` caps the length of the generated response.
* **Rate Limiting:** The API may enforce rate limits; implement client-side rate limiting to avoid exceeding them.
* **Flask Debug Mode:** `debug=True` in `app.run()` is useful for development but should be disabled in production.

This provides a starting point for building a client-server application with the DeepSeek API. Adapt the code to your requirements and consult the DeepSeek API documentation for accurate details.
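The response-parsing step in server.py can be exercised without network access. The sketch below uses a sample payload that mimics the chat-completion shape the server expects (it is illustrative, not a captured DeepSeek response) to show the `choices[0]['message']['content']` extraction in isolation:

```python
# Sample payload shaped like a chat-completion response; illustrative only.
sample = {
    "choices": [
        {"message": {"role": "assistant", "content": "Paris is the capital of France."}}
    ]
}

def extract_answer(deepseek_data):
    """Return the generated text, or None if the payload shape is unexpected."""
    try:
        return deepseek_data["choices"][0]["message"]["content"]
    except (KeyError, IndexError, TypeError):
        return None

print(extract_answer(sample))           # the sample answer text
print(extract_answer({"choices": []}))  # None: empty choices list
```

Testing this extraction in isolation makes it easy to verify the error path that returns HTTP 500 in the server without ever calling the real API.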