Discover Awesome MCP Servers
Extend your agent with 17,252 capabilities via MCP servers.
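Every entry below is a Model Context Protocol server that an agent can connect to over stdio or HTTP. As a quick orientation, here is a minimal sketch of discovering a server's tools with the official `mcp` Python SDK; the `npx` command and package name are illustrative and stand in for any stdio-based server from this directory.

```python
# list_tools.py - minimal MCP client sketch (assumes the official `mcp` Python SDK).
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Illustrative server launch command; substitute any stdio-based server listed here.
server_params = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-everything"],
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()          # MCP handshake
            tools = await session.list_tools()  # discover the server's tools
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```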
- All (17,252)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
Oracle HCM Cloud MCP Server by CData
NIX MCP Server
Enables AI-powered blockchain data queries and analysis through the Native Indexer (NIX) system. Supports querying blocks, transactions, account information, and network status across various blockchain networks.
MCP with RAG Demo
This demo project shows how to implement a Model Context Protocol (MCP) server with Retrieval-Augmented Generation (RAG) capabilities. It lets an AI model interact with a knowledge base, search for information, and add new documents.
Swagger to MCP
Automatically converts Swagger/OpenAPI specifications into dynamic MCP tools, enabling interaction with any REST API through natural language by loading specs from local files or URLs.
e代驾 MCP Server
A service that provides complete driver-for-hire functionality based on the e代驾 open APIs, enabling users to order drivers, calculate pricing, and create and track orders.
AWS Amplify Gen 2 Documentation MCP Server
This MCP server provides tools to access AWS Amplify Gen 2 documentation and search its content (unofficial).
MCP Trino Server
A Model Context Protocol server that provides seamless integration with Trino and Iceberg, enabling data exploration, querying, and table maintenance through a standard interface.
BRAINS OS - version MCP
A serverless MCP implementation built with SST, React, and AWS.
Luma Events MCP Server
Enables users to search and discover upcoming tech events, conferences, and meetups from Luma (lu.ma) through natural language queries. Built on Cloudflare Workers for fast, global access to tech event data.
MCPizza
An MCP server that allows AI assistants to order Domino's Pizza through an unofficial API, with features for store location, menu browsing, and order management.
MCP demo (DeepSeek as Client's LLM)
A minimal client-server demo that uses the DeepSeek API as the client's LLM. The outline below covers the core concepts and example code to adapt and run yourself.

**Before You Start**

* **DeepSeek API key:** You need a valid API key from the DeepSeek platform. Keep it secure and do not hardcode it in your scripts; use an environment variable or a configuration file.
* **Python environment:** Python 3.7+ is assumed.
* **Libraries:** Install `requests` for HTTP calls to the DeepSeek API (`pip install requests`), plus `Flask` (or `FastAPI`) for the simple server.

**Conceptual Overview**

1. **Client:** Sends a request containing a prompt to the server.
2. **Server:** Receives the request, calls the DeepSeek API with the prompt, and returns DeepSeek's response to the client.
3. **DeepSeek API:** The external service that performs the language model inference.

**1. Server (Flask)**

```python
# server.py
from flask import Flask, request, jsonify
import requests
import os

app = Flask(__name__)

# Read the DeepSeek API key from an environment variable; never hardcode it in the script.
DEEPSEEK_API_KEY = os.environ.get("DEEPSEEK_API_KEY")
DEEPSEEK_API_URL = "https://api.deepseek.com/v1/chat/completions"  # Replace if different

@app.route('/generate', methods=['POST'])
def generate_text():
    try:
        data = request.get_json()
        prompt = data.get('prompt')
        if not prompt:
            return jsonify({'error': 'Prompt is required'}), 400

        headers = {
            'Content-Type': 'application/json',
            'Authorization': f'Bearer {DEEPSEEK_API_KEY}'
        }
        payload = {
            "model": "deepseek-chat",                            # Or another DeepSeek model
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 200,                                   # Adjust as needed
            "temperature": 0.7                                   # Adjust as needed
        }

        response = requests.post(DEEPSEEK_API_URL, headers=headers, json=payload)
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)

        deepseek_data = response.json()
        generated_text = deepseek_data['choices'][0]['message']['content']
        return jsonify({'generated_text': generated_text})

    except requests.exceptions.RequestException as e:
        print(f"API Request Error: {e}")
        return jsonify({'error': f'API Request Error: {e}'}), 500
    except Exception as e:
        print(f"Server Error: {e}")
        return jsonify({'error': f'Server Error: {e}'}), 500

if __name__ == '__main__':
    app.run(debug=True, port=5000)  # Or any port you prefer
```

The `/generate` route extracts the `prompt` from the JSON request body, builds the DeepSeek request (model name, messages, `max_tokens`, `temperature`) with an `Authorization: Bearer` header, sends it with `requests.post()`, and returns the generated text to the client. The code assumes a response shaped like `deepseek_data['choices'][0]['message']['content']`; adjust this if the actual DeepSeek API response format differs. The `try...except` blocks catch request and server errors and return them to the client as JSON.

**2. Client (Python)**

```python
# client.py
import requests
import json

SERVER_URL = "http://localhost:5000/generate"  # Adjust if the server runs on a different address/port

def generate_text(prompt):
    try:
        payload = {'prompt': prompt}
        headers = {'Content-Type': 'application/json'}
        response = requests.post(SERVER_URL, headers=headers, data=json.dumps(payload))
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
        data = response.json()
        return data.get('generated_text')
    except requests.exceptions.RequestException as e:
        print(f"Request Error: {e}")
        return None
    except Exception as e:
        print(f"Error: {e}")
        return None

if __name__ == '__main__':
    user_prompt = "Write a short story about a cat who goes on an adventure."
    generated_text = generate_text(user_prompt)
    if generated_text:
        print("Generated Text:")
        print(generated_text)
    else:
        print("Failed to generate text.")
```

The client posts the prompt as JSON to the server's `/generate` endpoint, handles connection errors, and prints the `generated_text` field of the response.

**3. Running the Demo**

1. **Set the API key** (replace `YOUR_DEEPSEEK_API_KEY` with your actual key):
   * Linux/macOS: `export DEEPSEEK_API_KEY="YOUR_DEEPSEEK_API_KEY"`
   * Windows (Command Prompt): `set DEEPSEEK_API_KEY=YOUR_DEEPSEEK_API_KEY`
   * Windows (PowerShell): `$env:DEEPSEEK_API_KEY="YOUR_DEEPSEEK_API_KEY"`
2. **Run the server:** in one terminal, run `python server.py`. The Flask development server starts on port 5000.
3. **Run the client:** in another terminal, run `python client.py`. The client sends a request to the server, the server calls the DeepSeek API, and the generated text is printed to the client's console.

**Notes and Troubleshooting**

* **API key:** An incorrect key or a missing environment variable causes an authentication error.
* **Network connectivity:** The server needs internet access to reach the DeepSeek API.
* **Response format:** If the DeepSeek API changes its response format, update the parsing code; refer to the DeepSeek API documentation.
* **Rate limits:** Too many requests in a short period may be rate-limited; add error handling and retry logic (a sketch follows at the end of this entry).
* **Security:** For production, use a proper WSGI server (Gunicorn or uWSGI) instead of the Flask development server, and use HTTPS between client and server.
* **Model selection:** The code uses `deepseek-chat`; check the DeepSeek API documentation for other available models.
* **Prompt engineering:** Output quality depends heavily on the prompt; experiment to get the best results.

**Key Phrases in Simplified Chinese**

提示 (prompt) · 生成的文本 (generated text) · API 密钥 (API key) · 服务器 (server) · 客户端 (client) · 错误 (error) · 请求 (request) · 响应 (response) · 身份验证 (authentication) · 速率限制 (rate limit)
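The rate-limit note above suggests adding retry logic. Here is a minimal sketch, assuming the same endpoint, headers, and payload as `server.py`; the function name and backoff values are illustrative.

```python
# retry_deepseek.py - retry the DeepSeek call when the API returns HTTP 429 (rate limited).
import time
import requests

def call_deepseek_with_retry(url, headers, payload, max_retries=3):
    for attempt in range(max_retries):
        response = requests.post(url, headers=headers, json=payload)
        if response.status_code == 429:
            # Rate limited: back off exponentially (1s, 2s, 4s) and try again.
            time.sleep(2 ** attempt)
            continue
        response.raise_for_status()  # Other HTTP errors propagate to the caller
        return response.json()
    raise RuntimeError("DeepSeek API still rate-limited after retries")
```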
Apple Doc MCP
A Model Context Protocol server that provides AI coding assistants with direct access to Apple's Developer Documentation, enabling seamless lookup of frameworks, symbols, and detailed API references.
MCP OpenNutrition
Provides access to a comprehensive food database with 300,000+ items, enabling nutritional data lookups, food searches, and barcode scanning with all processing happening locally for privacy and speed.
Integrator MCP Server
A Model Context Protocol server that allows AI assistants to invoke and interact with Integrator automation workflows through an API connection.
Pollinations Multimodal MCP Server
A Model Context Protocol server that enables AI assistants like Claude to generate images, text, and audio directly through Pollinations APIs using a lightweight stdio transport design.
icalPal MCP Server
Enables AI assistants to interact with macOS Calendar and Reminders applications. Allows querying events, tasks, calendars, and accounts through natural language using the icalPal Ruby gem.
Wake County Public Library
Enables searching the Wake County Public Library catalog and all NC Cardinal libraries, returning book details including title, author, format, availability status, and direct catalog links.
Banxico MCP Server
Enables access to Bank of Mexico (Banxico) economic data including real-time and historical USD/MXN exchange rates, inflation data, interest rates, and other financial indicators. Supports querying current rates, historical data with date ranges, and economic metadata through natural language.
Agentic Tools MCP Server
A Model Context Protocol server providing AI assistants with comprehensive project, task, and subtask management capabilities with project-specific storage.
Docker MCP Server
Enables AI assistants like Claude to manage Docker containers, images, and Docker Compose deployments through the Model Context Protocol. Provides secure container lifecycle management, image operations, and multi-host Docker server connections.
doit-mcp-server
An MCP server for doit (pydoit).
MCP MySQL Server
Enables AI assistants to safely query MySQL databases with read-only access by default, supporting table listing, structure inspection, and SQL queries with optional write operation control.
Outlook Calendar MCP Server
An MCP server for accessing Outlook Calendar events via its API.
Pokémon MCP Server
Enables interaction with live Pokémon data through PokeAPI, providing comprehensive Pokémon information, battle calculations, moveset validation, and team analysis. Supports searching Pokémon and moves, calculating stats, checking type effectiveness, and analyzing team synergies with in-memory caching for improved performance.
Minecraft Bedrock Education MCP
Enables controlling Minecraft Bedrock and Education Edition through natural language commands via WebSocket connection. Provides tools for player actions, world manipulation, building structures, camera control, and wiki integration for automated gameplay and educational scenarios.
Weather Java SSE Transport MCP Service
A Java Model Context Protocol SSE HTTP server implemented with Jetty.
BugcrowdMCP
A high-performance Model Context Protocol server that provides secure, tool-based access to the Bugcrowd API, allowing for natural language interaction with bug bounty programs through various AI agent platforms.
MotaWord MCP Server
This MCP server gives you full control over your translation projects from start to finish. You can check at any time what stage a project is in, whether it is being translated, reviewed, or completed.
ServiceDesk Plus MCP Server
A Model Context Protocol server for integrating with ServiceDesk Plus On-Premise that provides comprehensive CMDB functionality, allowing users to manage tickets, assets, software licenses, contracts, vendors, and administrative settings through natural language.
Harvest MCP Server
Provides MCP integration for Harvest's time tracking, project management, and invoicing functionality, enabling natural language interaction with Harvest API through tools for managing clients, time entries, projects, tasks, and users.