Discover Awesome MCP Servers

Extend your agent with 26,375 capabilities via MCP servers.

OSRS-STAT

A Model Context Protocol (MCP) server that provides real-time player statistics and ranking data for Old School RuneScape, supporting multiple game modes and player comparison functions.

Agent Interviews

Semantic Scholar MCP Server

Wraps the Semantic Scholar API, providing comprehensive access to academic paper data, author information, and citation networks.

Cortex Context MCP Adapter

Enables integration with Cortex Context services through the Model Context Protocol. Provides authenticated access to CortexGuardAI's context management capabilities for registered users.

Remote MCP Server (Authless)

A template for deploying MCP servers without authentication on Cloudflare Workers. Enables custom tool creation and integration with Claude Desktop and AI Playground through Server-Sent Events.

MCP demo (DeepSeek as Client's LLM)

Okay, I can help you outline the steps to run a minimal client-server demo using the DeepSeek API, focusing on the core concepts and providing example code snippets. Since I can't directly execute code or set up environments, I'll give you the instructions and code you'll need to adapt and run yourself.

**Important Considerations Before You Start:**

* **DeepSeek API Key:** You'll need a valid DeepSeek API key. Obtain one from the DeepSeek AI platform. Keep it secure and don't hardcode it directly into your scripts (use environment variables or configuration files).
* **Python Environment:** I'll assume you're using Python. Make sure you have Python 3.7+ installed.
* **Libraries:** You'll need the `requests` library for making HTTP requests to the DeepSeek API. Install it with `pip install requests`. You might also want `Flask` or `FastAPI` for a simple server.

**Conceptual Overview**

1. **Client:** The client sends a request to the server. In this case, the request contains a prompt that you want DeepSeek to complete.
2. **Server:** The server receives the request from the client, calls the DeepSeek API with the prompt, gets the response from DeepSeek, and sends it back to the client.
3. **DeepSeek API:** The external service that performs the language model inference.

**Step-by-Step Instructions and Code Examples**

**1. Server (using Flask)**

```python
# server.py
from flask import Flask, request, jsonify
import requests
import os

app = Flask(__name__)

# Read the DeepSeek API key from an environment variable -- never hardcode it.
DEEPSEEK_API_KEY = os.environ.get("DEEPSEEK_API_KEY")
DEEPSEEK_API_URL = "https://api.deepseek.com/v1/chat/completions"  # Replace if different

@app.route('/generate', methods=['POST'])
def generate_text():
    try:
        data = request.get_json()
        prompt = data.get('prompt')
        if not prompt:
            return jsonify({'error': 'Prompt is required'}), 400

        headers = {
            'Content-Type': 'application/json',
            'Authorization': f'Bearer {DEEPSEEK_API_KEY}'
        }
        payload = {
            "model": "deepseek-chat",  # Or another DeepSeek model
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 200,   # Adjust as needed
            "temperature": 0.7   # Adjust as needed
        }

        response = requests.post(DEEPSEEK_API_URL, headers=headers, json=payload)
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)

        deepseek_data = response.json()
        generated_text = deepseek_data['choices'][0]['message']['content']
        return jsonify({'generated_text': generated_text})
    except requests.exceptions.RequestException as e:
        print(f"API Request Error: {e}")
        return jsonify({'error': f'API Request Error: {e}'}), 500
    except Exception as e:
        print(f"Server Error: {e}")
        return jsonify({'error': f'Server Error: {e}'}), 500

if __name__ == '__main__':
    app.run(debug=True, port=5000)  # Or any port you prefer
```

**Explanation of `server.py`:**

* **Imports:** Imports the necessary libraries (Flask, requests, os).
* **API Key:** Retrieves the DeepSeek API key from an environment variable. **Never hardcode your API key directly in the script!**
* **Flask App:** Creates a Flask web application.
* **`/generate` Route:** Defines a route that listens for POST requests at `/generate`.
* **Request Handling:**
  * Extracts the `prompt` from the JSON request body.
  * Constructs the headers for the DeepSeek API request, including the `Authorization` header with your API key.
  * Creates the JSON payload for the DeepSeek API request: the model name, the prompt (formatted as a message), and parameters like `max_tokens` and `temperature`.
  * Sends the request to the DeepSeek API using `requests.post()`.
  * Handles potential errors (e.g., network issues, invalid API key).
* **Response Handling:**
  * Parses the JSON response from the DeepSeek API and extracts the generated text. The code assumes a structure like `deepseek_data['choices'][0]['message']['content']`. **You might need to adjust this based on the actual DeepSeek API response format.**
  * Returns the generated text as a JSON response to the client.
* **Error Handling:** `try...except` blocks catch errors during the API request and server processing and return error messages to the client.
* **Running the App:** Starts the Flask development server.

**2. Client (using Python)**

```python
# client.py
import requests
import json

SERVER_URL = "http://localhost:5000/generate"  # Adjust if your server runs elsewhere

def generate_text(prompt):
    try:
        payload = {'prompt': prompt}
        headers = {'Content-Type': 'application/json'}
        response = requests.post(SERVER_URL, headers=headers, data=json.dumps(payload))
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
        data = response.json()
        return data.get('generated_text')
    except requests.exceptions.RequestException as e:
        print(f"Request Error: {e}")
        return None
    except Exception as e:
        print(f"Error: {e}")
        return None

if __name__ == '__main__':
    user_prompt = "Write a short story about a cat who goes on an adventure."
    generated_text = generate_text(user_prompt)
    if generated_text:
        print("Generated Text:")
        print(generated_text)
    else:
        print("Failed to generate text.")
```

**Explanation of `client.py`:**

* **Imports:** Imports the `requests` and `json` libraries.
* **`SERVER_URL`:** The URL of the server's `/generate` endpoint. Make sure this matches the address and port where your server is running.
* **`generate_text(prompt)` Function:**
  * Takes a `prompt` as input and builds the JSON payload to send to the server.
  * Sets the `Content-Type` header to `application/json`.
  * Sends a POST request to the server with `requests.post()` and handles potential errors (e.g., network issues, server not available).
  * Parses the JSON response and returns the `generated_text`.
* **Main Execution Block:** Sets a sample `user_prompt`, calls `generate_text()`, and prints the result to the console.

**3. Running the Demo**

1. **Set the API Key:** Before running anything, set the `DEEPSEEK_API_KEY` environment variable. How you do this depends on your operating system:

   * **Linux/macOS:**
     ```bash
     export DEEPSEEK_API_KEY="YOUR_DEEPSEEK_API_KEY"
     ```
   * **Windows (Command Prompt):**
     ```cmd
     set DEEPSEEK_API_KEY=YOUR_DEEPSEEK_API_KEY
     ```
   * **Windows (PowerShell):**
     ```powershell
     $env:DEEPSEEK_API_KEY="YOUR_DEEPSEEK_API_KEY"
     ```

   **Replace `YOUR_DEEPSEEK_API_KEY` with your actual API key.**

2. **Run the Server:** Open a terminal, navigate to the directory where you saved `server.py`, and run:
   ```bash
   python server.py
   ```
   The Flask development server will start, and you'll see output indicating that it's running.

3. **Run the Client:** Open another terminal, navigate to the directory where you saved `client.py`, and run:
   ```bash
   python client.py
   ```
   The client sends a request to the server, the server calls the DeepSeek API, and the generated text is printed to the client's console.

**Important Notes and Troubleshooting**

* **API Key:** Double-check that your API key is correct and that the environment variable is set; an incorrect key will cause an authentication error.
* **Network Connectivity:** Make sure your server has internet access to reach the DeepSeek API.
* **Error Messages:** Carefully examine any error messages you receive; they often provide clues about what's going wrong.
* **DeepSeek API Response Format:** The code assumes a specific response format. If the API changes its format, update the code accordingly; refer to the DeepSeek API documentation.
* **Rate Limits:** Be aware of the DeepSeek API's rate limits. If you send too many requests in a short period, you might get rate-limited; implement error handling and potentially retry logic.
* **Security:** For production environments, use a more robust web server (like Gunicorn or uWSGI) instead of the Flask development server, and consider HTTPS for secure communication between the client and server.
* **Model Selection:** The code uses `"deepseek-chat"`. Check the DeepSeek API documentation for other available models and their capabilities.
* **Prompt Engineering:** The quality of the generated text depends heavily on the prompt you provide. Experiment with different prompts to get the best results.
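The rate-limit note above suggests adding retry logic. A minimal sketch with exponential backoff follows, assuming the API signals rate limiting with HTTP 429; the helper names (`backoff_delays`, `post_with_retry`) are illustrative and not part of the original demo.

```python
# Hypothetical sketch: exponential backoff for rate-limited API calls.
# Assumes HTTP 429 indicates rate limiting; helper names are illustrative.
import time
import requests

def backoff_delays(max_retries, base_delay=1.0):
    """Seconds to wait before each retry: base_delay * 2**attempt."""
    return [base_delay * 2 ** i for i in range(max_retries)]

def post_with_retry(url, headers, payload, max_retries=3, base_delay=1.0):
    """POST to url, retrying on HTTP 429 with exponential backoff."""
    delays = backoff_delays(max_retries, base_delay)
    response = None
    for attempt in range(max_retries + 1):
        response = requests.post(url, headers=headers, json=payload)
        if response.status_code != 429:
            response.raise_for_status()  # surface other 4xx/5xx errors
            return response
        if attempt < max_retries:
            time.sleep(delays[attempt])  # wait 1s, 2s, 4s, ... then retry
    response.raise_for_status()  # still rate-limited after all retries
    return response
```

In `server.py`, the direct `requests.post(DEEPSEEK_API_URL, ...)` call could then be swapped for `post_with_retry(DEEPSEEK_API_URL, headers, payload)`.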
**Simplified Chinese Translation of Key Phrases**

Here are some key phrases translated into Simplified Chinese:

* **Prompt:** 提示 (tíshì)
* **Generated Text:** 生成的文本 (shēngchéng de wénběn)
* **API Key:** API 密钥 (API mìyào)
* **Server:** 服务器 (fúwùqì)
* **Client:** 客户端 (kèhùduān)
* **Error:** 错误 (cuòwù)
* **Request:** 请求 (qǐngqiú)
* **Response:** 响应 (xiǎngyìng)
* **Authentication:** 身份验证 (shēnfèn yànzhèng)
* **Rate Limit:** 速率限制 (sùlǜ xiànzhì)

This detailed guide should help you get started with a basic DeepSeek API client-server demo. Remember to adapt the code to your specific needs and consult the DeepSeek API documentation for the most up-to-date information. Good luck!

Apple Doc MCP

A Model Context Protocol server that provides AI coding assistants with direct access to Apple's Developer Documentation, enabling seamless lookup of frameworks, symbols, and detailed API references.

Sequential Thinking Tool API

A Node.js/TypeScript backend for managing sequential thinking sessions, allowing users to create sessions and post thoughts in a structured sequence with support for real-time updates via Server-Sent Events.

MCP Documentation Server

Enables semantic search and retrieval of MCP (Model Context Protocol) documentation using Redis-backed embeddings, allowing users to query and access documentation content through natural language.

Voice MCP

Enables voice interaction with Claude Code through local speech-to-text (Whisper) and text-to-speech (Supertonic), allowing verbal input/output without external API calls.

Remote MCP Server

A Cloudflare Workers-based MCP server that enables tool integration with Claude AI through OAuth login, allowing users to extend Claude's capabilities with custom tools like mathematical operations.

mcp-colombia

This MCP server connects AI agents with Colombian e-commerce, travel, and financial services, allowing users to search MercadoLibre, find hotels, and compare banking products like CDTs and loans. It enables seamless integration with local services priced in Colombian pesos through specialized tools for shopping, travel planning, and financial simulation.

Todoist Meeting MCP

Connects Claude to Todoist for transforming meeting notes into actionable tasks with inferred due dates and priorities. It enables full task lifecycle management, including creating subtasks, listing projects, and completing tasks through natural language.

Trusted GMail MCP Server

The first trusted MCP server to run inside an AWS Nitro Enclave trusted execution environment.

Databento MCP

A Model Context Protocol server that provides access to Databento's historical and real-time market data, including trades, OHLCV bars, and order book depth. It enables AI assistants to perform financial data analysis, manage batch jobs, and convert market data between DBN and Parquet formats.

OpenAPI REST MCP Server

Dynamically converts any REST service's OpenAPI specification into MCP tools, enabling interaction with REST endpoints through natural language. Supports Spring Boot services and includes auto-discovery for common API configurations.

mcp-victorialogs

MCP Password Generator

Generates secure random passwords and memorable passphrases with customizable options including length, character types, emojis, and automatic strength evaluation using zxcvbn scoring.

Mozaic MCP Server

Provides AI assistants access to the ADEO Mozaic Design System, enabling lookups of design tokens, component documentation, icons, and CSS utilities, plus generation of Vue and React component code snippets.

Access Context Manager API MCP Server

An MCP server that provides access to Google's Access Context Manager API, enabling management of service perimeters and access levels through natural language.

OneTech MCP Server

Enables AI assistants to extract and document Mendix Studio Pro modules by interrogating local .mpr files. Generates comprehensive JSON documentation of domain models, pages, microflows, and enumerations without sending data to the cloud.

Reddit MCP Server

Provides access to Reddit's API for retrieving posts, comments, user information, and search functionality. Supports multiple authentication methods and comprehensive Reddit data operations including subreddit browsing, post retrieval, and user profile access.

Godot MCP

Provides a comprehensive integration between LLMs and the Godot Engine, enabling AI assistants to intelligently manipulate project files, scripts, and the live editor. It supports advanced workflows including version-aware documentation querying, automated E2E game testing, and real-time visual context capture.

GitPilot MCP

A lightweight MCP server that enables AI assistants to manage local Git repositories by executing commands like status, add, and commit. It streamlines development workflows by providing repository context and diffs directly to the assistant.

Futurama Quote Machine MCP Server

Enables interaction with Futurama quotes through Claude Desktop by connecting to the Futurama Quote Machine API. Supports getting random quotes, searching by character, adding new quotes, editing existing ones, and managing the quote collection through natural language.

Alayman MCP Server

Enables access to articles from alayman.io, allowing users to fetch, search, and filter technical content through natural language. It supports pagination and keyword-based filtering for specific topics like React, Angular, and TypeScript.

go-mcp-server-mds

A Go implementation of a Model Context Protocol (MCP) server that serves Markdown files with frontmatter support from the filesystem.

MCP Mediator

A Java-based Model Context Protocol (MCP) mediator implementation that provides seamless integration between MCP clients and servers.

Yahoo Finance MCP Server

Provides real-time stock quotes, historical price data, financial news, and multi-stock comparisons using Yahoo Finance data. Enables users to access comprehensive financial market information through natural language queries.

Hong Kong Creative Goods Trade MCP Server

Provides access to Hong Kong's recreation, sports, and cultural data through a FastMCP interface, allowing users to retrieve statistics on creative goods trade including domestic exports, re-exports, and imports with optional year filtering.