Discover Awesome MCP Servers
Extend your agent with 16,263 capabilities via MCP servers.
- All (16,263)
- Developer Tools (3,867)
- Search (1,714)
- Research & Data (1,557)
- AI Integration Systems (229)
- Cloud Platforms (219)
- Data & App Analysis (181)
- Database Interaction (177)
- Remote Shell Execution (165)
- Browser Automation (147)
- Databases (145)
- Communication (137)
- AI Content Generation (127)
- OS Automation (120)
- Programming Docs Access (109)
- Content Fetching (108)
- Note Taking (97)
- File Systems (96)
- Version Control (93)
- Finance (91)
- Knowledge & Memory (90)
- Monitoring (79)
- Security (71)
- Image & Video Processing (69)
- Digital Note Management (66)
- AI Memory Systems (62)
- Advanced AI Reasoning (59)
- Git Management Tools (58)
- Cloud Storage (51)
- Entertainment & Media (43)
- Virtualization (42)
- Location Services (35)
- Web Automation & Stealth (32)
- Media Content Processing (32)
- Calendar Management (26)
- Ecommerce & Retail (18)
- Speech Processing (18)
- Customer Data Platforms (16)
- Travel & Transportation (14)
- Education & Learning Tools (13)
- Home Automation & IoT (13)
- Web Search Integration (12)
- Health & Wellness (10)
- Customer Support (10)
- Marketing (9)
- Games & Gamification (8)
- Google Cloud Integrations (7)
- Art & Culture (4)
- Language Translation (3)
- Legal & Compliance (2)
n8n - Secure Workflow Automation for Technical Teams
A fair-code workflow automation platform with native AI capabilities. Combines visual building with custom code, offers self-hosted or cloud deployment, and provides 400+ integrations.
OptionsFlow
A Model Context Protocol server that enables large language models (LLMs) to analyze option chains, calculate Greeks, and evaluate basic options strategies using Yahoo Finance data.
Jira Prompts MCP Server
An MCP server that provides several commands for generating prompts or context from Jira content.
NotePlan MCP Server
A Model Context Protocol server that enables Claude Desktop to interact with NotePlan.co, allowing users to query, search, create, and update notes directly from Claude conversations.
Knowledge Graph Memory Server
An improved persistent memory implementation using a local knowledge graph, with a customizable `--memory-path` parameter. This lets Claude remember information about the user across multiple conversations.
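As a minimal sketch of how a client could launch this server with a custom memory location, assuming the official MCP Python SDK; the launch command and package name below are hypothetical placeholders (check the server's README), and only the `--memory-path` flag comes from the description above:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical command/args -- substitute the server's actual package or binary.
server = StdioServerParameters(
    command="npx",
    args=["-y", "knowledge-graph-memory-server", "--memory-path", "/path/to/memory.jsonl"],
)


async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])


asyncio.run(main())
```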
MCP Servers
A collection of MCP (Model Context Protocol) servers packaged as dotnet tools.
HashiCorp Vault MCP Server
Enables interaction with HashiCorp Vault for secret management operations including reading, writing, listing, and deleting secrets through the Model Context Protocol.
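The listing gives no implementation details, but as a hedged sketch of how such read/write operations can map onto Vault's API, here is what the underlying calls might look like with the `hvac` Python client (the function names and KV paths are illustrative, not the project's actual code):

```python
import os

import hvac  # pip install hvac

# VAULT_ADDR / VAULT_TOKEN are the standard Vault environment variables.
client = hvac.Client(url=os.environ["VAULT_ADDR"], token=os.environ["VAULT_TOKEN"])


def write_secret(path: str, data: dict) -> None:
    """Illustrative 'write' operation backed by the KV v2 secrets engine."""
    client.secrets.kv.v2.create_or_update_secret(path=path, secret=data)


def read_secret(path: str) -> dict:
    """Illustrative 'read' operation backed by the KV v2 secrets engine."""
    result = client.secrets.kv.v2.read_secret_version(path=path)
    return result["data"]["data"]


if __name__ == "__main__":
    write_secret("myapp/config", {"api_key": "example-value"})
    print(read_secret("myapp/config"))
```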
面试鸭 MCP Server
An MCP server built on Spring AI that searches Mianshiya (面试鸭) interview questions, letting AI quickly look up real enterprise interview questions and their answers.
Zen MCP Server
Orchestrates multiple AI models (Gemini, OpenAI, Claude, local models) within a single conversation context, enabling collaborative workflows like multi-model code reviews, consensus building, and CLI-to-CLI bridging for specialized tasks.
Aws Sample Gen Ai Mcp Server
A sample project showing how to call a generative AI model on Amazon Bedrock both directly (via `boto3`) and through an MCP (Model Control Plane) server that fronts the model. The Python client below is condensed from the project description; replace the placeholder configuration values (AWS region, Bedrock model ID, MCP server URL, API key) with your own, and note that it assumes the MCP server exposes an `/infer` endpoint that returns JSON with a `completion` field.

```python
import json

import boto3
import requests

# Configuration -- replace these placeholders with your actual values.
AWS_REGION = "us-west-2"
BEDROCK_MODEL_ID = "anthropic.claude-v2"
MCP_SERVER_URL = "http://your-mcp-server:8000/infer"
MCP_API_KEY = "your_mcp_api_key"  # leave empty if the server needs no auth

bedrock = boto3.client(service_name="bedrock-runtime", region_name=AWS_REGION)


def generate_text_with_bedrock(prompt, model_id=BEDROCK_MODEL_ID, max_tokens=200):
    """Generate text by calling Amazon Bedrock directly."""
    try:
        body = json.dumps({
            "prompt": prompt,
            "max_tokens_to_sample": max_tokens,
            "temperature": 0.5,
            "top_p": 0.9,
        })
        response = bedrock.invoke_model(
            body=body,
            modelId=model_id,
            accept="application/json",
            contentType="application/json",
        )
        return json.loads(response["body"].read())["completion"]
    except Exception as e:
        print(f"Error calling Bedrock directly: {e}")
        return None


def generate_text_with_mcp(prompt, model_id=BEDROCK_MODEL_ID, max_tokens=200):
    """Generate text by routing the request through the MCP server."""
    response = None
    try:
        payload = {
            "model_id": model_id,
            "prompt": prompt,
            "max_tokens_to_sample": max_tokens,
            "temperature": 0.5,
            "top_p": 0.9,
        }
        headers = {"Content-Type": "application/json"}
        if MCP_API_KEY:
            headers["X-API-Key"] = MCP_API_KEY  # adjust if your MCP uses a different header
        response = requests.post(MCP_SERVER_URL, headers=headers, json=payload)
        response.raise_for_status()
        return response.json()["completion"]  # assumes the MCP returns a "completion" field
    except requests.exceptions.RequestException as e:
        print(f"Error calling MCP server: {e}")
        if response is not None:
            print(f"MCP response {response.status_code}: {response.text}")
        return None


if __name__ == "__main__":
    prompt = "Write a short story about a cat who goes on an adventure."
    print("Bedrock response:", generate_text_with_bedrock(prompt))
    print("MCP response:", generate_text_with_mcp(prompt))
```

To run it: `pip install boto3 requests`, configure AWS credentials for an IAM identity with `bedrock:InvokeModel` permission, and make sure the chosen Bedrock model is available in your region. Keep API keys out of source code (use environment variables or a secrets manager), and leave authentication, error reporting, and rate limiting to the MCP server itself.
PDFSizeAnalyzer-MCP
Enables comprehensive PDF analysis and manipulation including page size analysis, chapter extraction, splitting, compression, merging, and conversion to images. Provides both MCP server interface for AI assistants and Streamlit web interface for direct user interaction.
database-updater MCP Server
Mirror.
mcp-workflowy
mcp-workflowy
Berghain Events MCP Server
A server that allows AI agents to query and retrieve information about upcoming events at Berghain nightclub through a DynamoDB-backed FastAPI service.
Shopify MCP Server by CData
Shopify MCP Server by CData
MCP Unity Bridge Asset
An asset to be imported into Unity that hosts a WebSocket server for MCP communication with LLMs.
ncbi-mcp
An MCP server for the National Center for Biotechnology Information (NCBI) at the U.S. National Institutes of Health (NIH).
url-download-mcp
A Model Context Protocol (MCP) server that enables AI assistants to download files from URLs to the local filesystem.
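As a rough sketch of what a download tool like this can look like when built on the MCP Python SDK's FastMCP helper (the tool name and parameters here are assumptions, not the project's actual interface):

```python
from pathlib import Path
from urllib.request import urlretrieve

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("url-downloader-sketch")


@mcp.tool()
def download_file(url: str, destination: str) -> str:
    """Download a file from a URL to a local path and return the saved path."""
    dest = Path(destination).expanduser()
    dest.parent.mkdir(parents=True, exist_ok=True)
    urlretrieve(url, dest)  # simple blocking download; fine for a sketch
    return str(dest)


if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```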
macOS Tools MCP Server
Provides read-only access to native macOS system utilities including disk management, battery status, network configuration, and system profiling through terminal commands. Enables users to retrieve system information and diagnostics from macOS machines via standardized MCP tools.
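For a sense of how read-only wrappers over macOS utilities tend to work, here is a hedged sketch in the style of the MCP Python SDK; the tool names are assumptions, while `pmset -g batt` and `system_profiler SPHardwareDataType` are real macOS commands:

```python
import subprocess

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("macos-tools-sketch")


def _run(cmd: list[str]) -> str:
    """Run a read-only terminal command and return its stdout."""
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout


@mcp.tool()
def battery_status() -> str:
    """Report battery charge and power source via pmset."""
    return _run(["pmset", "-g", "batt"])


@mcp.tool()
def hardware_overview() -> str:
    """Return the hardware summary from system_profiler."""
    return _run(["system_profiler", "SPHardwareDataType"])


if __name__ == "__main__":
    mcp.run()
```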
Datastream MCP Server
A Multi-Agent Conversation Protocol server that enables interaction with Google Cloud Datastream API for managing data replication services between various source and destination systems through natural language commands.
AWS Amplify Gen 2 Documentation MCP Server
An MCP server that provides tools for accessing and searching AWS Amplify Gen 2 documentation (not official).
Advanced MCP Server
A comprehensive Model Context Protocol server providing capabilities for web scraping, data analysis, system monitoring, file operations, API integrations, and report generation.
MCP Trino Server
A Model Context Protocol server that provides seamless integration with Trino and Iceberg, enabling data exploration, querying, and table maintenance through a standard interface.
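Under the hood, a server like this ultimately issues SQL to Trino; a minimal sketch of that query path using the `trino` Python client (connection details and the table name are placeholders, and this is not the project's actual code):

```python
import trino  # pip install trino

# Placeholder connection details for a Trino cluster with an Iceberg catalog.
conn = trino.dbapi.connect(
    host="localhost",
    port=8080,
    user="analyst",
    catalog="iceberg",
    schema="default",
)

cur = conn.cursor()
cur.execute("SHOW TABLES")  # data exploration
print(cur.fetchall())

cur.execute("SELECT count(*) FROM example_table")  # ad-hoc query on a placeholder table
print(cur.fetchone())
```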
BolideAI MCP
A comprehensive ModelContextProtocol server that provides AI-powered tools for marketing automation, content generation, research, and project management, integrating with various AI services to streamline workflows for developers and marketers.
Business Central MCP Server
A lightweight MCP server for seamless integration with Microsoft Dynamics 365 Business Central.
Google Search MCP Server
An MCP server implementation that integrates the Google Custom Search JSON API to provide web search capabilities.
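The Google Custom Search JSON API it wraps is a plain HTTP endpoint; as a hedged sketch (the tool name is an assumption, while the endpoint and the `key`/`cx`/`q` parameters are the documented API), a search tool might look like:

```python
import os

import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("google-search-sketch")

API_KEY = os.environ["GOOGLE_API_KEY"]   # Custom Search API key
ENGINE_ID = os.environ["GOOGLE_CSE_ID"]  # Programmable Search Engine ID (cx)


@mcp.tool()
def web_search(query: str, num: int = 5) -> list[dict]:
    """Return title, link, and snippet for the top web search results."""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": ENGINE_ID, "q": query, "num": num},
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json().get("items", [])
    return [
        {"title": i["title"], "link": i["link"], "snippet": i.get("snippet", "")}
        for i in items
    ]


if __name__ == "__main__":
    mcp.run()
```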
Pytest MCP Server
Enables AI assistants to run and analyze pytest tests for desktop applications through interactive commands. Supports test execution, filtering, result analysis, and debugging for comprehensive test automation workflows.
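Test execution and filtering of the kind described here usually reduces to invoking pytest as a subprocess; a hedged sketch of such a tool (assumed name and return shape, not the project's actual interface):

```python
import subprocess
import sys

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("pytest-runner-sketch")


@mcp.tool()
def run_tests(path: str = "tests", keyword: str = "") -> dict:
    """Run pytest on a path, optionally filtered with -k, and return the output."""
    cmd = [sys.executable, "-m", "pytest", path, "-q"]
    if keyword:
        cmd += ["-k", keyword]
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return {
        "exit_code": proc.returncode,  # 0 means all selected tests passed
        "stdout": proc.stdout,
        "stderr": proc.stderr,
    }


if __name__ == "__main__":
    mcp.run()
```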
flutterclimcp
An example project pairing the Flutter CLI with a simple MCP (Model Context Protocol) server: a guess-the-number game in which a small Python HTTP server owns the game logic (picking a random number between 1 and 100, comparing guesses, counting attempts) while a Flutter app provides the UI and talks to the server over HTTP.

The server from the project write-up, saved as `mcp_server.py` and started with `python mcp_server.py`:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json
import random


class RequestHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path == '/guess':
            content_length = int(self.headers['Content-Length'])
            post_data = self.rfile.read(content_length)
            data = json.loads(post_data.decode('utf-8'))
            guess = data.get('guess')

            # Lazily create the secret number on the first guess of a game.
            if not hasattr(self.server, 'secret_number'):
                self.server.secret_number = random.randint(1, 100)
                self.server.num_guesses = 0
            self.server.num_guesses += 1

            if guess is None:
                response = {'error': 'Missing guess parameter'}
            elif guess < self.server.secret_number:
                response = {'result': 'too_low'}
            elif guess > self.server.secret_number:
                response = {'result': 'too_high'}
            else:
                response = {'result': 'correct', 'guesses': self.server.num_guesses}
                del self.server.secret_number  # Reset for the next game
                del self.server.num_guesses

            self.send_response(200)
            self.send_header('Content-type', 'application/json')
            self.end_headers()
            self.wfile.write(json.dumps(response).encode('utf-8'))
        else:
            self.send_response(404)
            self.end_headers()


def run(server_class=HTTPServer, handler_class=RequestHandler, port=8000):
    httpd = server_class(('', port), handler_class)
    print(f'Starting server on port {port}')
    httpd.serve_forever()


if __name__ == '__main__':
    run()
```

On the Flutter side, create the project with `flutter create guess_the_number`, add the HTTP client with `flutter pub add http`, and replace `lib/main.dart` with a single page that reads a guess from a `TextField`, POSTs it as JSON to `http://localhost:8000/guess` (adjust the address if the server runs elsewhere), and displays the returned `too_low` / `too_high` / `correct` feedback, including the number of guesses and a "Play Again" button once the game ends. Start the server first, then run the app with `flutter run`. Suggested extensions from the write-up: more robust error handling, a nicer UI, difficulty levels, a guess history, and a richer transport such as gRPC instead of plain HTTP. The key point of the example is separating UI logic in the Flutter app from game logic in the MCP server.
Trello MCP Server
Enables AI assistants to retrieve Trello card information by ID or link, providing access to card details including labels, members, due dates, and attachments through a standardized interface.
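Retrieving a card by ID, as described above, maps directly onto Trello's REST API; a hedged sketch of such a tool (the tool name is an assumption, while the `/1/cards/{id}` endpoint and key/token query authentication are Trello's documented API):

```python
import os

import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("trello-card-sketch")

TRELLO_KEY = os.environ["TRELLO_API_KEY"]
TRELLO_TOKEN = os.environ["TRELLO_TOKEN"]


@mcp.tool()
def get_card(card_id: str) -> dict:
    """Fetch a Trello card with its labels, members, due date, and attachments."""
    resp = requests.get(
        f"https://api.trello.com/1/cards/{card_id}",
        params={
            "key": TRELLO_KEY,
            "token": TRELLO_TOKEN,
            "fields": "name,desc,due,labels",
            "members": "true",
            "attachments": "true",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    mcp.run()
```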
mcp-lucene-server
mcp-lucene-server