MCPClient Python Application

Okay, I understand. You want an implementation for interacting between an MCP (presumably Minecraft Protocol) server and an Ollama model. This is a complex project, so let's break it down into key components and provide a conceptual outline with code snippets. I'll focus on the core logic and leave out boilerplate code for brevity.

**Conceptual Outline**

1. **Minecraft Server Interaction (MCP):**
   * Establish a connection to the Minecraft server.
   * Listen for player chat messages.
   * Parse the chat messages to extract relevant information.
   * Send commands back to the Minecraft server (e.g., `/say`, `/tell`).
2. **Ollama Model Interaction:**
   * Send prompts to the Ollama model.
   * Receive responses from the Ollama model.
   * Parse the responses.
3. **Bridging Logic:**
   * Take the parsed chat message from Minecraft.
   * Formulate a prompt for the Ollama model based on the chat message.
   * Send the prompt to Ollama and receive the response.
   * Format the response for Minecraft.
   * Send the formatted response back to the Minecraft server.

**Code Snippets (Python - Example)**

**Important Considerations:**

* **Libraries:** You'll need libraries for Minecraft protocol interaction (e.g., `mcstatus`, `nbt`, `pymclevel`, or a more comprehensive library like `python-minecraft-protocol` or `mineflayer` if you need bot-like behavior). You'll also need a library to interact with the Ollama API (e.g., `requests` or `httpx`).
* **Ollama Setup:** You need to have Ollama installed and running with the desired model loaded.
* **Minecraft Server:** You need a Minecraft server running and accessible.
* **Error Handling:** Robust error handling is crucial for a production system.
* **Security:** Be very careful about security, especially if you're allowing the Ollama model to execute commands on the Minecraft server. Sanitize inputs and limit the model's capabilities.
* **Rate Limiting:** Implement rate limiting to prevent abuse and avoid overwhelming the Ollama model or the Minecraft server.
* **Asynchronous Operations:** Use `asyncio` or similar for non-blocking I/O to handle multiple connections and requests efficiently.

```python
import asyncio
import json
import requests  # Or httpx

# pip install requests mcstatus
from mcstatus import MinecraftServer

# Minecraft server configuration
MINECRAFT_SERVER_ADDRESS = "localhost"  # Replace with your server address
MINECRAFT_SERVER_PORT = 25565           # Default Minecraft port

# Ollama configuration
OLLAMA_API_URL = "http://localhost:11434/api/generate"  # Default Ollama API endpoint
OLLAMA_MODEL = "llama2"                                  # Replace with your desired Ollama model


async def get_server_status(address, port):
    """Gets the status of the Minecraft server."""
    server = MinecraftServer(address, port)
    try:
        return await server.async_status()
    except Exception as e:
        print(f"Error getting server status: {e}")
        return None


async def send_minecraft_command(command):
    """Sends a command to the Minecraft server (example using RCON)."""
    # This is a placeholder. You'll need to implement RCON or another method
    # to send commands to the server. RCON is generally preferred.
    # Example using an RCON library (e.g., mcrcon):
    # from mcrcon import MCRcon
    # with MCRcon(MINECRAFT_SERVER_ADDRESS, RCON_PASSWORD, port=RCON_PORT) as mcr:
    #     resp = mcr.command(command)
    #     print(resp)
    print(f"Simulating sending command to Minecraft: {command}")


async def send_to_ollama(prompt):
    """Sends a prompt to the Ollama model and returns the response."""
    payload = {
        "prompt": prompt,
        "model": OLLAMA_MODEL,
        "stream": False,  # Set to True for streaming responses
    }
    try:
        response = requests.post(OLLAMA_API_URL, json=payload)
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
        return response.json()["response"]
    except requests.exceptions.RequestException as e:
        print(f"Error sending to Ollama: {e}")
        return None


async def process_minecraft_chat(chat_message):
    """Processes a Minecraft chat message and interacts with Ollama."""
    print(f"Received chat message: {chat_message}")

    # 1. Formulate a prompt for Ollama
    ollama_prompt = f"Respond to this Minecraft chat message: {chat_message}"

    # 2. Send the prompt to Ollama
    ollama_response = await send_to_ollama(ollama_prompt)

    if ollama_response:
        print(f"Ollama response: {ollama_response}")
        # 3. Format the response for Minecraft
        minecraft_response = f"Ollama: {ollama_response}"
        # 4. Send the response back to Minecraft
        await send_minecraft_command(f"/say {minecraft_response}")  # Or /tell <player>
    else:
        print("Ollama request failed.")


async def minecraft_chat_listener():
    """Listens for chat messages from the Minecraft server."""
    # This is a placeholder. You'll need to implement a proper Minecraft
    # client to listen for chat messages. This is significantly more complex
    # than just getting the server status. Libraries like
    # `python-minecraft-protocol` or `mineflayer` are essential.
    # Example (very simplified and incomplete - DO NOT USE IN PRODUCTION):
    # this just simulates receiving chat messages.
    while True:
        await asyncio.sleep(5)  # Check every 5 seconds (adjust as needed)
        simulated_chat = input("Enter simulated chat message: ")
        await process_minecraft_chat(simulated_chat)


async def main():
    """Main function to start the Minecraft and Ollama interaction."""
    status = await get_server_status(MINECRAFT_SERVER_ADDRESS, MINECRAFT_SERVER_PORT)
    if status:
        print(f"Minecraft server status: {status.description}")
        asyncio.create_task(minecraft_chat_listener())  # Start listening for chat
    else:
        print("Failed to get Minecraft server status. Exiting.")
        return

    # Keep the main task running
    while True:
        await asyncio.sleep(1)


if __name__ == "__main__":
    asyncio.run(main())
```

**Explanation and Key Improvements:**

* **`mcstatus` for Server Status:** Uses `mcstatus` to check the server's status. This is a good starting point to verify the server is running.
* **`requests` for Ollama:** Uses the `requests` library to interact with the Ollama API. This is a simple and common way to make HTTP requests. Consider `httpx` for async.
* **`send_to_ollama` Function:** Encapsulates the logic for sending prompts to Ollama and handling responses. Includes error handling.
* **`process_minecraft_chat` Function:** Handles the core logic of taking a Minecraft chat message, formulating a prompt, sending it to Ollama, and sending the response back to Minecraft.
* **`send_minecraft_command` Placeholder:** A placeholder function for sending commands to the Minecraft server. **You MUST replace this with a proper implementation using RCON or another suitable method.**
* **`minecraft_chat_listener` Placeholder:** A placeholder function for listening for chat messages. **This is the most complex part and requires a full Minecraft client implementation.** Libraries like `python-minecraft-protocol` or `mineflayer` are essential. The example provided is just a simulation.
* **Asynchronous Operations:** Uses `asyncio` for concurrent operations. This is important for handling multiple connections and requests efficiently.
* **Error Handling:** Includes basic error handling for network requests and server status checks. Expand this for production use.
* **Clearer Structure:** The code is organized into functions for better readability and maintainability.
* **Comments:** Includes comments to explain the purpose of each section of the code.
* **Important Considerations:** Highlights the critical aspects of the project, such as security, rate limiting, and library choices.

**Next Steps (Crucial):**

1. **Implement Minecraft Chat Listener:** This is the most challenging part. Use a library like `python-minecraft-protocol` or `mineflayer` to create a Minecraft client that can connect to the server, authenticate, and listen for chat messages. This will involve understanding the Minecraft protocol.
2. **Implement Minecraft Command Sender:** Implement the `send_minecraft_command` function using RCON (Remote Console) or another suitable method to send commands to the Minecraft server. RCON is generally the preferred approach. You'll need to enable RCON on your Minecraft server and configure the RCON port and password.
3. **Security:** Thoroughly review the security implications of allowing an AI model to interact with your Minecraft server. Sanitize inputs, limit the model's capabilities, and implement appropriate access controls.
4. **Testing:** Test the integration thoroughly in a safe environment before deploying it to a production server.
5. **Refinement:** Refine the prompts and responses to improve the quality of the interaction between the AI model and the Minecraft players.

**Example `python-minecraft-protocol` (Partial - Requires More Setup):**

```python
# Requires: pip install python-minecraft-protocol
import asyncio
import json

from minecraft_protocol.connection import Connection


async def handle_chat(packet):
    """Handles incoming chat messages."""
    chat_data = json.loads(packet.json_data)
    message = chat_data["text"]
    if "extra" in chat_data:
        for extra in chat_data["extra"]:
            if isinstance(extra, str):
                message += extra
            elif isinstance(extra, dict) and "text" in extra:
                message += extra["text"]
    await process_minecraft_chat(message)


async def minecraft_chat_listener():
    """Listens for chat messages using python-minecraft-protocol."""
    conn = Connection(
        host=MINECRAFT_SERVER_ADDRESS,
        port=MINECRAFT_SERVER_PORT,
        username="YourBotUsername",  # Replace with your bot's username
        password="YourBotPassword",  # If applicable
    )
    conn.register_packet_listener(handle_chat, "chat.incoming")
    try:
        await conn.connect()
        await conn.wait_for_disconnect()  # Keep the connection alive
    except Exception as e:
        print(f"Error connecting to Minecraft server: {e}")

# Replace the placeholder in main() with:
# asyncio.create_task(minecraft_chat_listener())
```

**Important Notes about `python-minecraft-protocol`:**

* **Authentication:** You'll need to handle Minecraft authentication correctly. This can be complex, especially for premium (paid) Minecraft accounts. Consider using a library that simplifies authentication.
* **Protocol Version:** Ensure that the library you use supports the correct Minecraft protocol version for your server.
* **Error Handling:** Implement robust error handling to deal with connection issues, authentication failures, and other potential problems.

This comprehensive outline and the code snippets should provide a solid foundation for building your MCP-Ollama integration. Remember to prioritize security, error handling, and thorough testing throughout the development process. Good luck!
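One follow-up on the `requests` note above: `send_to_ollama` calls the blocking `requests` API from inside a coroutine, and the considerations list suggests `httpx` for async. A minimal sketch of that swap, assuming the same Ollama `/api/generate` endpoint and the `OLLAMA_API_URL`/`OLLAMA_MODEL` constants defined earlier:

```python
import httpx  # pip install httpx


async def send_to_ollama_async(prompt):
    """Async variant of send_to_ollama using httpx instead of blocking requests."""
    payload = {"prompt": prompt, "model": OLLAMA_MODEL, "stream": False}
    try:
        async with httpx.AsyncClient(timeout=60.0) as client:
            response = await client.post(OLLAMA_API_URL, json=payload)
            response.raise_for_status()  # Raise on 4xx/5xx responses
            return response.json()["response"]
    except httpx.HTTPError as e:
        print(f"Error sending to Ollama: {e}")
        return None
```

Because this version never blocks the event loop, the chat listener and other tasks keep running while Ollama generates its reply.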

spirita1204

README

MCPClient Python Application

This is a Python client application designed to work with MCP (Model Context Protocol) servers.

Features

  • Asynchronous communication: Uses asyncio for non-blocking communication between the client and the server.
  • Customizable server scripts: The client can connect to both Python- and JavaScript-based server scripts.
  • Tool management: Dynamically retrieves the tools available on the connected server and works with them.
  • Chat interface: Provides a simple command-line interface for conversational interaction with the server (a minimal loop is sketched after this list).
  • Tool integration: Supports extracting JSON-formatted tool calls from server responses and executing them.
  • Environment variable loading: Supports loading environment variables from a .env file using the dotenv package.
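To illustrate the chat-interface feature, the rough sketch below shows what a command-line loop like this client's could look like. The helper passed in as process_query is a placeholder for whatever coroutine sends the query to the server, not this project's actual API:

```python
import asyncio


async def chat_loop(process_query):
    """Minimal command-line chat loop: read a query, await the client's reply, print it."""
    print("MCP client started. Type 'quit' to exit.")
    while True:
        query = input("\nQuestion: ").strip()
        if query.lower() == "quit":
            break
        # process_query is assumed to send the query to the MCP server and
        # return the assistant's (possibly tool-augmented) reply as a string.
        response = await process_query(query)
        print(response)
```

The real client.py presumably wires such a loop to its server session, e.g. asyncio.run(chat_loop(my_process_query)).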

Requirements

  • Python 3.7 or later
  • The asyncio library (included with Python)
  • requests for HTTP requests to the server
  • mcp (a custom library for handling MCP communication)
  • dotenv for environment variable management

Setup

  1. Clone the repository (or download the script files) to your local machine.

  2. Install the required dependencies:

    pip install -r requirements.txt
    
  3. Create a .env file in the root directory to hold the required environment variables (a loading sketch follows this list). For example:

    BASE_URL=http://localhost:11434
    MODEL=llama3.2
    
  4. Run the client, passing the path to the server script:

    python client.py <server_script_path>
    

    The server script can be either a Python .py file or a JavaScript .js file.
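The README says configuration is loaded from .env via dotenv. A minimal sketch of reading the BASE_URL and MODEL values shown in the example above; the variable names come from that example, but the exact loading code in client.py may differ:

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # Read variables from a .env file in the working directory

BASE_URL = os.getenv("BASE_URL", "http://localhost:11434")  # Ollama endpoint
MODEL = os.getenv("MODEL", "llama3.2")                      # Model name sent to Ollama
```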

How It Works

  1. Connecting to the MCP server: The client uses the provided script (.py or .js) to connect to the server over standard input/output channels (see the sketch after this list).
  2. Processing queries: The client sends the user's query to the server and receives a response. The available tools are listed and can be invoked directly from the assistant's response.
  3. Tool execution: If the response contains a valid tool call (in JSON format), the client extracts it and triggers the corresponding tool on the server.
  4. Interaction: The client converses with the server, displays the results returned by the server's tools, and continues the conversation.
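Step 1 above describes spawning the server script and talking to it over stdio. The sketch below shows how that might look with the official MCP Python SDK's ClientSession; the README describes its mcp dependency as a custom library, so the real client may use a different API:

```python
import asyncio
import sys

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def connect_and_list_tools(server_script_path):
    """Spawn the server script (.py or .js) and list the tools it exposes over stdio."""
    command = "python" if server_script_path.endswith(".py") else "node"
    params = StdioServerParameters(command=command, args=[server_script_path])

    async with stdio_client(params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()                  # MCP handshake
            tools = await session.list_tools()          # Ask the server what it offers
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


if __name__ == "__main__":
    asyncio.run(connect_and_list_tools(sys.argv[1]))
```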

Example Workflow

  1. The user enters a query such as:

    Question: What is the weather today?
    
  2. The client sends the query to the server, which responds with the available tools and information.

  3. If the server suggests using a weather tool, the client runs that tool with the required parameters and displays the result (a sketch of the tool-call extraction follows this list).

  4. The client continues the conversation based on the new information returned by the tool.
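Steps 2 and 3 hinge on pulling a tool call out of the assistant's reply. A minimal sketch of that extraction and dispatch, assuming the call is embedded in the reply as a JSON object with "tool" and "arguments" keys (the project's actual format may differ) and that session is an already-initialized MCP ClientSession:

```python
import json
import re


def extract_tool_call(reply):
    """Return (tool_name, arguments) if the reply embeds a JSON tool call, else None."""
    match = re.search(r"\{.*\}", reply, re.DOTALL)  # naive: grab the first {...} block
    if not match:
        return None
    try:
        data = json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
    if "tool" in data:  # assumed shape: {"tool": "...", "arguments": {...}}
        return data["tool"], data.get("arguments", {})
    return None


async def maybe_run_tool(session, reply):
    """If the assistant's reply contains a tool call, execute it on the MCP server."""
    call = extract_tool_call(reply)
    if call is None:
        return None
    name, arguments = call
    return await session.call_tool(name, arguments=arguments)
```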

Recommended Servers

playwright-mcp

A Model Context Protocol server that enables LLMs to interact with web pages through structured accessibility snapshots without requiring vision models or screenshots.

Official
Featured
TypeScript
Magic Component Platform (MCP)

An AI-powered tool that generates modern UI components from natural language descriptions, integrating with popular IDEs to streamline UI development workflow.

Official
Featured
Local
TypeScript
@kazuph/mcp-taskmanager

Model Context Protocol server for Task Management. This allows Claude Desktop (or any MCP client) to manage and execute tasks in a queue-based system.

Featured
Local
JavaScript
Claude Code MCP

An implementation of Claude Code as a Model Context Protocol server that enables using Claude's software engineering capabilities (code generation, editing, reviewing, and file operations) through the standardized MCP interface.

Featured
Local
JavaScript
MCP Package Docs Server

Facilitates LLMs to efficiently access and fetch structured documentation for packages in Go, Python, and NPM, enhancing software development with multi-language support and performance optimization.

Featured
Local
TypeScript
Linear MCP Server

A Model Context Protocol server that integrates with Linear's issue tracking system, allowing LLMs to create, update, search, and comment on Linear issues through natural language interactions.

Featured
JavaScript
Sequential Thinking MCP Server

This server facilitates structured problem-solving by breaking down complex issues into sequential steps, supporting revisions, and enabling multiple solution paths through full MCP integration.

Featured
Python
mermaid-mcp-server

A Model Context Protocol (MCP) server that converts Mermaid diagrams to PNG images.

Featured
JavaScript
Jira-Context-MCP

MCP server that provides Jira ticket information to AI coding agents like Cursor.

Featured
TypeScript
Linear MCP Server

Enables interaction with Linear's API for managing issues, teams, and projects programmatically through the Model Context Protocol.

Featured
JavaScript