
# RhinoMCP
RhinoMCP connects Rhino3D to Claude AI via the Model Context Protocol (MCP), enabling Claude to directly interact with and control Rhino3D for AI-assisted 3D modeling, analysis, and design workflows.
## Project Overview
This integration consists of two main components:
- **Rhino Plugin**: A socket server that runs inside Rhino's Python editor, providing a communication interface to Rhino's functionality.
- **MCP Server**: An implementation of the Model Context Protocol that connects Claude AI to the Rhino plugin, enabling AI-controlled operations.
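
For context, the two components talk over a plain TCP socket. The sketch below illustrates the kind of exchange the MCP server performs against the Rhino plugin; the message shape and the JSON encoding are assumptions for illustration only, and the real payload format is defined by `rhino_client.py` and `rhino_server.py`.

```python
# Illustrative sketch only: sends a hypothetical JSON command to the Rhino bridge.
# The actual payload schema lives in src/rhino_mcp/rhino_client.py.
import json
import socket

command = {"command": "run_script", "code": "print('hello from the MCP server')"}

# 8888 is the MCP server's default --rhino-port (see Usage below).
with socket.create_connection(("127.0.0.1", 8888), timeout=5) as sock:
    sock.sendall(json.dumps(command).encode("utf-8"))
    reply = sock.recv(4096)
    print(reply.decode("utf-8", errors="replace"))
```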
## Features
- Socket-based bidirectional communication between Python and Rhino
- Model Context Protocol server for Claude AI integration
- Support for NURBS curve creation (initial test feature)
- Python script execution within Rhino's context
- Compatible with both Claude Desktop and Windsurf as clients
## Installation

### Requirements
- Rhinoceros 3D (Version 7 or 8)
- Python 3.10 or higher
- Windows 10 or 11
### Install Using uv (Recommended)

```bash
# Create and activate a virtual environment
mkdir -p .venv
uv venv .venv
source .venv/Scripts/activate  # On Windows with Git Bash

# Install the package
uv pip install -e .
```
### Install Using pip

```bash
# Create and activate a virtual environment
python -m venv .venv
.venv\Scripts\activate  # On Windows

# Install the package
pip install -e .
```
## Usage

### Step 1: Start the Rhino Bridge Server
- Open Rhino
- Type `EditPythonScript` in the command line to open Rhino's Python editor
- Open the Rhino server script from `src/rhino_plugin/rhino_server.py`
- Run the script (F5 or click the Run button)
- Verify you see "Rhino Bridge started!" in the output panel
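
To clarify what the bridge is doing, a minimal version of the pattern looks roughly like the sketch below: a small TCP server that accepts commands and executes them in Rhino's Python context. This is an illustration under the assumption of a JSON-over-TCP protocol, not the bundled script; run `src/rhino_plugin/rhino_server.py` in Rhino, not this snippet.

```python
# Minimal sketch of the bridge pattern (assumed JSON-over-TCP protocol).
# The real script dispatches requests to Rhino, e.g. via rhinoscriptsyntax.
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 8888  # matches the MCP server's default --rhino-port

def handle(conn):
    with conn:
        request = json.loads(conn.recv(65536).decode("utf-8"))
        # Inside Rhino, this is where the request would be dispatched to
        # rhinoscriptsyntax / RhinoCommon calls (e.g. adding a NURBS curve).
        conn.sendall(json.dumps({"status": "ok", "echo": request}).encode("utf-8"))

def serve():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind((HOST, PORT))
        server.listen()
        print("Rhino Bridge started!")
        while True:
            conn, _ = server.accept()
            threading.Thread(target=handle, args=(conn,), daemon=True).start()

serve()
```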
### Step 2: Start the MCP Server

```bash
# Activate your virtual environment
source .venv/Scripts/activate  # On Windows with Git Bash

# Start the MCP server
rhinomcp
```
Or run with custom settings:
```bash
rhinomcp --host 127.0.0.1 --port 5000 --rhino-host 127.0.0.1 --rhino-port 8888 --debug
```
### Step 3: Connect with Claude Desktop or Windsurf

Configure Claude Desktop or Windsurf to connect to the MCP server at `ws://127.0.0.1:5000`.
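
Before wiring up the client, a quick way to confirm both servers are up (assuming the default ports shown above) is a simple reachability check:

```python
# Checks that the MCP server (5000) and the Rhino bridge (8888) accept connections.
import socket

for name, port in [("MCP server", 5000), ("Rhino bridge", 8888)]:
    try:
        socket.create_connection(("127.0.0.1", port), timeout=2).close()
        print(f"{name} is listening on port {port}")
    except OSError as exc:
        print(f"{name} is NOT reachable on port {port}: {exc}")
```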
### Example: Creating a NURBS Curve
When connected to Claude, you can ask it to create a NURBS curve in Rhino with a prompt like:
```
Create a NURBS curve in Rhino using points at (0,0,0), (5,10,0), (10,0,0), and (15,10,0).
```
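
On the Rhino side, that request boils down to something like the following `rhinoscriptsyntax` call. This is only a manual reference sketch, not the plugin's exact code path; the plugin performs the equivalent operation for you.

```python
# Run inside Rhino's Python editor -- manual equivalent of the prompt above.
import rhinoscriptsyntax as rs

points = [(0, 0, 0), (5, 10, 0), (10, 0, 0), (15, 10, 0)]
curve_id = rs.AddCurve(points, degree=3)  # control-point NURBS curve, degree 3
print("Created curve:", curve_id)
```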
## Development

### Set Up the Development Environment
```bash
# Clone the repository
git clone https://github.com/FernandoMaytorena/RhinoMCP.git
cd RhinoMCP

# Create and activate virtual environment
uv venv .venv
source .venv/Scripts/activate  # On Windows with Git Bash

# Install development dependencies
uv pip install -e ".[dev]"
```
### Run Tests

```bash
pytest
```
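
If you are adding tests, a minimal smoke test could look like this. It assumes the package is importable as `rhino_mcp`, per the `src/` layout shown under Project Structure below.

```python
# tests/test_smoke.py -- trivial sketch; extend with real tests against rhino_client.
def test_package_imports():
    import rhino_mcp  # noqa: F401
```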
### Code Style
This project uses Ruff for linting and formatting:
```bash
ruff check .
ruff format .
```
## Project Structure
```
RhinoMCP/
├── src/
│   ├── rhino_plugin/        # Code that runs inside Rhino
│   │   └── rhino_server.py
│   └── rhino_mcp/           # MCP server implementation
│       ├── rhino_client.py
│       └── mcp_server.py
├── tests/                   # Test modules
├── docs/                    # Documentation
├── config/                  # Configuration files
├── ai/                      # AI documentation and prompts
├── setup.py                 # Package installation
├── requirements.txt         # Package dependencies
└── README.md                # Project documentation
```
## License

## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.