blender-open-mcp

blender-open-mcp is an open-source project by dhakalnirajan that integrates Blender with local AI models (via Ollama) using the Model Context Protocol (MCP). It lets you control Blender using natural language prompts, leveraging the power of AI to assist with 3D modeling tasks.
Features
- Control Blender with Natural Language: Send prompts to a locally running Ollama model to perform actions in Blender.
- MCP Integration: Uses the Model Context Protocol for structured communication between the AI model and Blender.
- Ollama Support: Designed to work with Ollama for easy local model management.
- Blender Add-on: Includes a Blender add-on to provide a user interface and handle communication with the server.
- PolyHaven Integration (Optional): Download and use assets (HDRIs, textures, models) from PolyHaven directly within Blender via AI prompts.
- Basic 3D Operations:
  - Get scene and object info
  - Create primitives
  - Modify and delete objects
  - Apply materials
- Render Support: Render images via the `render_image` tool and retrieve information about the output.
Installation
Prerequisites
- Blender: Blender 3.0 or later. Download from blender.org.
- Ollama: Install from ollama.com, following OS-specific instructions.
- Python: Python 3.10 or later.
- uv: Install with `pip install uv`.
- Git: Required for cloning the repository.
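As a quick sanity check for the Python prerequisite, you can verify the interpreter version before installing. A minimal sketch; the 3.10 floor comes from the list above:

```python
# Sanity-check the Python prerequisite (3.10 or later) before installing.
import sys

def python_ok(required: tuple = (3, 10)) -> bool:
    """Return True if the running interpreter meets the required version."""
    return sys.version_info[:2] >= required

if __name__ == "__main__":
    status = "OK" if python_ok() else "too old, need 3.10+"
    print(f"Python {sys.version_info.major}.{sys.version_info.minor}: {status}")
```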
Installation Steps
1. Clone the Repository:

       git clone https://github.com/dhakalnirajan/blender-open-mcp.git
       cd blender-open-mcp

2. Create and Activate a Virtual Environment (Recommended):

       uv venv
       source .venv/bin/activate  # On Linux/macOS
       .venv\Scripts\activate     # On Windows

3. Install Dependencies:

       uv pip install -e .

4. Install the Blender Add-on:
   - Open Blender.
   - Go to Edit -> Preferences -> Add-ons.
   - Click Install...
   - Select the `addon.py` file from the `blender-open-mcp` directory.
   - Enable the "Blender MCP" add-on.

5. Download an Ollama Model (if not already installed):

       ollama run llama3.2

   (Other models, such as Gemma3, can also be used.)
Setup
1. Start the Ollama Server: Ensure Ollama is running in the background.
2. Start the MCP Server:

       blender-mcp

   or:

       python src/blender_open_mcp/server.py

   By default, the server listens on http://0.0.0.0:8000, but you can modify the settings:

       blender-mcp --host 127.0.0.1 --port 8001 --ollama-url http://localhost:11434 --ollama-model llama3.2

3. Start the Blender Add-on Server:
   - Open Blender and the 3D Viewport.
   - Press `N` to open the sidebar.
   - Find the "Blender MCP" panel.
   - Click "Start MCP Server".
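Once both servers are up, a plain HTTP probe can confirm the MCP server is reachable. A minimal sketch assuming the default address from step 2; adjust the URL if you changed `--host` or `--port`:

```python
# Probe the MCP server's HTTP port to confirm it is up.
# Assumes the default address from the Setup section; any HTTP response
# (even an error status) means a server is listening there.
import urllib.error
import urllib.request

def server_reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at `url`."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # the server responded, just with an error status
    except (urllib.error.URLError, OSError):
        return False  # connection refused / nothing listening

if __name__ == "__main__":
    print(server_reachable("http://127.0.0.1:8000"))
```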
Usage
Interact with blender-open-mcp using the `mcp` command-line tool:
Example Commands
- Basic Prompt:

      mcp prompt "Hello BlenderMCP!" --host http://localhost:8000

- Get Scene Information:

      mcp tool get_scene_info --host http://localhost:8000

- Create a Cube:

      mcp prompt "Create a cube named 'my_cube'." --host http://localhost:8000

- Render an Image:

      mcp prompt "Render the image." --host http://localhost:8000

- Using PolyHaven (if enabled):

      mcp prompt "Download a texture from PolyHaven." --host http://localhost:8000
Available Tools
| Tool Name | Description | Parameters |
|---|---|---|
| `get_scene_info` | Retrieves scene details. | None |
| `get_object_info` | Retrieves information about an object. | `object_name` (str) |
| `create_object` | Creates a 3D object. | `type`, `name`, `location`, `rotation`, `scale` |
| `modify_object` | Modifies an object's properties. | `name`, `location`, `rotation`, `scale`, `visible` |
| `delete_object` | Deletes an object. | `name` (str) |
| `set_material` | Assigns a material to an object. | `object_name`, `material_name`, `color` |
| `render_image` | Renders an image. | `file_path` (str) |
| `execute_blender_code` | Executes Python code in Blender. | `code` (str) |
| `get_polyhaven_categories` | Lists PolyHaven asset categories. | `asset_type` (str) |
| `search_polyhaven_assets` | Searches PolyHaven assets. | `asset_type`, `categories` |
| `download_polyhaven_asset` | Downloads a PolyHaven asset. | `asset_id`, `asset_type`, `resolution`, `file_format` |
| `set_texture` | Applies a downloaded texture. | `object_name`, `texture_id` |
| `set_ollama_model` | Sets the Ollama model. | `model_name` (str) |
| `set_ollama_url` | Sets the Ollama server URL. | `url` (str) |
| `get_ollama_models` | Lists available Ollama models. | None |
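To illustrate the parameter shapes above, here is a small helper that assembles the arguments for a `create_object` call. Illustrative only: the parameter names come from the table, but the value conventions (primitive names, 3-component vectors) are assumptions, and the exact wire format depends on your MCP client.

```python
# Build the argument dict for the create_object tool listed above.
# Parameter names come from the tool table; the value conventions
# ("CUBE", 3-component vectors) are assumptions for illustration.
def build_create_object_args(obj_type: str, name: str,
                             location=(0.0, 0.0, 0.0),
                             rotation=(0.0, 0.0, 0.0),
                             scale=(1.0, 1.0, 1.0)) -> dict:
    """Assemble arguments matching the create_object parameter list."""
    return {
        "type": obj_type,          # e.g. "CUBE"
        "name": name,              # object name in the scene
        "location": list(location),
        "rotation": list(rotation),
        "scale": list(scale),
    }

args = build_create_object_args("CUBE", "my_cube", location=(0, 0, 1))
print(args)
```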
Troubleshooting
If you encounter issues:
- Ensure Ollama and the blender-open-mcp server are running.
- Check Blender's add-on settings.
- Verify command-line arguments.
- Refer to logs for error details.
For further assistance, visit the GitHub Issues page.
Happy Blending with AI! 🚀