
# MindManager MCP Server

A Model Context Protocol server that enables LLMs to interact with MindManager mind maps, allowing retrieval of mind map structures and export to formats like Mermaid, Markdown, and JSON.

## Tools

### `get_mindmap`

Retrieves the current mind map structure from MindManager.

- **Args:** `mode` (str): detail level (`'full'`, `'content'`, `'text'`), defaults to `'full'`; `turbo_mode` (bool): enable turbo mode (text only), defaults to `False`.
- **Returns:** `Dict[str, Any]`: serialized mind map structure or an error dictionary.

### `get_selection`

Retrieves the currently selected topics in MindManager.

- **Args:** `mode` (str): detail level (`'full'`, `'content'`, `'text'`), defaults to `'full'`; `turbo_mode` (bool): enable turbo mode (text only), defaults to `False`.
- **Returns:** `Union[List[Dict[str, Any]], Dict[str, str]]`: list of serialized selected topics or an error dictionary.

### `get_library_folder`

Gets the path to the MindManager library folder.

- **Returns:** `Union[str, Dict[str, str]]`: the library folder path or an error dictionary.

### `get_mindmanager_version`

Gets the version of the MindManager application.

- **Returns:** `Union[str, Dict[str, str]]`: the version of the MindManager application or an error dictionary.

### `get_grounding_information`

Extracts grounding information (central topic, selected subtopics) from the mindmap.

- **Args:** `mode` (str): detail level (`'full'`, `'content'`, `'text'`), defaults to `'full'`; `turbo_mode` (bool): enable turbo mode (text only), defaults to `False`.
- **Returns:** `Union[List[str], Dict[str, str]]`: a list containing `[top_most_topic, subtopics_string]` or an error dictionary.

### `serialize_current_mindmap_to_mermaid`

Serializes the currently loaded mindmap to Mermaid format.

- **Args:** `id_only` (bool): if `True`, only include IDs without detailed attributes, defaults to `False`; `mode` (str): detail level (`'full'`, `'content'`, `'text'`), defaults to `'full'`; `turbo_mode` (bool): enable turbo mode (text only), defaults to `False`.
- **Returns:** `Union[str, Dict[str, str]]`: Mermaid-formatted string or an error dictionary.

### `serialize_current_mindmap_to_markdown`

Serializes the currently loaded mindmap to Markdown format.

- **Args:** `include_notes` (bool): if `True`, include notes in the serialization, defaults to `True`; `mode` (str): detail level (`'full'`, `'content'`, `'text'`), defaults to `'full'`; `turbo_mode` (bool): enable turbo mode (text only), defaults to `False`.
- **Returns:** `Union[str, Dict[str, str]]`: Markdown-formatted string or an error dictionary.

### `serialize_current_mindmap_to_json`

Serializes the currently loaded mindmap to a detailed JSON object with ID mapping.

- **Args:** `ignore_rtf` (bool): whether to ignore RTF content, defaults to `True`; `mode` (str): detail level (`'full'`, `'content'`, `'text'`), defaults to `'full'`; `turbo_mode` (bool): enable turbo mode (text only), defaults to `False`.
- **Returns:** `Union[Dict[str, Any], Dict[str, str]]`: JSON-serializable dictionary or an error dictionary.

### `get_versions`

Gets the versions of the MindManager Automation MCP Server components.

- **Returns:** `Dict[str, str]`: a dictionary containing the versions of the components.
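For illustration, here is a minimal sketch of the kind of nested structure `get_mindmap` might return and how a `'text'`-level view can be derived from it. The field names (`"text"`, `"subtopics"`) are assumptions for this example, not the server's actual schema:

```python
# Hypothetical serialized mind map, as get_mindmap might return it.
# The keys "text" and "subtopics" are illustrative assumptions.
mindmap = {
    "text": "Project Plan",
    "subtopics": [
        {"text": "Research", "subtopics": [{"text": "Competitors", "subtopics": []}]},
        {"text": "Design", "subtopics": []},
    ],
}

def to_text_outline(topic, depth=0):
    """Flatten a nested topic dict into indented text lines."""
    lines = ["  " * depth + topic["text"]]
    for sub in topic.get("subtopics", []):
        lines.extend(to_text_outline(sub, depth + 1))
    return lines

print("\n".join(to_text_outline(mindmap)))
```

The recursion mirrors how the detail modes differ: `'text'` keeps only topic titles, while `'full'` would carry additional attributes alongside each node.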
A Model Context Protocol (MCP) server implementation for the `mindm` library, providing a standardized interface to interact with MindManager on Windows and macOS.
## Overview

This server allows you to programmatically interact with MindManager through the Model Context Protocol (MCP), a standardized way to provide context and tools to LLMs. It leverages the `mindm` library to manipulate MindManager documents, topics, relationships, and other mindmap elements.
## Features
- Retrieve mindmap structure and central topics
- Export mindmaps to Mermaid, Markdown, JSON formats to be used in LLM chats
- Get information about MindManager installation and library folders
- Get current selection from MindManager
## Planned Features
- Create new mindmaps from serialized data
- Add, modify, and manipulate topics and subtopics
- Add relationships between topics
- Add tags to topics
- Set document background images
## Requirements

- Python 3.12 or higher
- `mcp` package (Model Context Protocol SDK)
- `mindm` library (included in this project)
- MindManager (supported versions: 23-) installed on Windows or macOS
## Installation

### macOS

```bash
# Clone the repository (if you're using it from a repository)
git clone https://github.com/robertZaufall/mindm-mcp.git
cd mindm-mcp

# create a virtual environment for Python
brew install uv  # if needed
uv pip install -r pyproject.toml

# alternative: manual installation of modules
uv add "mcp[cli]"
uv add fastmcp
uv add markdown-it-py
uv add -U --index-url=https://test.pypi.org/simple/ --extra-index-url=https://pypi.org/simple/ mindm mindm-mcp
```
### Windows

```bash
# Change to the DOS command prompt
cmd

# Clone the repository (if you're using it from a repository)
git clone https://github.com/robertZaufall/mindm-mcp.git
cd mindm-mcp

# create a virtual environment for Python
pip install uv  # if needed
uv pip install -r pyproject.toml

# install Node.js
choco install nodejs  # if you have Chocolatey installed; otherwise install Node.js manually
refreshenv
node -v
npm install -g npx
```
## Usage

### MCP Inspector

```bash
# run the MCP server with the inspector
uv run --with mindm --with fastmcp --with markdown-it-py mcp dev mindm_mcp/server.py
```
### Claude Desktop

#### Local Python file

Adjust the path for the local file as needed.
```json
{
  "mcpServers": {
    "mindm (MindManager)": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mindm>=0.0.4.6",
        "--with",
        "fastmcp",
        "--with",
        "markdown-it-py",
        "/Users/master/git/mindm-mcp/mindm_mcp/server.py"
      ]
    }
  }
}
```
#### Module from package repository

Adjust `VIRTUAL_ENV` as needed.
```json
{
  "mcpServers": {
    "mindm (MindManager)": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mindm>=0.0.4.6",
        "--with",
        "mindm-mcp>=0.0.1.50",
        "--with",
        "fastmcp",
        "--with",
        "markdown-it-py",
        "-m",
        "mindm_mcp.server"
      ],
      "env": {
        "VIRTUAL_ENV": "/Users/master/git/mindm-mcp/.venv"
      }
    }
  }
}
```
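Claude Desktop reads this configuration from `claude_desktop_config.json` (on macOS under `~/Library/Application Support/Claude/`). A quick way to catch JSON typos before restarting Claude Desktop is to parse the file and list the registered server names; a minimal sketch using only the standard library:

```python
import json

def list_mcp_servers(config_text: str) -> list:
    """Parse a Claude Desktop config and return the registered MCP server names.

    Raises json.JSONDecodeError on malformed JSON (e.g. a trailing comma),
    which is a common reason a server silently fails to appear.
    """
    config = json.loads(config_text)
    return sorted(config.get("mcpServers", {}).keys())

# Inline example; in practice, read the file from the Claude Desktop
# config directory instead.
sample = '{"mcpServers": {"mindm (MindManager)": {"command": "uv", "args": []}}}'
print(list_mcp_servers(sample))
```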
Hint: If the MCP server does not show up with the hammer icon on Windows, close Claude Desktop and kill all background processes.
## MCP Tools

The server exposes the following tools through the Model Context Protocol:

### Document Interaction

- `get_mindmap`: retrieves the current mindmap structure from MindManager
- `get_selection`: retrieves the currently selected topics in MindManager
- `get_library_folder`: gets the path to the MindManager library folder
- `get_grounding_information`: extracts grounding information (central topic, selected subtopics) from the mindmap

### Serialization

- `serialize_current_mindmap_to_mermaid`: serializes the currently loaded mindmap to Mermaid format
- `serialize_current_mindmap_to_markdown`: serializes the currently loaded mindmap to Markdown format
- `serialize_current_mindmap_to_json`: serializes the currently loaded mindmap to a detailed JSON object with ID mapping
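As a rough illustration of what the Mermaid serialization produces, here is a simplified generator for Mermaid's `mindmap` diagram syntax. The topic fields and the exact output shape are assumptions for this sketch; the server's actual output may include IDs and attributes:

```python
def to_mermaid(topic, depth=1):
    """Render a nested topic dict as lines of a Mermaid 'mindmap' diagram.

    Mermaid mindmaps express hierarchy purely through indentation,
    so each nesting level just adds two leading spaces.
    """
    lines = ["mindmap"] if depth == 1 else []
    lines.append("  " * depth + topic["text"])
    for sub in topic.get("subtopics", []):
        lines.extend(to_mermaid(sub, depth + 1))
    return lines

# Hypothetical input resembling a serialized mind map.
mindmap = {
    "text": "Central Topic",
    "subtopics": [
        {"text": "Idea A", "subtopics": []},
        {"text": "Idea B", "subtopics": []},
    ],
}
print("\n".join(to_mermaid(mindmap)))
```

Pasting the resulting text into any Mermaid renderer (for example, in an LLM chat that supports Mermaid) draws the map, which is the point of exporting in this format.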
## Platform Support
- Windows: Full support for topics, notes, icons, images, tags, links, relationships, and RTF formatting
- macOS: Support for topics, notes, and relationships (limited support compared to Windows)
## Integration with Claude and other LLMs
This MCP server can be installed in Claude Desktop or other MCP-compatible applications, allowing LLMs to:
- Access mindmap content
- Manipulate mindmaps (coming)
- Create new mindmaps based on LLM-generated content (coming)
## Troubleshooting
- Ensure MindManager is running before starting the server
- For macOS, make sure you allow Claude Desktop to automate MindManager
## Acknowledgements

This project is built upon the `mindm` library, which provides Python interfaces to MindManager on Windows and macOS. It uses the Model Context Protocol (MCP) SDK developed by Anthropic.
## License

MIT License - see the LICENSE file for details.