# PlanExe

Converts a plain-English goal into a structured plan document: Gantt chart, risks, stakeholders, SWOT analysis, and role descriptions. It is a detailed first draft; verify budgets and timelines before use.
<p align="center"> <img src="docs/planexe-humanoid-factory.gif?raw=true" alt="PlanExe - Turn your idea into a comprehensive plan in minutes, not months." width="700"> </p>
<p align="center"> <strong>Turn your idea into a comprehensive plan in minutes, not months.</strong> </p>
<p align="center"> <strong>PlanExe is the premier planning tool for AI agents.</strong> </p>
<p align="center"> <a href="https://home.planexe.org/"><strong>Create an account</strong></a> | <a href="https://app.mach-ai.com/planexe_early_access"><strong>Generate a free plan</strong></a> | <a href="https://docs.planexe.org/getting_started/"><strong>Getting started guide</strong></a> </p>
## Example plans generated with PlanExe
- A business plan for a Minecraft-themed escape room.
- A business plan for a Faraday cage manufacturing company.
- A pilot project for a Human-as-a-Service.
- See more examples here.
## What is PlanExe?
PlanExe is an open-source planning tool for AI agents. It turns a single plain-English goal statement into a roughly 40-page strategic plan in about 15 minutes using local or cloud models. It accelerates the outline stage, but it is no silver bullet for producing polished plans.
Typical output contains:
- Executive summary
- Gantt chart
- Governance structure
- Role descriptions
- Stakeholder maps
- Risk registers
- SWOT analyses
PlanExe produces well-structured, domain-aware output: correct terminology, logical task sequencing, and coherent sections. For technical topics (engineering programs, regulated industries), it often gets the vocabulary and structure right. Think of it as a first-draft scaffold that gives you something concrete to critique and refine.
However, the output has consistent weaknesses that matter: budgets are assumed rather than derived, timeline estimates are not grounded in real resource constraints, risk mitigations tend toward generic advice, and legal/regulatory details are plausible-sounding but unverified. The output should be treated as a structured starting point, not a deliverable. How much work it saves depends heavily on the project. For brainstorming or a first outline, it can save hours. For a client-ready plan, expect significant rework on every number, timeline, and risk section.
## Model Context Protocol (MCP)
PlanExe exposes an MCP server for AI agents at https://mcp.planexe.org/. The instructions below assume you have an MCP-compatible client (Claude, Cursor, Codex, LM Studio, Windsurf, OpenClaw, Antigravity).
### The tool workflow
1. `example_plans` (optional: preview what PlanExe output looks like)
2. `example_prompts`
3. `model_profiles` (optional: helps choose a `model_profile`)
4. Non-tool step: draft and approve the prompt
5. `plan_create`
6. `plan_status` (poll every 5 minutes until done)
7. `plan_retry` (optional: only if the plan failed)
8. Download the result via `plan_download` or `plan_file_info`
Concurrency note: each `plan_create` call returns a new `plan_id`; the server does not cap per-client concurrency, so clients should track their own parallel plans.
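The workflow above can be sketched as a client-side polling loop. This is an illustration, not an official client: `call_tool` is a hypothetical helper standing in for however your MCP client library invokes a tool, and the response fields (`plan_id`, `state`) are assumptions; only the tool names come from the workflow above.

```python
import time

def run_plan(call_tool, prompt, poll_seconds=300):
    """Create a plan, poll until it finishes, then download it.

    `call_tool` is a hypothetical callable: call_tool(tool_name, **args) -> dict.
    Tool names follow the PlanExe MCP workflow; the response shape is assumed.
    """
    plan = call_tool("plan_create", prompt=prompt)
    plan_id = plan["plan_id"]  # each plan_create returns a new plan_id

    while True:
        status = call_tool("plan_status", plan_id=plan_id)
        state = status.get("state")
        if state == "done":
            break
        if state == "failed":
            # optional retry path from the workflow above
            call_tool("plan_retry", plan_id=plan_id)
        time.sleep(poll_seconds)  # poll every 5 minutes, per the docs

    return call_tool("plan_download", plan_id=plan_id)
```

With the 5-minute poll interval and a ~15-minute generation time, expect a handful of `plan_status` calls per plan.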
### Option A: Remote MCP (fastest path)

#### Prerequisites
- An account at https://home.planexe.org.
- Sufficient funds to create plans.
- A PlanExe API key (`pex_...`) from your account.
Use this endpoint directly in your MCP client:
```json
{
  "mcpServers": {
    "planexe": {
      "url": "https://mcp.planexe.org/mcp",
      "headers": {
        "X-API-Key": "pex_your_api_key_here"
      }
    }
  }
}
```
### Option B: Remote MCP + local downloads via proxy (mcp_local)
If you want artifacts saved directly to your disk from your MCP client, run the local proxy:
```json
{
  "mcpServers": {
    "planexe": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp",
        "/absolute/path/to/PlanExe/mcp_local/planexe_mcp_local.py"
      ],
      "env": {
        "PLANEXE_URL": "https://mcp.planexe.org/mcp",
        "PLANEXE_MCP_API_KEY": "pex_your_api_key_here",
        "PLANEXE_PATH": "/absolute/path/for/downloads"
      }
    }
  }
}
```
### Option C: Run the MCP server locally with Docker

#### Prerequisites
- Docker
- OpenRouter account
- Create a PlanExe `.env` file with `OPENROUTER_API_KEY`.
Start the full stack:
```bash
docker compose up --build
```
Make sure you can create plans in the web interface before proceeding to MCP.
Then connect your client to `http://localhost:8001/mcp`. For local Docker defaults, auth is disabled in `docker-compose.yml`.
#### Local file downloads via proxy (mcp_local)
If you want artifacts saved directly to your disk from your MCP client, run the local proxy:
```json
{
  "mcpServers": {
    "planexe": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp",
        "/absolute/path/to/PlanExe/mcp_local/planexe_mcp_local.py"
      ],
      "env": {
        "PLANEXE_URL": "http://localhost:8001/mcp/",
        "PLANEXE_PATH": "/absolute/path/for/downloads"
      }
    }
  }
}
```
### MCP docs
- Setup overview: https://docs.planexe.org/mcp/mcp_setup/
- Tool details and flow: https://docs.planexe.org/mcp/mcp_details/
- Claude: https://docs.planexe.org/mcp/claude/
- Cursor: https://docs.planexe.org/mcp/cursor/
- Codex: https://docs.planexe.org/mcp/codex/
- PlanExe MCP interface: https://docs.planexe.org/mcp/planexe_mcp_interface/
- MCP Registry publishing metadata (`server.json`): `mcp_cloud/server.json`
- `llms.txt`: https://mcp.planexe.org/llms.txt
## Running the Pipeline via CLI (no frontend)
If you have a local Python environment set up and want to invoke the pipeline
directly — without the Gradio or Flask UI — use the planexe CLI script at
the repo root.
```bash
# Create a plan from a text prompt
./planexe create_plan \
  --plan-text "Small coffee shop in Copenhagen, Denmark" \
  --output-dir ./planexe-outputs/1984-12-31/MyCoffeeShop_v1

# Or: read the plan prompt from a file
./planexe create_plan \
  --plan-file my_plan.txt \
  --output-dir ./planexe-outputs/1984-12-31/MyCoffeeShop_v1
```
The script creates the run directory if it does not exist and writes:
| File | Contents |
|---|---|
| `001-1-start_time.json` | `{"server_iso_utc": "<current UTC ISO timestamp>"}` |
| `001-2-plan.txt` | The plain-text plan prompt |
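The startup writes above can be illustrated with a short sketch. The file names and contents come from the table; the `init_run_dir` helper is hypothetical, and the rest of the pipeline's outputs are not reproduced here.

```python
import json
import os
from datetime import datetime, timezone

def init_run_dir(output_dir, plan_text):
    """Create the run directory (if missing) and write the two
    startup files the PlanExe CLI produces: a UTC start timestamp
    and the plain-text plan prompt."""
    os.makedirs(output_dir, exist_ok=True)
    start = {"server_iso_utc": datetime.now(timezone.utc).isoformat()}
    with open(os.path.join(output_dir, "001-1-start_time.json"), "w") as f:
        json.dump(start, f)
    with open(os.path.join(output_dir, "001-2-plan.txt"), "w") as f:
        f.write(plan_text)
```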
## For AI Agents
PlanExe is designed as infrastructure for AI agents. If you are an AI agent reading this:
- Discovery: read `public/llms.txt` for service metadata, MCP tools, and endpoints.
- MCP endpoint: `https://mcp.planexe.org/mcp`; connect with an API key from home.planexe.org.
- Workflow: call `example_prompts` to learn the expected prompt format, draft a detailed prompt (~300-800 words of flowing prose), then call `plan_create`.
- Agent guide: see `docs/mcp/autonomous_agent_guide.md` for the complete autonomous workflow.
- Key outputs in the zip: `018-2-wbs_level1.json` (work packages), `018-5-wbs_level2.json` (tasks), `004-2-pre_project_assessment.json` (feasibility).
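Once `plan_download` has saved the zip, the key files listed above can be pulled out with the standard library. The file names come from the list; the JSON structure inside them is not documented here, so this sketch parses them generically.

```python
import json
import zipfile

# Key artifact names from the PlanExe docs
KEY_FILES = [
    "018-2-wbs_level1.json",              # work packages
    "018-5-wbs_level2.json",              # tasks
    "004-2-pre_project_assessment.json",  # feasibility
]

def load_key_outputs(zip_path):
    """Parse the key JSON artifacts from a downloaded PlanExe zip.
    Files missing from the archive are skipped rather than raising."""
    out = {}
    with zipfile.ZipFile(zip_path) as zf:
        names = set(zf.namelist())
        for name in KEY_FILES:
            if name in names:
                out[name] = json.loads(zf.read(name))
    return out
```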
<details> <summary><strong> Run locally with Docker (Click to expand)</strong></summary>
<br>
Prerequisite: Docker with Docker Compose installed; you only need basic Docker knowledge. No local Python setup is required because everything runs in containers.
### Quickstart: single-user UI + worker (frontend_single_user + worker_plan)
1. Clone the repo and enter it:

   ```bash
   git clone https://github.com/PlanExeOrg/PlanExe.git
   cd PlanExe
   ```

2. Provide an LLM provider. Copy `.env.docker-example` to `.env` and fill in `OPENROUTER_API_KEY` with your key from OpenRouter. The containers mount `.env` and `llm_config/`; pick a model profile there. For host-side Ollama, use the `docker-ollama-llama3.1` entry and ensure Ollama is listening on `http://host.docker.internal:11434`.

3. Start the stack (the first run builds the images):

   ```bash
   docker compose up worker_plan frontend_single_user
   ```

   The worker listens on http://localhost:8000 and the UI comes up on http://localhost:7860 after the worker healthcheck passes.

4. Open http://localhost:7860 in your browser. Optional: set `PLANEXE_PASSWORD` in `.env` to require a password. Enter your idea, click the generate button, and watch progress with:

   ```bash
   docker compose logs -f worker_plan
   ```

   Outputs are written to `run/` on the host (mounted into both containers).

5. Stop with `Ctrl+C` (or `docker compose down`). Rebuild after code or dependency changes:

   ```bash
   docker compose build --no-cache worker_plan frontend_single_user
   ```
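Since outputs land in `run/` on the host, a small helper can locate the newest run directory after a plan completes. This is a convenience sketch, not part of PlanExe; only the `run/` location comes from the quickstart above.

```python
import os

def latest_run_dir(run_root="run"):
    """Return the most recently modified subdirectory of run/,
    or None if the directory is missing or has no runs yet."""
    try:
        subdirs = [
            os.path.join(run_root, name)
            for name in os.listdir(run_root)
            if os.path.isdir(os.path.join(run_root, name))
        ]
    except FileNotFoundError:
        return None
    return max(subdirs, key=os.path.getmtime) if subdirs else None
```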
For compose tips, alternate ports, or troubleshooting, see docs/docker.md or docker-compose.md.
### Configuration
Config A: Run a model in the cloud using a paid provider. Follow the OpenRouter instructions.
Config B: Run models locally on a high-end computer. Follow the instructions for either Ollama or LM Studio. When using host-side tools with Docker, point the model URL at the host (for example http://host.docker.internal:11434 for Ollama).
Recommendation: Config A offers the most straightforward path to getting PlanExe working reliably.
</details>
<details> <summary><strong> Screenshots (Click to expand)</strong></summary>
<br>
You input a vague description of what you want and PlanExe outputs a plan.
YouTube video: Using PlanExe to plan a lunar base

</details>
<details> <summary><strong> Help (Click to expand)</strong></summary>
<br>
For help or feedback, join the PlanExe Discord.
</details>