🐋 Docker MCP server
An MCP server for managing Docker with natural language!
🪩 What can it do?
- 🚀 Compose containers with natural language
- 🔍 Introspect & debug running containers
- 📀 Manage persistent data with Docker volumes
❓ Who is this for?
- Server administrators: connect to remote Docker engines, e.g. to manage a public-facing website.
- Tinkerers: run containers locally and experiment with open-source apps supporting Docker.
- AI enthusiasts: push the limits of what an LLM is capable of!
Demo
A quick demo showing a WordPress deployment using natural language:
https://github.com/user-attachments/assets/65e35e67-bce0-4449-af7e-9f4dd773b4b3
🏎️ Quickstart
Install
Claude Desktop
On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
<details> <summary>Install from PyPI with uv</summary>
If you don't have uv installed, follow the uv installation instructions for your system.
Then add the following to your MCP servers file:
"mcpServers": {
"mcp-server-docker": {
"command": "uvx",
"args": [
"mcp-server-docker"
]
}
}
</details>
<details> <summary>Install with Docker</summary>
Purely for convenience, the server can run in a Docker container.
After cloning this repository, build the Docker image:
```sh
docker build -t mcp-server-docker .
```
And then add the following to your MCP servers file:
"mcpServers": {
"mcp-server-docker": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-v",
"/var/run/docker.sock:/var/run/docker.sock",
"mcp-server-docker:latest"
]
}
}
Note that we mount the Docker socket as a volume; this ensures the MCP server can connect to and control the local Docker daemon.
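If you want to sanity-check what that mount provides, the same connection can be made explicitly with the Python Docker SDK, which this server is built on. The snippet below is a standalone sketch, not code from this repository:

```python
# Standalone check (not part of this repo): talk to the daemon through the
# socket path that the volume mount above exposes inside the container.
import docker

client = docker.DockerClient(base_url="unix:///var/run/docker.sock")
print(client.ping())                  # True if the daemon is reachable
print(client.version()["Version"])    # Docker Engine version string
```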
</details>
📝 Prompts
🎻 docker_compose
Use natural language to compose containers. See above for a demo.
Provide a project name and a description of the desired containers, and let the LLM do the rest.
This prompt instructs the LLM to enter a plan+apply loop. Your interaction with the LLM will involve the following steps:
- You give the LLM instructions for which containers to bring up
- The LLM calculates a concise natural language plan and presents it to you
- You either:
- Apply the plan
- Provide the LLM feedback, and the LLM recalculates the plan
Examples
- name: nginx, containers: "deploy an nginx container exposing it on port 9000"
- name: wordpress, containers: "deploy a WordPress container and a supporting MySQL container, exposing WordPress on port 9000"
Resuming a Project
When starting a new chat with this prompt, the LLM will receive the status of any containers, volumes, and networks created with the given project name.
This is mainly useful for cleanup, in case you lose a chat that was responsible for many containers.
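How the server associates resources with a project isn't spelled out here; one plausible mechanism, sketched below with the Python Docker SDK, is a label keyed on the project name. The label key is purely hypothetical:

```python
# Illustrative sketch only: locate a project's resources by label.
# "com.example.project" is a made-up label key, not the one this server uses.
import docker

client = docker.from_env()
label_filter = {"label": "com.example.project=wordpress"}

containers = client.containers.list(all=True, filters=label_filter)
volumes = client.volumes.list(filters=label_filter)
networks = client.networks.list(filters=label_filter)
print([c.name for c in containers], [v.name for v in volumes], [n.name for n in networks])
```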
📔 Resources
The server implements a couple of resources for every container:
- Stats: CPU, memory, etc. for a container
- Logs: tail some logs from a container
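Both resources map closely onto calls in the Python Docker SDK. The snippet below sketches those underlying calls and is not the server's own code; the container name is made up:

```python
# Sketch of the SDK calls these resources correspond to (container name is hypothetical).
import docker

client = docker.from_env()
container = client.containers.get("my-nginx")

stats = container.stats(stream=False)   # one-shot snapshot: CPU, memory, network, ...
logs = container.logs(tail=50)          # last 50 log lines, returned as bytes
print(stats["memory_stats"].get("usage"), logs.decode(errors="replace"))
```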
🔨 Tools
Containers
- list_containers
- create_container
- run_container
- recreate_container
- start_container
- fetch_container_logs
- stop_container
- remove_container
Images
- list_images
- pull_image
- push_image
- build_image
- remove_image
Networks
- list_networks
- create_network
- remove_network
Volumes
- list_volumes
- create_volume
- remove_volume
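As a rough orientation, here is how a few of these tools could translate into Python Docker SDK calls. This is an illustrative sketch; the container name, image, and port mapping are example values, not the server's actual implementation:

```python
# Illustrative mapping from tool names to Docker SDK calls (not the server's code).
import docker

client = docker.from_env()

# list_containers
for c in client.containers.list(all=True):
    print(c.name, c.status)

# run_container: roughly what "deploy an nginx container on port 9000" implies
container = client.containers.run(
    "nginx:latest", detach=True, name="nginx-demo", ports={"80/tcp": 9000}
)

# fetch_container_logs, stop_container, remove_container
print(container.logs(tail=10).decode(errors="replace"))
container.stop()
container.remove()
```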
🚧 Disclaimers
Sensitive Data
DO NOT CONFIGURE CONTAINERS WITH SENSITIVE DATA. This includes API keys, database passwords, etc.
Any sensitive data exchanged with the LLM is inherently compromised, unless the LLM is running on your local machine.
If you are interested in securely passing secrets to containers, file an issue on this repository with your use-case.
Reviewing Created Containers
Be careful to review the containers that the LLM creates. Docker is not a secure sandbox, and therefore the MCP server can potentially impact the host machine through Docker.
For safety reasons, this MCP server doesn't support sensitive Docker options like --privileged or --cap-add/--cap-drop. If these features are of interest to you, file an issue on this repository with your use-case.
🛠️ Configuration
This server uses the Python Docker SDK's from_env method. For configuration details, see the documentation.
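In practice, from_env honours the standard Docker environment variables (DOCKER_HOST, DOCKER_TLS_VERIFY, DOCKER_CERT_PATH), so pointing the server at a remote engine is mostly a matter of environment configuration. A minimal sketch, using a non-routable example address:

```python
# Minimal sketch: from_env() picks up DOCKER_HOST and TLS settings from the
# environment. The address below is an example value, not a real endpoint.
import os
import docker

os.environ["DOCKER_HOST"] = "tcp://192.0.2.10:2375"  # use TLS (2376 + certs) for real remotes
client = docker.from_env()
print(client.ping())
```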
💻 Development
Prefer using Devbox to configure your development environment.
See the devbox.json for helpful development commands.
After setting up devbox you can configure your Claude MCP config to use it:
"docker": {
"command": "/path/to/repo/.devbox/nix/profile/default/bin/uv",
"args": [
"--directory",
"/path/to/repo/",
"run",
"mcp-server-docker"
]
},