Kustomize MCP
An MCP server designed to help AI models refactor Kubernetes configurations by analyzing Kustomize dependencies and rendering manifest diffs across environments. It provides tools for computing file dependencies, rendering overlays, and comparing configuration changes through a checkpointing system.
Why? Kustomize manifests depend on each other in non-obvious ways, so it's hard for a model to understand how a configuration change may impact multiple environments. This MCP server provides extra tools to make refactoring safer:
- Compute dependencies of a manifest
- Render the end result of Kustomize overlays
- Provide full and summarized diffs between overlays across directories and checkpoints.
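As an illustration of why dependencies are non-obvious, consider a minimal base/overlay layout (the paths and patch below are invented for this example): a change to the base Deployment silently flows into every overlay that references it.

```yaml
# base/kustomization.yaml
resources:
  - deployment.yaml

# overlays/prod/kustomization.yaml -- depends on ../../base,
# so editing base/deployment.yaml also changes the rendered prod output
resources:
  - ../../base
patches:
  - path: replicas-patch.yaml
```

Tracing these `resources` and `patches` references is what the dependency and diff tools below automate.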
Available Tools
- create_checkpoint: Creates a checkpoint where rendered configuration will be stored
- clear_checkpoint: Clears all checkpoints or a specific checkpoint
- render: Renders Kustomize configuration and saves it in a checkpoint
- diff_checkpoints: Compares all rendered configuration across two checkpoints
- diff_paths: Compares two Kustomize configurations rendered in the same checkpoint
- dependencies: Returns dependencies for a Kustomization file
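Conceptually, the diff tools compare rendered manifests between two checkpoints or paths. A minimal sketch of that comparison using Python's `difflib` (the manifests below are invented; the real server renders them with kustomize first):

```python
import difflib

# Hypothetical rendered output of the same Deployment in two overlays.
staging = """\
apiVersion: apps/v1
kind: Deployment
metadata:
  name: app
spec:
  replicas: 1
"""

prod = """\
apiVersion: apps/v1
kind: Deployment
metadata:
  name: app
spec:
  replicas: 3
"""

# Produce a unified diff between the two rendered manifests.
diff = "".join(
    difflib.unified_diff(
        staging.splitlines(keepends=True),
        prod.splitlines(keepends=True),
        fromfile="staging",
        tofile="prod",
    )
)
print(diff)
```

The diff highlights only the lines that differ (here, `replicas`), which is what makes cross-environment impact easy to review.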
Running the Server
[!NOTE] This requires access to your local file system, similar to how the filesystem MCP server works.
Using Docker
Run the server in a container (using the pre-built image):
docker run -i --rm -v "$(pwd):/workspace" ghcr.io/mbrt/kustomize-mcp:latest
The Docker image includes:
- Python 3.13 with all project dependencies
- kustomize (latest stable)
- helm (latest stable)
- git
Mount your Kustomize configurations to the /workspace directory in the
container to work with them.
If you want to rebuild the image from source:
docker build -t my-kustomize-mcp:latest .
And use that image instead of ghcr.io/mbrt/kustomize-mcp.
Using UV (Local Development)
Start the MCP server:
uv run server.py
The server starts using the STDIO transport.
Usage with MCP clients
To integrate with VS Code, add the configuration to your user-level MCP
configuration file. Open the Command Palette (Ctrl + Shift + P) and run MCP: Open User Configuration. This will open your user mcp.json file where you can
add the server configuration.
{
"servers": {
"kustomize": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"--mount", "type=bind,src=${workspaceFolder},dst=/workspace",
"ghcr.io/mbrt/kustomize-mcp:latest"
]
}
}
}
To integrate with Claude Desktop, add this to your claude_desktop_config.json:
{
"mcpServers": {
"kustomize": {
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"-a", "stdin",
"-a", "stdout",
"-v", "<PROJECT_DIR>:/workspace",
"ghcr.io/mbrt/kustomize-mcp:latest"
]
}
}
}
Replace <PROJECT_DIR> with the root directory of your project.
To integrate with Gemini CLI, edit .gemini/settings.json:
{
"mcpServers": {
"kustomize": {
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"-a", "stdin",
"-a", "stdout",
"-v", "${PWD}:/workspace",
"ghcr.io/mbrt/kustomize-mcp:latest"
]
}
}
}
Testing the Server
Run unit tests:
pytest
After starting the server in one shell, use the dev tool to verify that it is working:
uv run mcp dev ./server.py