# delegations-mcp

An MCP server that exposes a library of delegation prompts as tools. An orchestrator LLM uses these tools to hand off bounded tasks to a smaller LLM or coding agent, which receives a fully constructed, self-contained prompt requiring no broader context.
## Configuration

Config is discovered by walking up from the working directory, then merged with `~/.delegations.toml`. Project config extends global; library blocks merge field-by-field (project overrides global per field).
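The field-by-field merge described above can be sketched as follows. This is a minimal illustration, not the server's actual loader; the helper and config values are hypothetical:

```python
# Sketch of field-by-field config merging: project values win per field,
# and nested blocks (e.g. [library.*]) merge recursively.
def merge(global_cfg: dict, project_cfg: dict) -> dict:
    out = dict(global_cfg)
    for key, value in project_cfg.items():
        if isinstance(value, dict) and isinstance(out.get(key), dict):
            out[key] = merge(out[key], value)  # merge nested blocks per field
        else:
            out[key] = value  # project overrides global
    return out

global_cfg = {"agent": {"executable": "copilot", "output_dir": "/tmp"}}
project_cfg = {"agent": {"output_dir": "./artifacts"}}
print(merge(global_cfg, project_cfg))
# -> {'agent': {'executable': 'copilot', 'output_dir': './artifacts'}}
```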
```toml
# .delegations.toml
[agent]
executable = "copilot"
args = ["--model", "gpt-4o-mini", "--prompt", "Instructions in: {prompt_path}"]
output_dir = "/tmp"  # where prompts and transcripts are written

[library.devteam]
path = "./library/devteam"
test_executable = "/usr/bin/python3"
test_args = ["-m", "pytest"]
```

`{prompt_path}` in agent args is replaced with the path to the rendered prompt file.
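The placeholder expansion can be pictured as plain string replacement over each configured arg. A minimal sketch, assuming that semantics; the prompt path below is hypothetical:

```python
# Expand {prompt_path} in each agent arg before spawning the agent.
args = ["--model", "gpt-4o-mini", "--prompt", "Instructions in: {prompt_path}"]
prompt_path = "/tmp/devteam-implement-prompt.md"
expanded = [a.replace("{prompt_path}", prompt_path) for a in args]
print(expanded[-1])  # -> Instructions in: /tmp/devteam-implement-prompt.md
```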
## Running
The server must run on the host filesystem — the agent it spawns edits files directly and needs access to your project.
### stdio mode (recommended, zero config)
The MCP client spawns the server automatically, inheriting its working directory. Config is discovered from there. No setup required beyond installing the package.
MCP client config (e.g. Claude Desktop):
```json
{
  "mcpServers": {
    "delegations": {
      "command": "uv",
      "args": ["run", "--directory", "/path/to/delegations-mcp", "delegations-mcp"]
    }
  }
}
```
### HTTP mode (persistent session server)
Useful when multiple agents share a session: the registry is loaded once, and the async lock for `lock: true` delegations is shared across all connections.
Run from your project directory (so config discovery finds `.delegations.toml`):

```sh
cd /your/project
delegations-mcp --transport http --port 8000
# or: uv run --directory /path/to/delegations-mcp delegations-mcp --transport http
```
Connect your MCP client to `http://localhost:8000/mcp`. Run one server instance per project: the working directory at startup determines which config and libraries are used.
## Tools exposed

| Tool | Description |
|---|---|
| `list_delegations()` | Lists available delegations; refreshes the registry from disk |
| `get_delegation(name)` | Returns full details and the merged input schema for a delegation |
| `run_delegation(name, inputs)` | Runs a delegation; returns `summary`, `prompt_path`, `transcript_path` |
Delegation names are `library:delegation` (e.g. `devteam:implement`).
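As a sketch of what an orchestrator sends over the wire, here is a hypothetical MCP `tools/call` request for `run_delegation`. The JSON-RPC envelope follows MCP conventions; the `inputs` payload is invented for illustration and would depend on the delegation's actual input schema:

```python
import json

# Hypothetical tools/call request invoking the run_delegation tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "run_delegation",
        "arguments": {
            "name": "devteam:implement",
            "inputs": {"task": "add a --verbose flag to the CLI"},
        },
    },
}
print(json.dumps(request, indent=2))
```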
## Libraries

See `library/devteam/README.md` for the bundled devteam library. See `docs/library-implementation.md` to author your own.