# MEMGRAPH-MCP
A durable multi-agent orchestrator for software development with explicit run graphs, checkpoint/resume capabilities, and project memory exposed through MCP resources and tools. It enables coordinated agent workflows for coding, review, repair, CI, and approval with SQLite-backed memory retrieval and pluggable research backends.
## Agent System
A durable multi-agent orchestrator with:
- explicit run graphs and checkpoint/resume
- orchestrator-controlled parallel delegation
- bounded research swarm execution
- coding, review, repair, CI, and approval loops
- project memory in backing stores exposed through MCP resources and tools
- a real SQLite vector index for memory retrieval
- a pluggable external research backend with Tavily support
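To make the "real SQLite vector index" concrete, here is a minimal stdlib-only sketch of embedding retrieval over SQLite. The actual implementation uses `sqlite-vec`; this version stores vectors as BLOBs and brute-forces cosine similarity, and all table/column names are illustrative:

```python
import math
import sqlite3
import struct

def pack(vec):
    """Serialize a float vector to bytes for a BLOB column."""
    return struct.pack(f"{len(vec)}f", *vec)

def unpack(blob):
    return list(struct.unpack(f"{len(blob) // 4}f", blob))

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE memory (id INTEGER PRIMARY KEY, text TEXT, embedding BLOB)")
entries = {
    "scheduler retries failed nodes": [1.0, 0.0, 0.2],
    "approval gates block merges": [0.0, 1.0, 0.1],
}
for text, vec in entries.items():
    db.execute("INSERT INTO memory (text, embedding) VALUES (?, ?)", (text, pack(vec)))

def recall(query_vec, k=1):
    """Return the k memory texts closest to the query by cosine similarity."""
    rows = db.execute("SELECT text, embedding FROM memory").fetchall()
    scored = sorted(rows, key=lambda r: cosine(query_vec, unpack(r[1])), reverse=True)
    return [text for text, _ in scored[:k]]

print(recall([0.9, 0.1, 0.2]))
```

`sqlite-vec` replaces the Python-side scan with a `vec0` virtual table so the nearest-neighbor search happens inside SQLite, but the storage-and-score shape is the same.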
## Scope
This implementation targets the MCP 2025-11-25 spec baseline with the official Python MCP SDK and a FastMCP server for the memory surface. For local development it runs over stdio. For remote deployment, see docs/remote_auth.md.
## Layout

- `app/runtime`: run state, scheduler, orchestrator loop, checkpointing
- `app/planner`: planning and graph revision helpers
- `app/agents`: node executors for research, code, review, repair, CI, synthesis, approval
- `app/memory`: SQLite-backed memory, retrieval, and artifact index
- `app/mcp_server`: FastMCP resources, tools, prompts, and server entrypoint
- `tests`: acceptance and unit coverage
## Local usage

```shell
uv sync --group dev
uv run pytest
uv run agent-system-mcp
```
## Retrieval and research backends

- Memory entries are indexed into a local SQLite vector table using `sqlite-vec`.
- The default embedding provider is `auto`: it prefers a real `sentence-transformers` model and falls back to the deterministic hash provider only if the model cannot load.
- Research uses an in-memory corpus backend when a node provides `inputs.corpus`.
- If `TAVILY_API_KEY` is set, corpus-free research nodes can use the Tavily backend for external web research.
- If no corpus and no Tavily key are available, research returns bounded empty findings instead of inventing sources.
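A deterministic hash provider can be built from the standard library alone, which is what makes it a safe fallback when the model cannot load. This is an illustrative sketch, not the project's actual hashing scheme:

```python
import hashlib
import math

def hash_embed(text: str, dim: int = 32) -> list[float]:
    """Deterministic fallback embedding: hash each token into a signed
    bucket of a fixed-size vector, then L2-normalize the result."""
    vec = [0.0] * dim
    for token in text.lower().split():
        digest = hashlib.sha256(token.encode()).digest()
        bucket = int.from_bytes(digest[:4], "big") % dim
        sign = 1.0 if digest[4] % 2 == 0 else -1.0
        vec[bucket] += sign
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

# The same input always yields the same vector, so retrieval stays
# stable across runs even with no model download available.
assert hash_embed("checkpoint resume") == hash_embed("checkpoint resume")
```

Hash embeddings have no semantic smoothing (synonyms land in unrelated buckets), which is why the `auto` provider only resorts to them when `sentence-transformers` is unavailable.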
## Embedding configuration

- `AGENT_SYSTEM_EMBEDDING_PROVIDER=auto|sentence-transformers|hash`
- `AGENT_SYSTEM_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2`
- `AGENT_SYSTEM_EMBEDDING_CACHE_DIR=/path/to/cache`
- `AGENT_SYSTEM_EMBEDDING_LOCAL_ONLY=true|false`

Example:

```shell
AGENT_SYSTEM_EMBEDDING_PROVIDER=sentence-transformers uv run agent-system create-run "improve scheduler"
```
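Resolving these variables amounts to a small settings loader. The defaults below mirror the documented values (`auto` provider, MiniLM model); the loader function itself is a hypothetical sketch, not the project's config module:

```python
import os

def load_embedding_config(env=os.environ):
    """Read the embedding settings, falling back to the documented defaults."""
    return {
        "provider": env.get("AGENT_SYSTEM_EMBEDDING_PROVIDER", "auto"),
        "model": env.get(
            "AGENT_SYSTEM_EMBEDDING_MODEL",
            "sentence-transformers/all-MiniLM-L6-v2",
        ),
        "cache_dir": env.get("AGENT_SYSTEM_EMBEDDING_CACHE_DIR"),
        "local_only": env.get("AGENT_SYSTEM_EMBEDDING_LOCAL_ONLY", "false").lower() == "true",
    }

cfg = load_embedding_config({
    "AGENT_SYSTEM_EMBEDDING_PROVIDER": "hash",
    "AGENT_SYSTEM_EMBEDDING_LOCAL_ONLY": "true",
})
print(cfg["provider"], cfg["local_only"])  # hash True
```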
## Local transport

Development uses the MCP stdio transport.
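An MCP client that launches stdio servers can be pointed at the entrypoint with a config entry along these lines (the `agent-system` key name is arbitrary; check your client's documentation for the exact file and schema):

```json
{
  "mcpServers": {
    "agent-system": {
      "command": "uv",
      "args": ["run", "agent-system-mcp"]
    }
  }
}
```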
## Remote deployment
Remote deployment is intentionally documentation-only in v1. The server documents an OAuth 2.1-compatible consent path and keeps local stdio as the default development mode.
## Full documentation
- state.md: current project state, changes since creation, and roadmap
- docs/getting_started.md: fastest path to a working local run and MCP server
- docs/operator_guide.md: full system overview, operations, and best practices
- docs/developer_custom_graphs.md: Python API usage, custom graph design, and node payload reference
- docs/codex_mcp_usage.md: how Codex should use this MCP for large-app planning, memory, and checkpointed execution
- examples/plan_large_app.py: example of generating a large-app blueprint and run programmatically