Munin AI Memory
Munin is a high-performance, pragmatic memory layer for AI agents (Cursor, Claude Code, OpenClaw, Gemini CLI, ...). Unlike other solutions, Munin focuses on developer productivity with:
- Multi-Project Support: isolate memories into separate "brains" (Context Cores).
- GraphRAG: automatically builds a knowledge graph from your context.
- Sub-200ms Search: blazing-fast hybrid & semantic search.
🧠 Munin Ecosystem for AI Agents
Give your AI Agents a robust, Long-Term Memory.
Have you ever been frustrated when your AI agent forgets the architectural decisions you made yesterday? Or when it repeats the exact same bug it fixed in the previous session?
Munin is a Full-Stack Long-Term Memory manager powered by GraphRAG. This monorepo contains the official Model Context Protocol (MCP) adapters and SDKs to connect Munin Context Cores to your favorite AI tools—allowing them to build, query, and maintain a persistent knowledge graph of your entire project across endless sessions.
✨ Feature Highlights
Munin isn't just a database; it's a cognitive layer for your AI agents:
- 🛡️ AI Memory Guard: Detects semantic contradictions in your agent's memory to ensure consistency.
- 🕸️ GraphRAG Visualizer: Auto-extracts entities and relationships into interactive 2D neural knowledge graphs and Mermaid-compatible diagrams.
- ⚡ Lower Token Costs: Semantic hybrid search (Vector + Keyword) ensures agents pull only the most relevant snippets, keeping prompts lean and fast.
- 🔐 E2EE With GraphRAG: Industry-leading security. Encrypt your memory end-to-end while maintaining the ability to perform high-performance semantic search (Elite Tier).
- 🕒 Temporal Search: Search by time context—ask "what did we decide last Tuesday?" and get exact answers.
- 📌 Dynamic Pinning: Force-inject critical project context (like coding standards or core architecture) so AI never loses the "big picture".
- 🤝 Cross-Project Sharing: Share selected memories across different projects to reuse logic and context without manual copy-pasting.
- ⌛ Memory TTL: Set expiration windows for temporary context to keep your memory cores clean and noise-free.
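To make two of these features concrete, here is a minimal sketch of how Dynamic Pinning and Memory TTL could interact when context is assembled for a prompt. All names below (`Memory`, `assembleContext`) are illustrative and are not the actual `@kalera` SDK surface:

```typescript
interface Memory {
  id: string;
  text: string;
  pinned?: boolean;   // Dynamic Pinning: always injected into context
  expiresAt?: number; // Memory TTL: epoch ms after which the entry is dropped
}

// In-memory stand-in: drop expired entries, then return pinned memories
// followed by the search hits that survived the TTL filter.
function assembleContext(memories: Memory[], now: number, hits: string[]): Memory[] {
  const live = memories.filter((m) => m.expiresAt === undefined || m.expiresAt > now);
  const pinned = live.filter((m) => m.pinned);
  const matched = live.filter((m) => !m.pinned && hits.includes(m.id));
  return [...pinned, ...matched];
}
```

The point of the sketch: pinned entries bypass search entirely (the "big picture" is always present), while TTL filtering happens before anything reaches the prompt, keeping stale context out.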
🔌 Supported Adapters
This ecosystem provides first-class, plug-and-play MCP adapters for the most popular AI development tools. Choose your platform to get started:
- Claude Code (`@kalera/munin-claude`)
- Cursor & Windsurf (`@kalera/munin-cursor`)
- Gemini CLI (`@kalera/munin-gemini`)
- OpenClaw (`@kalera/munin-openclaw`)
- Kilo (`@kalera/munin-kilo`)
- Antigravity (`@kalera/munin-antigravity`)
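As an illustration, registering one of these adapters usually means adding an entry to your client's MCP configuration. The snippet below follows the common `mcpServers` convention used by MCP clients; the exact file location and shape depend on your tool, so treat this as a sketch rather than verbatim setup instructions:

```json
{
  "mcpServers": {
    "munin": {
      "command": "npx",
      "args": ["-y", "@kalera/munin-claude"]
    }
  }
}
```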
📦 Monorepo Structure
This repository is organized as a pnpm workspace containing the core SDKs, the protocol specification, and all individual adapters:
- Protocol Spec: `packages/spec`
- TypeScript SDK: `packages/ts-sdk`
- Python SDK: `packages/python-sdk`
- First-Class Adapters: `adapters/*`
- Generic MCP Template: `adapters/generic-mcp-template`
- Contract Test Harness: `tests/contract`
- Release Tag Mapping: `docs/release-tags.md`
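For orientation, a layout like the one above implies a `pnpm-workspace.yaml` along these lines. The globs are inferred from the structure listed here, not copied from the repository:

```yaml
# Hypothetical pnpm-workspace.yaml matching the monorepo layout above.
packages:
  - "packages/*"
  - "adapters/*"
  - "tests/contract"
```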
🛠️ Developer Guide
If you are contributing to the Munin Ecosystem, use the following commands to manage the monorepo.
Quick Commands
```shell
pnpm install        # install all workspace dependencies
pnpm lint           # lint every package
pnpm build          # build the SDKs and adapters
pnpm test           # run unit tests
pnpm test:contract  # run the contract test suite
```
Contract Test
Start the mock server (default port 4010):

```shell
pnpm test:contract:mock
```

If the port is occupied, run both the mock server and the tests on another port:

```shell
MUNIN_CONTRACT_PORT=4011 pnpm test:contract:mock
MUNIN_CONTRACT_PORT=4011 pnpm test:contract
```

You can also override the full base URL directly:

```shell
MUNIN_CONTRACT_BASE_URL=http://127.0.0.1:4011 pnpm test:contract
```

By default, the contract runner uses `tests/contract/adapter-manifests/munin-sdk-local.json`. Override it with a custom manifest:

```shell
pnpm test:contract -- tests/contract/adapter-manifests/<manifest>.json
```
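The environment variables above suggest a simple precedence: the full base URL wins outright, otherwise the port (defaulting to 4010) is combined with localhost. A minimal sketch of that resolution logic, with a hypothetical function name that may not match the actual runner:

```typescript
// Sketch of the contract runner's base-URL resolution (illustrative only).
// MUNIN_CONTRACT_BASE_URL takes precedence; otherwise MUNIN_CONTRACT_PORT
// (default "4010") is used against 127.0.0.1.
function resolveBaseUrl(env: Record<string, string | undefined>): string {
  if (env.MUNIN_CONTRACT_BASE_URL) {
    return env.MUNIN_CONTRACT_BASE_URL;
  }
  const port = env.MUNIN_CONTRACT_PORT ?? "4010";
  return `http://127.0.0.1:${port}`;
}
```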
Built with ❤️ by Kalera for the AI Engineering community.