# @cartisien/engram-mcp
<a href="https://glama.ai/mcp/servers/Cartisien/engram-mcp"> <img width="380" height="200" src="https://glama.ai/mcp/servers/Cartisien/engram-mcp/badge" /> </a>
Persistent semantic memory for AI agents — MCP server powered by @cartisien/engram
Give any MCP-compatible AI client (Claude Desktop, Cursor, Windsurf) persistent memory that survives across sessions.
```bash
npx -y @cartisien/engram-mcp
```
## What it does
Exposes 5 tools to any MCP client:

| Tool | Description |
|---|---|
| `remember` | Store a memory with automatic embedding |
| `recall` | Semantic search across stored memories |
| `history` | Recent conversation history |
| `forget` | Delete one memory, a session, or entries before a date |
| `stats` | Memory statistics for a session |
Memories are stored in SQLite. Semantic search uses local Ollama embeddings (nomic-embed-text) — no API key, no cloud. Falls back to keyword search if Ollama isn't available.
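The recall behavior described above can be sketched as follows. This is a minimal illustration of cosine-similarity ranking with a keyword-overlap fallback; the function names and scoring are hypothetical stand-ins, not the package's actual internals:

```typescript
// Sketch: rank stored memories by cosine similarity when embeddings
// exist, otherwise fall back to keyword overlap with the query.
type Memory = { content: string; embedding?: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function keywordScore(query: string, content: string): number {
  // Fraction of query words that appear in the memory's content.
  const words = new Set(query.toLowerCase().split(/\W+/).filter(Boolean));
  const hits = content.toLowerCase().split(/\W+/).filter((w) => words.has(w));
  return hits.length / Math.max(words.size, 1);
}

function recall(
  memories: Memory[],
  query: string,
  queryEmbedding?: number[],
  limit = 5,
) {
  const scored = memories.map((m) => ({
    content: m.content,
    similarity: m.embedding && queryEmbedding
      ? cosine(m.embedding, queryEmbedding) // semantic path (Ollama available)
      : keywordScore(query, m.content),     // keyword fallback
  }));
  return scored.sort((a, b) => b.similarity - a.similarity).slice(0, limit);
}
```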
## Quick Start

### Claude Desktop

Add to `~/Library/Application Support/Claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "engram": {
      "command": "npx",
      "args": ["-y", "@cartisien/engram-mcp"],
      "env": {
        "ENGRAM_DB": "~/.engram/memory.db"
      }
    }
  }
}
```
Restart Claude Desktop. You'll see `remember`, `recall`, `history`, `forget`, and `stats` available as tools.
### Cursor / Windsurf

Add to your MCP config:

```json
{
  "mcpServers": {
    "engram": {
      "command": "npx",
      "args": ["-y", "@cartisien/engram-mcp"]
    }
  }
}
```
## Configuration

| Env Var | Default | Description |
|---|---|---|
| `ENGRAM_DB` | `~/.engram/memory.db` | SQLite database path |
| `ENGRAM_EMBEDDING_URL` | `http://localhost:11434` | Ollama base URL for embeddings |
### Local Embeddings (Recommended)

Install Ollama and pull the embedding model:

```bash
ollama pull nomic-embed-text
```
Semantic search activates automatically. Without Ollama, keyword search is used.
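Under the hood, fetching an embedding from Ollama amounts to a single HTTP call. A minimal sketch, assuming Ollama's standard `/api/embeddings` endpoint; the `embed` helper is illustrative, and returning `null` stands in for whatever signal triggers the keyword fallback:

```typescript
// Sketch: request an embedding from a local Ollama instance.
// Resolves to null when Ollama is unreachable or errors, signalling
// the caller to fall back to keyword search.
async function embed(
  text: string,
  baseUrl = "http://localhost:11434",
): Promise<number[] | null> {
  try {
    const res = await fetch(`${baseUrl}/api/embeddings`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
    });
    if (!res.ok) return null;
    const data = await res.json();
    return data.embedding as number[];
  } catch {
    return null; // Ollama not running → keyword fallback
  }
}
```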
## Example Usage

Once connected, your agent can:

```
remember(sessionId="myagent", content="User prefers TypeScript over JavaScript", role="user")

recall(sessionId="myagent", query="what are the user's coding preferences?", limit=5)
# Returns: [{ content: "User prefers TypeScript...", similarity: 0.82 }, ...]

history(sessionId="myagent", limit=10)

stats(sessionId="myagent")
# { total: 42, byRole: { user: 20, assistant: 22 }, withEmbeddings: 42 }
```
## Part of the Cartisien Memory Suite

- `@cartisien/engram` — core memory SDK
- `@cartisien/engram-mcp` — this package, the MCP server
- `@cartisien/extensa` — vector infrastructure (coming soon)
- `@cartisien/cogito` — agent identity & lifecycle (coming soon)
MIT © Cartisien Interactive