bm-graph-memory-mcp
A persistent, indexed knowledge graph for AI agents, designed for MCP-compatible platforms.
This project, Brikerman Graph Memory (bm-graph-memory), provides a persistent memory layer for AI models. It allows an AI to build and query a knowledge graph composed of entities, relationships, and observations. The system is designed around a core "index" database that ensures the AI always has access to its most critical information.
Core Concepts
The `main` Database: The System's Core Index
The entire system revolves around the `main` database. It is not just a general-purpose memory; it has a specific and critical function.
- Role: The `main` database acts as the core index and routing table for your entire memory system.
- Contents: It should contain only two types of information:
  - Routing Information: A manifest of all other contexts that exist (e.g., `work`, `personal`). This helps the AI know what other specialized memories it can query.
  - Critical Data: A very small amount of absolutely essential, high-level information that the AI must always be aware of.
- Constant Visibility: The entire contents of the `main` database are automatically appended to the results of every search operation. Whether you search globally or within a specific context, you always get the full `main` database back, ensuring the AI never loses sight of its core index (see the sketch after this list).
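To make this concrete, here is a minimal TypeScript sketch of a search that always appends the core index. The `Entity` shape and the `loadDatabase`/`matchEntities` helpers are illustrative assumptions, not the server's actual internals.

```typescript
interface Entity {
  name: string;
  entityType: string;
  observations: string[];
}

// Assumed helper that loads one context's entities from its memory file.
declare function loadDatabase(context: string): Promise<Entity[]>;

// Naive keyword match over entity names and observations.
function matchEntities(entities: Entity[], query: string): Entity[] {
  const q = query.toLowerCase();
  return entities.filter(
    (e) =>
      e.name.toLowerCase().includes(q) ||
      e.observations.some((o) => o.toLowerCase().includes(q)),
  );
}

async function searchNodes(query: string, context?: string): Promise<Entity[]> {
  const target = context ?? "main";
  const hits = matchEntities(await loadDatabase(target), query);

  // Constant visibility: the full 'main' database is appended to every
  // result set, no matter which context was searched.
  const coreIndex = await loadDatabase("main");
  return [...hits, ...coreIndex];
}
```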
Contexts: Specialized Memories
To keep information organized, you can use contexts. A context is a separate memory file for a specific topic.
- Examples: `work`, `personal`, `project-alpha`.
- Primary Use: Store detailed, topic-specific information in a named context. This keeps the `main` database clean and focused on its core indexing role.
Storage Location
The system determines where to store and retrieve files based on the `MEMORY_FOLDER` environment variable:
- If `MEMORY_FOLDER` is set, all memory files will be stored in that location.
- If `MEMORY_FOLDER` is not set, the system defaults to `~/.bm` in the user's home directory.
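A minimal sketch of this resolution rule (the function name is illustrative):

```typescript
import * as os from "os";
import * as path from "path";

// MEMORY_FOLDER wins if set; otherwise fall back to ~/.bm in the home directory.
function resolveMemoryFolder(): string {
  const configured = process.env.MEMORY_FOLDER;
  if (configured && configured.trim().length > 0) {
    return configured;
  }
  return path.join(os.homedir(), ".bm");
}
```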
The bm_gm Safety System
A consistent naming convention ensures clarity and safety. `bm_gm` stands for Brikerman Graph Memory.
- `.bm` directories: Identifies a folder as a Brikerman Memory storage location.
- `bm_gm_` tool prefixes: Groups all memory functions together for the AI.
- `_bm_gm` safety marker: The first line of every memory file is `{"type":"_bm_gm","source":"brikerman-graph-memory-mcp"}`. The system will refuse to write to any file that doesn't start with this marker, preventing data corruption (see the sketch after this list).
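The marker check could look roughly like the following sketch; the function name and error message are illustrative, not the server's actual code.

```typescript
import { promises as fs } from "fs";

const BM_GM_MARKER = '{"type":"_bm_gm","source":"brikerman-graph-memory-mcp"}';

// Refuse to write to any existing file whose first line is not the _bm_gm
// marker, so the server never clobbers a file it does not own.
async function assertBmGmFile(filePath: string): Promise<void> {
  let firstLine: string;
  try {
    const raw = await fs.readFile(filePath, "utf8");
    firstLine = raw.split("\n", 1)[0].trim();
  } catch {
    return; // A file that does not exist yet is safe to create.
  }
  if (firstLine !== BM_GM_MARKER) {
    throw new Error(`Refusing to write: ${filePath} is not a bm_gm memory file`);
  }
}
```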
Available AI Tools (`bm_gm_*`)
The AI interacts with its memory using the following tools.
Core Tools
- `memory_create_entities`: Adds new entities (like people, places, or concepts) to the knowledge graph.
- `memory_create_relations`: Creates a labeled link between two existing entities.
- `memory_add_observations`: Adds a new piece of text information to an existing entity.
- `memory_search_nodes`: Searches for information using keywords. The results will always include the full contents of the `main` database.
- `memory_read_graph`: Dumps the entire content of one or more databases. The results will always include the full contents of the `main` database.
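Below are hypothetical tool-call payloads showing how an MCP client might invoke these tools. The argument field names (`entities`, `relations`, `from`, `to`, `relationType`, `query`) are assumptions modeled on common knowledge-graph memory servers, not this project's published schema.

```typescript
// Create an entity in the 'work' context (assumed argument shapes).
const createEntities = {
  tool: "bm_gm_memory_create_entities",
  arguments: {
    context: "work",
    entities: [
      { name: "Project Alpha", entityType: "project", observations: ["Kickoff planned for Q3"] },
    ],
  },
};

// Link two existing entities with a labeled relation (assumed argument shapes).
const createRelations = {
  tool: "bm_gm_memory_create_relations",
  arguments: {
    context: "work",
    relations: [{ from: "Alice", to: "Project Alpha", relationType: "leads" }],
  },
};

// Keyword search in 'work'; the full 'main' database is appended to the results.
const searchNodes = {
  tool: "bm_gm_memory_search_nodes",
  arguments: { context: "work", query: "Project Alpha" },
};
```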
Management & Deletion Tools
- `memory_list_databases`: Shows all available memory databases (contexts).
- `memory_delete_entities`: Removes entities from the graph.
- `memory_delete_relations`: Removes specific relationships between entities.
- `memory_delete_observations`: Removes specific observations from an entity.
Common Parameters
- `context` (string): The named database to target (e.g., `work`). If not provided, the operation targets the `main` database.
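For example, the same (hypothetical) observation call with and without `context`; the `entityName` and `observations` fields are assumed names for illustration only.

```typescript
// No context given: the operation targets the 'main' database.
const addToMain = {
  tool: "bm_gm_memory_add_observations",
  arguments: {
    entityName: "Contexts Manifest",
    observations: ["'work' holds detailed project notes"],
  },
};

// Context given: the operation targets the 'work' database instead.
const addToWork = {
  tool: "bm_gm_memory_add_observations",
  arguments: {
    context: "work",
    entityName: "Project Alpha",
    observations: ["Kickoff moved to July"],
  },
};
```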
File Organization Example
`MEMORY_FOLDER` directory:

```
/path/to/memory/folder/
├── memory.jsonl           # The 'main' Database (Core Index)
├── memory-work.jsonl      # Work Context
└── memory-personal.jsonl  # Personal Context
```
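The mapping from context name to file name can be sketched as follows, assuming the layout shown above:

```typescript
import * as path from "path";

// 'main' (or no context) maps to memory.jsonl; any other context maps to
// memory-<context>.jsonl inside the resolved memory folder.
function memoryFilePath(folder: string, context?: string): string {
  const file =
    !context || context === "main" ? "memory.jsonl" : `memory-${context}.jsonl`;
  return path.join(folder, file);
}

// memoryFilePath("/path/to/memory/folder", "work")
//   -> "/path/to/memory/folder/memory-work.jsonl"
```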