Knowledge Graph Memory Server
A customized MCP memory server that enables creation and management of a knowledge graph, with features like custom memory file paths and timestamping for capturing interactions with language models.
BRO3886
Tools
- set_memory_file_path: Set the memory file path
- get_current_time: Get the current time
- create_entities: Create multiple new entities in the knowledge graph
- create_relations: Create multiple new relations between entities in the knowledge graph. Relations should be in active voice
- add_observations: Add new observations to existing entities in the knowledge graph
- delete_entities: Delete multiple entities and their associated relations from the knowledge graph
- delete_observations: Delete specific observations from entities in the knowledge graph
- delete_relations: Delete multiple relations from the knowledge graph
- read_graph: Read the entire knowledge graph
- search_nodes: Search for nodes in the knowledge graph based on a query
- open_nodes: Open specific nodes in the knowledge graph by their names
README
Memory Custom
This project adds new features to the Memory server offered by the MCP team. It allows for the creation and management of a knowledge graph that captures interactions via a large language model (LLM).
<a href="https://glama.ai/mcp/servers/w6hi2myrxq"> <img width="380" height="200" src="https://glama.ai/mcp/servers/w6hi2myrxq/badge" alt="Memory Custom MCP server" /> </a>
New Features
1. Custom Memory Paths
- Users can now specify different memory file paths for various projects.
- Why?: This feature enhances organization and management of memory data, allowing for project-specific memory storage.
2. Timestamping
- The server now generates timestamps for interactions.
- Why?: Timestamps enable tracking of when each memory was created or modified, providing better context and history for the stored data. (Both features are illustrated in the example calls below.)
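As a rough sketch of how a client might exercise these two features over MCP, the requests below set a project-specific memory file and then fetch a timestamp. The argument name (path) and the exact payload shape are assumptions for illustration only; check the schemas this server actually reports for its tools.

  {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
      "name": "set_memory_file_path",
      "arguments": { "path": "/path/to/memory/project_name.json" }
    }
  }

  {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
      "name": "get_current_time",
      "arguments": {}
    }
  }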
Getting Started
Prerequisites
- Node.js (version 16 or higher)
Installing via Smithery
To install Knowledge Graph Memory Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @BRO3886/mcp-memory-custom --client claude
Installation
1. Clone the repository:
   git clone git@github.com:BRO3886/mcp-memory-custom.git
   cd mcp-memory-custom
2. Install the dependencies:
   npm install
Configuration
Before running the server, you can set the MEMORY_FILE_PATH environment variable to specify the path for the memory file. If not set, the server will default to using memory.json in the same directory as the script.
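For example, on macOS or Linux you might export the variable before starting the server; the path below is purely illustrative:

   export MEMORY_FILE_PATH=/path/to/memory/project_name.json
   node dist/index.js

On Windows PowerShell, the equivalent is $env:MEMORY_FILE_PATH = "/path/to/memory/project_name.json".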
Running the Server
Updating the MCP server JSON file
Add this to your claude_desktop_config.json / .cursor/mcp.json file:
{
  "mcpServers": {
    "memory": {
      "command": "node",
      "args": ["/path/to/mcp-memory-custom/dist/index.js"]
    }
  }
}
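If you would rather pin the memory file in the client configuration instead of calling set_memory_file_path, Claude Desktop's server entries also accept an env block. A minimal sketch, with an illustrative path:

  {
    "mcpServers": {
      "memory": {
        "command": "node",
        "args": ["/path/to/mcp-memory-custom/dist/index.js"],
        "env": {
          "MEMORY_FILE_PATH": "/path/to/memory/project_name.json"
        }
      }
    }
  }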
System Prompt changes:
Follow these steps for each interaction:
1. The memoryFilePath for this project is /path/to/memory/project_name.json - always pass this path to the memory file operations (when creating entities, relations, or retrieving memory etc.)
2. User Identification:
- You should assume that you are interacting with default_user
- If you have not identified default_user, proactively try to do so.
3. Memory Retrieval:
- Always begin your chat by saying only "Remembering..." and retrieve all relevant information from your knowledge graph
- Always refer to your knowledge graph as your "memory"
4. Memory:
- While conversing with the user, be attentive to any new information that falls into these categories:
a) Basic Identity (age, gender, location, job title, education level, etc.)
b) Behaviors (interests, habits, etc.)
c) Preferences (communication style, preferred language, etc.)
d) Goals (goals, targets, aspirations, etc.)
e) Relationships (personal and professional relationships up to 3 degrees of separation)
5. Memory Update:
- If any new information was gathered during the interaction, update your memory as follows:
a) Create entities for recurring organizations, people, and significant events, adding timestamps wherever required. You can get the current timestamp via get_current_time
b) Connect them to the current entities using relations
c) Store facts about them as observations, add timestamps to observations via get_current_time
IMPORTANT: Provide a helpful and engaging response, asking relevant questions to encourage user engagement. Update the memory during the interaction, if required, based on the new information gathered (point 4).
Running the Server Locally
To start the Knowledge Graph Memory Server, run:
npm run build
node dist/index.js
The server will listen for requests via standard input/output.
API Endpoints
The server exposes several tools that can be called with specific parameters (an example call is sketched after the list below):
- Get Current Time
- Set Memory File Path
- Create Entities
- Create Relations
- Add Observations
- Delete Entities
- Delete Observations
- Delete Relations
- Read Graph
- Search Nodes
- Open Nodes
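As an illustration of what such a call might look like, here is a hedged sketch of a create_entities request over MCP. The argument shape (memoryFilePath, entities, entityType, observations) is an assumption based on the upstream Memory server plus this project's custom path feature; verify it against the schemas returned by the server's tool listing.

  {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
      "name": "create_entities",
      "arguments": {
        "memoryFilePath": "/path/to/memory/project_name.json",
        "entities": [
          {
            "name": "default_user",
            "entityType": "person",
            "observations": ["Prefers concise answers (recorded 2025-01-01T12:00:00Z)"]
          }
        ]
      }
    }
  }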
Acknowledgments
- Inspired by the Memory server from Anthropic.