Memory Bank MCP

NPM Version Semgrep CE scan License: MIT

<a href="https://glama.ai/mcp/servers/@diaz3618/memory-bank-mcp"> <img width="380" height="200" src="https://glama.ai/mcp/servers/@diaz3618/memory-bank-mcp/badge" /> </a>

An MCP server that gives AI assistants persistent memory across sessions. It stores project context, decisions, and progress in structured markdown files and a knowledge graph, locally or on a remote server via SSH, and supports sequential thinking for better memory storage.

Related repos:

Quick Start

```sh
# Run directly (no install needed)
npx @diazstg/memory-bank-mcp

# Or install globally
npm install -g @diazstg/memory-bank-mcp
```

Via Smithery (Claude Desktop)

```sh
npx -y @smithery/cli install @diazstg/memory-bank-mcp --client claude
```

Configuration

Add to your editor's MCP config (`.vscode/mcp.json`, Cursor, Claude Desktop, etc.):

```json
{
  "servers": {
    "memory-bank-mcp": {
      "command": "npx",
      "args": ["-y", "@diazstg/memory-bank-mcp", "--username", "your-username"],
      "type": "stdio"
    }
  }
}
```

Tip: Including `--username` is highly recommended for proper progress tracking.
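Claude Desktop reads its MCP servers from `claude_desktop_config.json` and uses `mcpServers` as the top-level key instead of `servers`. A minimal sketch, assuming the same npx invocation as above:

```json
{
  "mcpServers": {
    "memory-bank-mcp": {
      "command": "npx",
      "args": ["-y", "@diazstg/memory-bank-mcp", "--username", "your-username"]
    }
  }
}
```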

Common Options

```sh
npx @diazstg/memory-bank-mcp --username "github-user"   # Username for progress tracking (recommended)
npx @diazstg/memory-bank-mcp --mode code                # Set operational mode
npx @diazstg/memory-bank-mcp --path /my/project         # Custom project path
npx @diazstg/memory-bank-mcp --folder my-memory         # Custom folder name (default: memory-bank)
npx @diazstg/memory-bank-mcp --help                     # All options
```

Remote Server (SSH)

Store your Memory Bank on a remote server:

```sh
npx @diazstg/memory-bank-mcp --remote \
  --remote-user username \
  --remote-host example.com \
  --remote-path /home/username/memory-bank \
  --ssh-key ~/.ssh/id_ed25519
```

See Remote Server Guide.

How It Works

Memory Bank stores project context as markdown files in a `memory-bank/` directory:

| File | Purpose |
| --- | --- |
| `product-context.md` | Project overview, goals, tech stack |
| `active-context.md` | Current state, ongoing tasks, next steps |
| `progress.md` | Chronological record of updates |
| `decision-log.md` | Decisions with context and rationale |
| `system-patterns.md` | Architecture and code patterns |
The AI assistant reads these files at the start of each session and updates them as work progresses, maintaining continuity across conversations.
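As an illustration, a `progress.md` entry created by `track_progress` might look like the following. The exact layout is defined by the server; treat this as a hypothetical sketch:

```markdown
## 2025-06-01 (your-username)

- Implemented SSH transport for remote Memory Banks
- Next: add retry handling for dropped connections
```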

MCP Tools

| Tool | Description |
| --- | --- |
| `initialize_memory_bank` | Create a new Memory Bank |
| `get_memory_bank_status` | Check current status |
| `read_memory_bank_file` | Read a specific file |
| `write_memory_bank_file` | Write/update a file |
| `track_progress` | Add a progress entry |
| `log_decision` | Record a decision |
| `update_active_context` | Update current context |
| `switch_mode` | Change operational mode |
| `graph_upsert_entity` | Create or update a knowledge graph entity |
| `graph_add_observation` | Add an observation to an entity |
| `graph_link_entities` | Create a relation between entities |
| `graph_search` | Search entities by name or type |
| `graph_open_nodes` | Get full details of specific entities |
| `graph_compact` | Compact the event log |
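Each of these is a standard MCP tool, invoked with a `tools/call` JSON-RPC request over the stdio transport. A hypothetical call to `track_progress` (the argument names here are illustrative; the real input schema can be listed with `tools/list`):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "track_progress",
    "arguments": {
      "action": "Implemented SSH transport",
      "description": "Added remote storage over SSH with key-based auth"
    }
  }
}
```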

Modes

| Mode | Focus |
| --- | --- |
| `code` | Implementation and development |
| `architect` | System design and planning |
| `ask` | Q&A and information retrieval |
| `debug` | Troubleshooting and diagnostics |
| `test` | Testing and quality assurance |

Modes can be set via CLI (`--mode code`), tool call (`switch_mode`), or `.mcprules-[mode]` files. See Usage Modes.

As a Library

```ts
import { MemoryBankServer } from "@diazstg/memory-bank-mcp";

const server = new MemoryBankServer();
server.run().catch(console.error);
```

Documentation

| Topic | Link |
| --- | --- |
| Getting Started | npx usage, build with Bun, custom folder |
| Guides | Remote server, usage modes, status system, debug MCP |
| Integrations | VS Code/Copilot, Claude Code, Cursor, Cline, Roo Code, generic MCP |
| Reference | MCP protocol, rules format, file naming |
| Development | Architecture, testing, logging |

Alternative: HTTP + PostgreSQL + Redis

The `feature/http-postgres-redis-supabase` branch provides a cloud-native variant that replaces stdio/local-filesystem with HTTP Streamable MCP transport, PostgreSQL (via Supabase) for storage, and Redis for caching. It is deployed exclusively via Docker and is not published to npm. See the branch README for setup instructions.

Contributing

See CONTRIBUTING.md.

License

See LICENSE.
