MCP Memory Server

A persistent, local vector memory server for Windsurf, VS Code, and other MCP-compliant editors. It stores and searches project-specific context using LanceDB and local embeddings, giving your editor long-term memory across projects without requiring external API keys.

🌟 Philosophy

  • Privacy-first, local-first AI memory: Your data stays on your machine.
  • No vendor lock-in: Uses open standards and local files.
  • Built for MCP: Designed specifically to enhance Windsurf, Cursor, and other MCP-compatible IDEs.

ℹ️ Status (v0.1.0)

Stable:

  • ✅ Local MCP memory with Windsurf/Cursor
  • ✅ Multi-project isolation
  • ✅ Ingestion of Markdown docs

Not stable yet:

  • 🚧 Auto-ingest (file watching)
  • 🚧 Memory pruning
  • 🚧 Remote sync

Note: This server uses MCP stdio transport (not HTTP) to match Windsurf/Cursor's native MCP integration. Do not try to connect via curl.

πŸ₯ Health Check

To verify the server binary runs correctly:

# From within the virtual environment
python -m mcp_memory.server --help
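
If --help works but your editor still cannot connect, you can smoke-test the stdio transport directly. This is a rough sketch, assuming the server speaks standard MCP framing (newline-delimited JSON-RPC 2.0); the protocolVersion and clientInfo values are illustrative, not specific to this project.

import json
import subprocess

# Launch the server the same way the editor would (stdio transport).
# Run inside the .venv so "python" resolves to the right interpreter.
proc = subprocess.Popen(
    ["python", "-m", "mcp_memory.server"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# Minimal MCP "initialize" handshake, one JSON-RPC message per line.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # illustrative version string
        "capabilities": {},
        "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
    },
}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()

# A healthy server answers with a JSON-RPC result on the next line.
print(proc.stdout.readline())
proc.terminate()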

✅ Quickstart (5-Minute Setup)

1. Clone and Setup

git clone https://github.com/iamjpsharma/MCPServer.git
cd MCPServer/mcp-memory-server

# Create and activate virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install dependencies
pip install -e .

2. Configure Windsurf / VS Code

Add this to your mcpServers configuration (e.g., ~/.codeium/windsurf/mcp_config.json):

Note: Replace /ABSOLUTE/PATH/TO/... with the actual full path to this directory.

{
  "mcpServers": {
    "memory": {
      "command": "/ABSOLUTE/PATH/TO/mcp-memory-server/.venv/bin/python",
      "args": ["-m", "mcp_memory.server"],
      "env": {
        "MCP_MEMORY_PATH": "/ABSOLUTE/PATH/TO/mcp-memory-server/mcp_memory_data"
      }
    }
  }
}
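
For Cursor, the equivalent configuration typically lives in ~/.cursor/mcp.json and uses the same mcpServers schema.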

🚀 Usage

1. Ingestion (Adding Context)

Use the included helper script ingest.sh to add files to a specific project.

# Usage: ./ingest.sh <project_id> <file1> <file2> ...

# Example: Project "Thaama"
./ingest.sh project-thaama \
  docs/architecture.md \
  src/main.py

# Example: Project "OpenClaw"
./ingest.sh project-openclaw \
  README.md \
  CONTRIBUTING.md
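
Conceptually, ingestion boils down to three steps: chunk each file, embed the chunks locally, and write the vectors to a per-project LanceDB table. The sketch below illustrates the idea with LanceDB and sentence-transformers; function and field names are illustrative, not the project's actual API.

import lancedb
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # local embedding model
db = lancedb.connect("mcp_memory_data")          # persistent local storage

def ingest_file(project_id: str, path: str, chunk_size: int = 1000) -> None:
    text = open(path, encoding="utf-8").read()
    # Naive fixed-size chunking; the real server may chunk differently.
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    rows = [
        {"id": f"{path}:{n}", "text": c, "vector": model.encode(c).tolist()}
        for n, c in enumerate(chunks)
    ]
    # One LanceDB table per project is what gives multi-project isolation.
    if project_id in db.table_names():
        db.open_table(project_id).add(rows)
    else:
        db.create_table(project_id, data=rows)

ingest_file("project-thaama", "docs/architecture.md")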

💡 Project ID Naming Convention

To avoid collisions, use a consistent prefix for your project IDs:

  • project-thaama
  • project-openclaw
  • project-myapp

2. Connect in Editor

Once configured, the following tools will be available to the AI Assistant:

  • memory_search(project_id, q): Semantic search within a single project (e.g., "project-thaama" or "project-openclaw").
  • memory_add(project_id, id, text): Manual addition of memory fragments.

The AI will effectively have "long-term memory" of the files you ingested.
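
Under the hood, memory_search presumably embeds the query with the same local model and runs a vector search against the project's table. A minimal sketch (illustrative names, not the server's actual implementation):

import lancedb
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")
db = lancedb.connect("mcp_memory_data")

def memory_search(project_id: str, q: str, k: int = 5) -> list[str]:
    # Embed the query and return the k nearest chunks by vector distance.
    tbl = db.open_table(project_id)
    hits = tbl.search(model.encode(q).tolist()).limit(k).to_list()
    return [hit["text"] for hit in hits]

print(memory_search("project-thaama", "How is ingestion implemented?"))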

🛠 Troubleshooting

  • "No MCP server found" or Connection errors:

    • Check the output of pwd to ensure your absolute paths in mcp_config.json are 100% correct.
    • Ensure the virtual environment (.venv) is created and dependencies are installed.
  • "Wrong project_id used":

    • The AI sometimes guesses the project ID. You can explicitly tell it: "Use project_id 'project-thaama'".
  • Embedding Model Downloads:

    • On the first run, the server downloads the all-MiniLM-L6-v2 embedding model (approximately 100 MB), so the first request may be noticeably slow; see the snippet below.
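
To warm the cache ahead of time, run this once inside the .venv (assuming the server loads the model via sentence-transformers):

# Pre-download all-MiniLM-L6-v2 (~100 MB) so the first request isn't blocked.
from sentence_transformers import SentenceTransformer
SentenceTransformer("all-MiniLM-L6-v2")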

πŸ“ Repo Structure

/
├── src/mcp_memory/
│   ├── server.py       # Main MCP server entry point
│   ├── ingest.py       # Ingestion logic
│   └── db.py           # LanceDB wrapper
├── ingest.sh           # Helper script
├── requirements.txt    # Top-level dependencies
├── pyproject.toml      # Package config
├── mcp_memory_data/    # Persistent vector storage (gitignored)
└── README.md

🗺️ Roadmap

  • [x] Local vector storage (LanceDB)
  • [x] Multi-project isolation
  • [x] Markdown ingestion
  • [ ] Improved chunking strategies (semantic chunking)
  • [ ] Support for PDF ingestion
  • [ ] Optional HTTP transport wrapper
