# MCP Memory Server
A persistent, local vector memory server for Windsurf, VS Code, and other MCP-compliant editors. It stores and searches project-specific context using LanceDB and local embeddings, so your editor's AI assistant keeps long-term memory across projects without requiring external API keys.
## 📜 Philosophy
- Privacy-first, local-first AI memory: Your data stays on your machine.
- No vendor lock-in: Uses open standards and local files.
- Built for MCP: Designed specifically to enhance Windsurf, Cursor, and other MCP-compatible IDEs.
## ℹ️ Status (v0.1.0)

Stable:

- ✅ Local MCP memory with Windsurf/Cursor
- ✅ Multi-project isolation
- ✅ Ingestion of Markdown docs

Not stable yet:

- 🚧 Auto-ingest (file watching)
- 🚧 Memory pruning
- 🚧 Remote sync
Note: This server uses MCP stdio transport (not HTTP) to match Windsurf/Cursor's native MCP integration. Do not try to connect via `curl`.
## 🏥 Health Check

To verify the server binary runs correctly:

```bash
# From within the virtual environment
python -m mcp_memory.server --help
```
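Beyond `--help`, you can smoke-test the stdio transport itself with a short script. This is a minimal sketch assuming the server implements the standard MCP initialize handshake over newline-delimited JSON-RPC (the message fields come from the MCP spec, not this repo):

```python
import json
import subprocess

# Spawn the server exactly as the editor would (stdio transport).
proc = subprocess.Popen(
    [".venv/bin/python", "-m", "mcp_memory.server"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# Standard MCP initialize request (JSON-RPC 2.0 over stdio).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "smoke-test", "version": "0.0.0"},
    },
}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()

# A healthy server replies with its own capabilities on the next line.
print(proc.stdout.readline())
proc.terminate()
```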
## ⚡ Quickstart (5-Minute Setup)

### 1. Clone and Setup

```bash
git clone https://github.com/iamjpsharma/MCPServer.git
cd MCPServer/mcp-memory-server

# Create and activate virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install dependencies
pip install -e .
```
### 2. Configure Windsurf / VS Code

Add this to your `mcpServers` configuration (e.g., `~/.codeium/windsurf/mcp_config.json`):

Note: Replace `/ABSOLUTE/PATH/TO/...` with the actual full path to this directory.
```json
{
  "mcpServers": {
    "memory": {
      "command": "/ABSOLUTE/PATH/TO/mcp-memory-server/.venv/bin/python",
      "args": ["-m", "mcp_memory.server"],
      "env": {
        "MCP_MEMORY_PATH": "/ABSOLUTE/PATH/TO/mcp-memory-server/mcp_memory_data"
      }
    }
  }
}
```
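If the editor reports that it cannot start the server, it can help to reproduce its exact launch by hand, including the environment variable. A minimal sketch, reusing the same placeholder paths as the config above (substitute your real ones):

```python
import os
import subprocess

# Placeholder paths -- use the same absolute paths as in mcp_config.json.
VENV_PYTHON = "/ABSOLUTE/PATH/TO/mcp-memory-server/.venv/bin/python"
DATA_PATH = "/ABSOLUTE/PATH/TO/mcp-memory-server/mcp_memory_data"

# Launch the server the way the editor would; --help makes it exit immediately.
subprocess.run(
    [VENV_PYTHON, "-m", "mcp_memory.server", "--help"],
    env={**os.environ, "MCP_MEMORY_PATH": DATA_PATH},
    check=True,
)
```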
## 📖 Usage

### 1. Ingestion (Adding Context)

Use the included helper script `ingest.sh` to add files to a specific project.

```bash
# ingest.sh <project_name> <file1> <file2> ...

# Example: Project "Thaama"
./ingest.sh project-thaama \
    docs/architecture.md \
    src/main.py

# Example: Project "OpenClaw"
./ingest.sh project-openclaw \
    README.md \
    CONTRIBUTING.md
```
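Conceptually, ingestion is chunk → embed → store. The sketch below shows that pipeline using LanceDB and the `all-MiniLM-L6-v2` model named elsewhere in this README; the function name, chunk size, and table layout are illustrative assumptions, not this repo's actual internals:

```python
import lancedb
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # local embedding model
db = lancedb.connect("mcp_memory_data")          # persistent on-disk vector store

def ingest(project_id: str, doc_id: str, text: str) -> None:
    # Naive fixed-size chunking; the roadmap lists semantic chunking as future work.
    chunks = [text[i : i + 1000] for i in range(0, len(text), 1000)]
    rows = [
        {
            "id": f"{doc_id}#{n}",
            "text": chunk,
            "vector": model.encode(chunk).tolist(),
        }
        for n, chunk in enumerate(chunks)
    ]
    # One table per project_id keeps projects isolated from each other.
    if project_id in db.table_names():
        db.open_table(project_id).add(rows)
    else:
        db.create_table(project_id, data=rows)

ingest("project-thaama", "docs/architecture.md", open("docs/architecture.md").read())
```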
💡 Project ID Naming Convention

It is recommended to use a consistent prefix for your project IDs to avoid collisions:

- `project-thaama`
- `project-openclaw`
- `project-myapp`
### 2. Connect in Editor

Once configured, the following tools will be available to the AI assistant:

- `memory_search(project_id, q)`: Semantic search for "project-thaama", "project-openclaw", etc.
- `memory_add(project_id, id, text)`: Manual addition of memory fragments.

The AI will effectively have "long-term memory" of the files you ingested.
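For intuition about what `memory_search` does with these arguments, here is a sketch of a semantic search over the same LanceDB layout as the ingestion sketch above; again, the internals are assumptions, not the server's actual code:

```python
import lancedb
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")
db = lancedb.connect("mcp_memory_data")

def memory_search(project_id: str, q: str, k: int = 5) -> list[str]:
    # Embed the query with the same model used at ingestion time,
    # then run a nearest-neighbour search in that project's table.
    table = db.open_table(project_id)
    hits = table.search(model.encode(q).tolist()).limit(k).to_list()
    return [hit["text"] for hit in hits]

print(memory_search("project-thaama", "How does the architecture handle auth?"))
```

Because each project gets its own table, passing the right `project_id` is what keeps results from different codebases from bleeding into each other.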
## 🐛 Troubleshooting

- "No MCP server found" or connection errors:
  - Check the output of `pwd` to ensure your absolute paths in `mcp_config.json` are 100% correct.
  - Ensure the virtual environment (`.venv`) is created and dependencies are installed.
- Wrong `project_id` used:
  - The AI sometimes guesses the project ID. You can explicitly tell it: "Use project_id 'project-thaama'".
- Embedding model downloads:
  - On the first run, the server downloads the `all-MiniLM-L6-v2` model (approx. 100 MB). This may cause a slight delay on the first request; see the warm-up snippet after this list.
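If that first-request delay is a problem, you can warm the model cache before connecting the editor. This assumes the server loads the model via sentence-transformers, which the model name suggests but the README does not state outright:

```python
from sentence_transformers import SentenceTransformer

# Downloads and caches all-MiniLM-L6-v2 (~100 MB) so the first
# MCP request does not stall on a model download.
SentenceTransformer("all-MiniLM-L6-v2")
```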
## 📁 Repo Structure

```
/
├── src/mcp_memory/
│   ├── server.py        # Main MCP server entry point
│   ├── ingest.py        # Ingestion logic
│   └── db.py            # LanceDB wrapper
├── ingest.sh            # Helper script
├── requirements.txt     # Top-level dependencies
├── pyproject.toml       # Package config
├── mcp_memory_data/     # Persistent vector storage (gitignored)
└── README.md
```
## 🗺️ Roadmap
- [x] Local vector storage (LanceDB)
- [x] Multi-project isolation
- [x] Markdown ingestion
- [ ] Improved chunking strategies (semantic chunking)
- [ ] Support for PDF ingestion
- [ ] Optional HTTP transport wrapper