MCPMem

Enables AI assistants to store and retrieve memories with semantic search capabilities using vector embeddings. Provides persistent memory storage with a SQLite backend for context retention across conversations.

A robust Model Context Protocol (MCP) tool for storing and searching memories with semantic search capabilities using SQLite and embeddings.

Author

Jay Simons - https://yaa.bz

Features

  • 🧠 Memory Storage: Store text-based memories with metadata
  • 🔍 Semantic Search: Find memories by meaning, not just keywords
  • Vector Embeddings: Uses OpenAI's embedding models for semantic understanding
  • 🗄️ SQLite Backend: Lightweight, local database with vector search capabilities
  • 🔧 MCP Compatible: Works with any MCP-compatible AI assistant
  • 💻 CLI Tools: Full command-line interface for memory management
  • 📦 Easy Installation: Install via npm and start using immediately
  • ⚙️ Flexible Config: Use config files or environment variables

Installation

Global Installation (Recommended)

npm install -g mcpmem@latest

Quick Start

Option 1: Using Environment Variables (Simplest)

# Set your API key
export OPENAI_API_KEY=sk-your-openai-api-key-here

# Optional: Customize model and database path
export OPENAI_MODEL=text-embedding-3-small
export MCPMEM_DB_PATH=/path/to/memories.db

# Test the configuration
mcpmem test

# Start using the CLI or MCP server
mcpmem stats

Option 2: Using Configuration File

  1. Initialize configuration:

    mcpmem init
    

    This creates mcpmem.config.json and updates .gitignore.

  2. Edit the configuration file and add your OpenAI API key:

    {
      "embedding": {
        "provider": "openai", 
        "apiKey": "your-openai-api-key-here",
        "model": "text-embedding-3-small"
      },
      "database": {
        "path": "./mcpmem.db"
      }
    }
    
  3. Test the configuration:

    mcpmem test
    

CLI Usage

MCPMem provides a comprehensive command-line interface for managing memories:

📝 Storing Memories

# Store a simple memory
mcpmem store "Remember to review the quarterly reports"

# Store memory with metadata
mcpmem store "API endpoint returns 500 errors" -m '{"project":"web-app","severity":"high"}'

🔍 Searching Memories

# Semantic search 
mcpmem search "database issues"

# Custom limits and thresholds
mcpmem search "bugs" --limit 5 --threshold 0.8

📋 Listing Memories

# Show recent memories
mcpmem list

# Show more memories
mcpmem list --limit 50

🔍 Getting Specific Memory

# Get memory details by ID
mcpmem get abc123-def456-789

🗑️ Deleting Memories

# Delete with confirmation
mcpmem delete abc123-def456-789

# Force delete (no confirmation)
mcpmem delete abc123-def456-789 --force

# Clear all memories (with confirmation)
mcpmem clear

# Force clear all memories (no confirmation)
mcpmem clear --force

📊 Database Info

# Show database statistics
mcpmem stats

# Show database file location and details
mcpmem ls_db

📚 Help

# Show all available commands
mcpmem --help

# Show detailed examples and usage
mcpmem help-commands

# Get help for a specific command
mcpmem search --help

MCP Server Usage

Using with Cursor/Claude Desktop

Add to your MCP configuration file:

With Environment Variables (Recommended)

{
  "mcpServers": {
    "mcpmem": {
      "command": "mcpmem",
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key-here",
        "OPENAI_MODEL": "text-embedding-3-small",
        "MCPMEM_DB_PATH": "/path/to/memories.db"
      }
    }
  }
}

Available MCP Tools

When running as an MCP server, the following tools are available (a client-side sketch follows the list):

  • store_memory: Store a new memory with optional metadata
  • search_memories: Search memories using semantic similarity
  • get_memory: Retrieve a specific memory by ID
  • get_all_memories: Get all memories (most recent first)
  • update_memory: Update an existing memory
  • delete_memory: Delete a memory by ID
  • get_memory_stats: Get statistics about the memory database
  • get_version: Get the version of mcpmem
  • ls_db: Show database file location and details
  • clear_all_memories: Delete all memories from the database
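
These tools can be invoked by any MCP client. As a rough client-side sketch, this is what driving mcpmem over stdio could look like with the official TypeScript MCP SDK (@modelcontextprotocol/sdk). The tool argument names (content, metadata, query, limit) are assumptions inferred from the descriptions above rather than a documented schema, so check the listTools() output for the exact shapes.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the mcpmem server over stdio; the current environment is passed
// through so PATH and OPENAI_API_KEY reach the server process.
const transport = new StdioClientTransport({
  command: "mcpmem",
  env: { ...(process.env as Record<string, string>) },
});

const client = new Client({ name: "mcpmem-example", version: "1.0.0" });
await client.connect(transport);

// Store a memory with optional metadata (argument names are illustrative).
await client.callTool({
  name: "store_memory",
  arguments: {
    content: "Fixed authentication timeout issue in production",
    metadata: { severity: "high", environment: "production" },
  },
});

// Search memories by semantic similarity.
const results = await client.callTool({
  name: "search_memories",
  arguments: { query: "authentication problems", limit: 5 },
});
console.log(results);

await client.close();

In practice, MCP hosts such as Cursor or Claude Desktop handle this wiring for you via the configuration shown above; the sketch is only meant to make the tool surface concrete.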

Examples

CLI Examples

# Store project-related memories
mcpmem store "Fixed the authentication bug in user login" -m '{"project":"web-app","type":"bug-fix"}'
mcpmem store "Meeting notes: Discussed Q4 roadmap priorities" -m '{"type":"meeting","quarter":"Q4"}'

# Search for memories
mcpmem search "authentication issues"
mcpmem search "meeting" --limit 3

# Manage memories
mcpmem list --limit 10
mcpmem get memory-id-here
mcpmem delete old-memory-id --force
mcpmem clear --force

MCP Usage Examples

When connected to an MCP-compatible assistant:

Assistant: I'll help you store that memory about the bug fix.

*Uses store_memory tool*
- Content: "Fixed authentication timeout issue in production"
- Metadata: {"severity": "high", "environment": "production"}

Memory stored successfully with ID: abc123-def456
Assistant: Let me search for previous issues related to authentication.

*Uses search_memories tool with query "authentication problems"*

Found 3 related memories:
1. Fixed authentication timeout issue (similarity: 85%)
2. Updated auth middleware configuration (similarity: 78%)
3. Resolved login redirect bug (similarity: 72%)

Development

Building

# Install dependencies
pnpm install

# Build the project
pnpm build

# Type checking
pnpm tc

Testing

# Run tests
pnpm test

# Test configuration
mcpmem test

Database

MCPMem uses SQLite with the sqlite-vec extension for vector similarity search. The database schema includes:

  • memories: Stores memory content, metadata, and timestamps
  • embeddings: Stores vector embeddings for semantic search

The database file is created automatically and includes proper indexing for fast retrieval.
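
The exact schema is internal to MCPMem, but the general pattern of pairing a content table with a sqlite-vec virtual table can be sketched as follows. This is a minimal illustration that assumes better-sqlite3 and sqlite-vec's vec0 virtual tables; the column names and types are illustrative, not taken from the project.

import { randomUUID } from "node:crypto";
import Database from "better-sqlite3";
import * as sqliteVec from "sqlite-vec";

// Open (or create) the database file and load the sqlite-vec extension.
const db = new Database("./mcpmem.db");
sqliteVec.load(db);

// Plain table for memory content and metadata.
db.exec(`
  CREATE TABLE IF NOT EXISTS memories (
    id TEXT PRIMARY KEY,                      -- UUID-style ids, as shown by the CLI
    content TEXT NOT NULL,
    metadata TEXT,                            -- JSON-encoded metadata
    created_at TEXT DEFAULT (datetime('now'))
  );
`);

// Virtual table holding one 1536-dimension vector per memory
// (1536 matches text-embedding-3-small).
db.exec(`
  CREATE VIRTUAL TABLE IF NOT EXISTS embeddings USING vec0(
    embedding float[1536]
  );
`);

// Insert a memory and its embedding, linked through SQLite's implicit rowid.
const info = db
  .prepare("INSERT INTO memories (id, content, metadata) VALUES (?, ?, ?)")
  .run(randomUUID(), "Fixed the authentication bug in user login", JSON.stringify({ project: "web-app" }));

const embedding = new Float32Array(1536); // real values come from the embedding model
db.prepare("INSERT INTO embeddings (rowid, embedding) VALUES (?, ?)")
  .run(info.lastInsertRowid, Buffer.from(embedding.buffer));

// KNN search: nearest embeddings to a query vector, smallest distance first.
const queryVector = Buffer.from(new Float32Array(1536).buffer);
const neighbors = db
  .prepare(
    `SELECT rowid, distance
     FROM embeddings
     WHERE embedding MATCH ?
     ORDER BY distance
     LIMIT 5`
  )
  .all(queryVector);

// Map the nearest rowids back to memory rows.
const hits = neighbors.map((n: any) =>
  db.prepare("SELECT id, content, metadata FROM memories WHERE rowid = ?").get(n.rowid)
);
console.log(hits);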

Supported Embedding Models

Currently supports OpenAI embedding models:

  • text-embedding-3-small (1536 dimensions, default)
  • text-embedding-3-large (3072 dimensions)
  • text-embedding-ada-002 (1536 dimensions, legacy)
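
For reference, this is roughly what the underlying embedding call looks like with the official openai package. It illustrates the API that MCPMem builds on rather than MCPMem's internal code.

import OpenAI from "openai";

// Reads OPENAI_API_KEY from the environment by default.
const openai = new OpenAI();

const response = await openai.embeddings.create({
  model: "text-embedding-3-small",
  input: "Fixed the authentication bug in user login",
});

// One vector per input; 1536 dimensions for text-embedding-3-small.
const vector = response.data[0].embedding;
console.log(vector.length); // 1536

Note that switching models changes the vector length (3072 for text-embedding-3-large), so stored embeddings and new search queries must come from the same model.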

Troubleshooting

Common Issues

  1. "OPENAI_API_KEY environment variable is required"

    • Set the environment variable: export OPENAI_API_KEY=sk-...
    • Or add it to your mcpmem.config.json file
  2. "Could not determine executable to run" (with npx)

    • The package might not be published yet
    • Use local installation instead: npm install -g /path/to/mcpmem
  3. Database permission errors

    • Ensure the directory for the database path exists and is writable
    • MCPMem automatically creates parent directories
  4. Vector search not working

    • Ensure you have a valid OpenAI API key
    • Check that embeddings are being generated: mcpmem stats

Debug Commands

# Check configuration and connectivity
mcpmem test

# View database statistics
mcpmem stats

# List recent memories to verify storage
mcpmem list --limit 5

License

MIT

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

For more information and updates, visit the GitHub repository.
