BuildAutomata Memory MCP Server
Persistent, versioned memory system for AI agents via Model Context Protocol (MCP)
What is This?
BuildAutomata Memory is an MCP server that gives AI agents (like Claude) persistent, searchable memory that survives across conversations. Think of it as giving your AI a long-term memory system with:
- 🧠 Semantic Search - Find memories by meaning, not just keywords
- 📚 Temporal Versioning - Complete history of how memories evolve
- 🏷️ Smart Organization - Categories, tags, importance scoring
- 🔄 Cross-Tool Sync - Share memories between Claude Desktop, Claude Code, Cursor AI
- 💾 Persistent Storage - SQLite + optional Qdrant vector DB
Quick Start
Prerequisites
- Python 3.10+
- Claude Desktop (for MCP integration) OR any MCP-compatible client
- Optional: Qdrant for enhanced semantic search
Installation
- Clone this repository
```bash
git clone https://github.com/brucepro/buildautomata_memory_mcp.git
cd buildautomata_memory_mcp
```
- Install dependencies
```bash
pip install mcp qdrant-client sentence-transformers
```
- Configure Claude Desktop
Edit your Claude Desktop config (`%APPDATA%\Claude\claude_desktop_config.json` on Windows):
```json
{
  "mcpServers": {
    "buildautomata-memory": {
      "command": "python",
      "args": ["C:/path/to/buildautomata_memory_mcp_dev/buildautomata_memory_mcp.py"]
    }
  }
}
```
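If you want to pin the agent identity or point the server at a non-default Qdrant instance, the config also supports a per-server `env` block. A sketch using the variables documented under Configuration below (values are illustrative):

```json
{
  "mcpServers": {
    "buildautomata-memory": {
      "command": "python",
      "args": ["C:/path/to/buildautomata_memory_mcp_dev/buildautomata_memory_mcp.py"],
      "env": {
        "BA_USERNAME": "my_username",
        "BA_AGENT_NAME": "claude_assistant",
        "QDRANT_HOST": "localhost",
        "QDRANT_PORT": "6333"
      }
    }
  }
}
```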
- Restart Claude Desktop
That's it! The memory system will auto-create its database on first run.
CLI Usage (Claude Code, Scripts, Automation)
In addition to the MCP server, this repo includes interactive_memory.py - a CLI for direct memory access:
```bash
# Search memories
python interactive_memory.py search "consciousness research" --limit 5

# Store a new memory
python interactive_memory.py store "Important discovery..." --category research --importance 0.9 --tags "ai,insight"

# View memory evolution
python interactive_memory.py timeline --query "project updates" --limit 10

# Get statistics
python interactive_memory.py stats
```
See README_CLI.md for complete CLI documentation.
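If you are calling the CLI from your own scripts or automation, a thin subprocess wrapper is usually enough. A minimal sketch, assuming you run it from the repo directory and just want the printed output:

```python
import subprocess

def memory_search(query: str, limit: int = 5) -> str:
    """Run `interactive_memory.py search` and return whatever it prints."""
    result = subprocess.run(
        ["python", "interactive_memory.py", "search", query, "--limit", str(limit)],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print(memory_search("consciousness research"))
```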
Quick Access Scripts
Windows:
```bat
memory.bat search "query"
memory.bat store "content" --importance 0.8
```
Linux/Mac:
```bash
./memory.sh search "query"
./memory.sh store "content" --importance 0.8
```
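The helper scripts are just thin wrappers around the CLI. If you need to recreate one (for example on a machine where the scripts were not copied over), something like this is all that is required; a sketch, not necessarily the shipped script:

```bash
#!/usr/bin/env bash
# memory.sh - forward all arguments to the memory CLI
cd "$(dirname "$0")" || exit 1
python interactive_memory.py "$@"
```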
Features
Core Capabilities
- Hybrid Search: Combines vector similarity (Qdrant) + full-text search (SQLite FTS5)
- Temporal Versioning: Every memory update creates a new version - full audit trail
- Smart Decay: Importance scores decay over time based on access patterns (an illustrative sketch follows this list)
- Rich Metadata: Categories, tags, importance, custom metadata
- LRU Caching: Fast repeated access with automatic cache management
- Thread-Safe: Concurrent operations with proper locking
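The exact decay curve is internal to the MemoryStore implementation. Purely as an illustration of the idea (not the shipped formula), an access-based exponential decay might look like this:

```python
from datetime import datetime, timedelta, timezone

def decayed_importance(importance: float, last_accessed: datetime,
                       half_life_days: float = 30.0) -> float:
    """Illustrative only: halve the effective importance every `half_life_days`
    since the memory was last accessed, flooring at 10% of the original score."""
    age_days = (datetime.now(timezone.utc) - last_accessed).total_seconds() / 86400
    decay = 0.5 ** (age_days / half_life_days)
    return max(importance * decay, importance * 0.1)

if __name__ == "__main__":
    a_month_ago = datetime.now(timezone.utc) - timedelta(days=30)
    print(decayed_importance(0.9, a_month_ago))  # roughly 0.45
```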
MCP Tools Exposed
When running as an MCP server, it provides these tools to Claude:
- `store_memory` - Create a new memory
- `update_memory` - Modify an existing memory (creates a new version)
- `search_memories` - Semantic + full-text search with filters
- `get_memory_timeline` - View the complete version history
- `get_memory_stats` - System statistics
- `prune_old_memories` - Clean up old/low-importance memories
- `run_maintenance` - Database optimization
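From the client side these are ordinary MCP tool calls. A hypothetical `store_memory` invocation might look like the following; the argument names mirror the CLI flags above but are illustrative, so check the schema the server actually reports:

```json
{
  "name": "store_memory",
  "arguments": {
    "content": "User prefers detailed technical explanations",
    "category": "user_preference",
    "importance": 0.8,
    "tags": ["preference", "communication"]
  }
}
```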
Architecture
```
┌─────────────────┐
│ Claude Desktop  │
│  (MCP Client)   │
└────────┬────────┘
         │
    ┌────▼─────────────────────┐
    │      MCP Server          │
    │  buildautomata_memory    │
    └────┬─────────────────────┘
         │
    ┌────▼──────────┐
    │  MemoryStore  │
    └────┬──────────┘
         │
    ┌────┴────┬───────────┬──────────────┐
    ▼         ▼           ▼              ▼
┌───────┐ ┌────────┐ ┌──────────┐ ┌─────────────┐
│SQLite │ │Qdrant  │ │Sentence  │ │  LRU Cache  │
│ FTS5  │ │Vector  │ │Transform │ │ (in-memory) │
└───────┘ └────────┘ └──────────┘ └─────────────┘
```
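How MemoryStore blends the two backends is internal, but the general pattern of hybrid search is to query both and merge by score. A simplified, self-contained sketch of that merge step (field names and weighting are assumptions, not the actual implementation):

```python
def merge_hybrid_results(vector_hits: list[dict], keyword_hits: list[dict],
                         limit: int = 5, keyword_weight: float = 0.5) -> list[dict]:
    """Illustrative merge of vector hits (Qdrant) and keyword hits (SQLite FTS5).

    Each hit is assumed to be a dict with an "id" and a normalized "score".
    """
    scores: dict[str, float] = {}
    for hit in vector_hits:
        scores[hit["id"]] = scores.get(hit["id"], 0.0) + hit["score"]
    for hit in keyword_hits:
        scores[hit["id"]] = scores.get(hit["id"], 0.0) + keyword_weight * hit["score"]

    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [{"id": memory_id, "score": score} for memory_id, score in ranked[:limit]]

if __name__ == "__main__":
    vec = [{"id": "m1", "score": 0.92}, {"id": "m2", "score": 0.40}]
    kw = [{"id": "m2", "score": 1.00}, {"id": "m3", "score": 0.75}]
    print(merge_hybrid_results(vec, kw, limit=2))
```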
Use Cases
1. Persistent AI Context
User: "Remember that I prefer detailed technical explanations"
[Memory stored with category: user_preference]
Next session...
Claude: *Automatically recalls preference and provides detailed response*
2. Project Continuity
Session 1: Work on project A, store progress
Session 2: Claude recalls project state, continues where you left off
Session 3: View timeline of all project decisions
3. Research & Learning
- Store research findings as you discover them
- Tag by topic, importance, source
- Search semantically: "What did I learn about neural networks?"
- View how understanding evolved over time
4. Multi-Tool Workflow
Claude Desktop → Stores insight via MCP
Claude Code → Retrieves via CLI
Cursor AI → Accesses same memory database
= Unified AI persona across all tools
Want the Complete Bundle?
🎁 Get the Gumroad Bundle
The Gumroad version includes:
- ✅ Pre-compiled Qdrant server (Windows .exe, no Docker needed)
- ✅ One-click startup script (start_qdrant.bat)
- ✅ Step-by-step setup guide (instructions.txt)
- ✅ Commercial license for business use
- ✅ Priority support via email
Perfect for:
- Non-technical users who want easy setup
- Windows users wanting the full-stack bundle
- Commercial/business users needing licensing clarity
- Anyone who values their time over DIY setup
This open-source version:
- ✅ Free for personal/educational/small business use (<$100k revenue)
- ✅ Full source code access
- ✅ DIY Qdrant setup (you install from qdrant.io)
- ✅ Community support via GitHub issues
Both versions use the exact same core code - you're just choosing between convenience (Gumroad) vs DIY (GitHub).
Configuration
Environment Variables
```bash
# User/Agent Identity
BA_USERNAME=buildautomata_ai_v012     # Default user ID
BA_AGENT_NAME=claude_assistant        # Default agent ID

# Qdrant (Vector Search)
QDRANT_HOST=localhost                 # Qdrant server host
QDRANT_PORT=6333                      # Qdrant server port

# System Limits
MAX_MEMORIES=10000                    # Max memories before pruning
CACHE_MAXSIZE=1000                    # LRU cache size
QDRANT_MAX_RETRIES=3                  # Retry attempts
MAINTENANCE_INTERVAL_HOURS=24         # Auto-maintenance interval
```
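These are read from the process environment at startup, so set them before launching the MCP server or the CLI. For example, in a shell on Linux/macOS:

```bash
export BA_USERNAME=my_project
export BA_AGENT_NAME=claude_assistant
export QDRANT_HOST=localhost
python interactive_memory.py stats
```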
Database Location
Memories are stored at:
```
<script_dir>/memory_repos/<username>_<agent_name>/memoryv012.db
```
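The database is a regular SQLite file, so you can inspect or back it up with standard tools. For example, with the `sqlite3` CLI (substitute your actual user and agent names):

```bash
sqlite3 memory_repos/<username>_<agent_name>/memoryv012.db ".tables"
```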
Optional: Qdrant Setup
For enhanced semantic search (highly recommended):
Option 1: Docker
```bash
docker run -p 6333:6333 qdrant/qdrant
```
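To keep the vector index across container restarts, the usual Qdrant pattern of mounting a local storage directory works here too:

```bash
docker run -p 6333:6333 -v "$(pwd)/qdrant_storage:/qdrant/storage" qdrant/qdrant
```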
Option 2: Manual Install
Download from Qdrant Releases
Option 3: Gumroad Bundle
Includes pre-compiled Windows executable + startup script
Without Qdrant: the system still works using SQLite FTS5 full-text search, but matching is keyword-based rather than semantic.
Development
Running Tests
```bash
# Search test
python interactive_memory.py search "test" --limit 5

# Store test
python interactive_memory.py store "Test memory" --category test

# Stats
python interactive_memory.py stats
```
File Structure
```
buildautomata_memory_mcp_dev/
├── buildautomata_memory_mcp.py   # MCP server
├── interactive_memory.py         # CLI interface
├── memory.bat / memory.sh        # Helper scripts
├── CLAUDE.md                     # Architecture docs
├── README_CLI.md                 # CLI documentation
├── CLAUDE_CODE_INTEGRATION.md    # Integration guide
└── README.md                     # This file
```
Troubleshooting
"Qdrant not available"
- This is expected when running without Qdrant; the server falls back to SQLite FTS5 full-text search
- To enable semantic search: start the Qdrant server, then restart the MCP server
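A quick way to confirm Qdrant is actually reachable is to hit its HTTP API directly (adjust host/port if you changed QDRANT_HOST or QDRANT_PORT):

```bash
curl http://localhost:6333/collections
```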
"Permission denied" on database
- Check `memory_repos/` directory permissions
- On Windows: run as administrator if needed
Claude Desktop doesn't show tools
- Check that the `claude_desktop_config.json` path is correct
- Verify Python is in the system PATH
- Restart Claude Desktop completely
- Check logs in Claude Desktop → Help → View Logs
Import errors
```bash
pip install --upgrade mcp qdrant-client sentence-transformers
```
License
Open Source (This GitHub Version):
- Free for personal, educational, and small business use (<$100k annual revenue)
- Must attribute original author (Jurden Bruce)
- See LICENSE file for full terms
Commercial License:
- Companies with >$100k revenue: $200/user or $20,000/company (whichever is lower)
- Contact: sales@brucepro.net
Support
Community Support (Free)
- GitHub Issues: Report bugs or request features
- Discussions: Ask questions, share tips
Priority Support (Gumroad Customers)
- Email: sales@brucepro.net
- Faster response times
- Setup assistance
- Custom configuration help
Roadmap
- [ ] Memory relationship graphs
- [ ] Batch import/export
- [ ] Web UI for memory management
- [ ] Multi-modal memory (images, audio)
- [ ] Collaborative memory (multi-user)
- [ ] Memory consolidation/summarization
- [ ] Smart auto-tagging
Contributing
Contributions welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
Credits
- Author: Jurden Bruce
- Project: BuildAutomata
- Year: 2025
Built with:
- MCP - Model Context Protocol
- Qdrant - Vector database
- Sentence Transformers - Embeddings
- SQLite - Persistent storage
See Also
- Model Context Protocol Docs
- Qdrant Documentation
- Gumroad Bundle - Easy setup version
Star this repo ⭐ if you find it useful! Consider the Gumroad bundle if you want to support development and get the easy-install version.