
memU MCP Server
A Model Context Protocol (MCP) server that provides access to memU AI memory framework capabilities.
Overview
This MCP server wraps the memU AI memory framework, enabling AI applications to use advanced memory management features through the standardized MCP protocol.
Features
- Memory Storage: Store and organize conversation memories
- Smart Retrieval: Retrieve relevant memories using semantic search
- Memory Management: Update, delete, and organize memory data
- Statistics: Get insights into memory usage and performance
- Multi-user Support: Handle multiple users and AI agents
Quick Start
Prerequisites
- Python 3.8+
- memU API key (get one at https://app.memu.so/api-key/)
Local Development
# Clone the repository
git clone <repository-url>
cd memu-mcp-server
# Install dependencies
pip install -r requirements.txt
# Set up environment variables
export MEMU_API_KEY="your-memu-api-key"
# Run the server
python -m memu_mcp_server.main
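To confirm the server starts and exposes its tools, the sketch below connects to it over stdio using the MCP Python SDK (the mcp package, installed separately). It is illustrative only and assumes MEMU_API_KEY is set in your shell as shown above.

```python
# Smoke test: spawn the server over stdio and list the tools it advertises.
# Requires the MCP Python SDK:  pip install mcp
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="python",
    args=["-m", "memu_mcp_server.main"],
    env={**os.environ},  # forward MEMU_API_KEY (and PATH) to the spawned server
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

asyncio.run(main())
```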
Render Deployment
Deploy with the included Blueprint:
1. Connect your GitHub repository to Render
2. Render will automatically detect render.yaml
3. Set MEMU_API_KEY as a secret in the Render dashboard
4. Deploy!
Or use the Render CLI:
render deploy
Usage Examples
# Local development
python -m memu_mcp_server.main --log-level DEBUG
# Render mode (for testing locally)
python -m memu_mcp_server.main --render-mode
# With custom configuration
python -m memu_mcp_server.main --config config/server.json
# API server (for health checks)
python -m memu_mcp_server.api --host 0.0.0.0 --port 8080
Configuration
- Local Development: See config/example.json for configuration options
- Render Deployment: See Render Deployment Guide
- Environment Variables: See Environment Variables Guide
Available Tools
- memorize_conversation: Store conversation memories
- retrieve_memory: Retrieve relevant memories
- search_memory: Search memories by query
- manage_memory: Update or delete memories
- get_memory_stats: Get memory statistics
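The sketch below shows how these tools might be called from an MCP client session, such as the one opened in the Quick Start sketch. The argument names used here (user_id, conversation, query, top_k) are assumptions; see the API Reference for the actual input schemas.

```python
# Illustrative only: argument names (user_id, conversation, query, top_k) are
# assumptions, not the server's documented schema; check the API Reference.
# `session` is an initialized mcp.ClientSession connected to the server.

async def store_and_recall(session):
    # Store a short exchange as a memory for one user.
    await session.call_tool(
        "memorize_conversation",
        {
            "user_id": "user-123",
            "conversation": [
                {"role": "user", "content": "I prefer vegetarian recipes."},
                {"role": "assistant", "content": "Got it, noted for next time."},
            ],
        },
    )

    # Later, pull back whatever the server considers relevant to a new query.
    result = await session.call_tool(
        "retrieve_memory",
        {"user_id": "user-123", "query": "food preferences", "top_k": 5},
    )
    for block in result.content:
        if block.type == "text":
            print(block.text)
```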
Documentation
- API Reference - Detailed API documentation
- Setup Guide - Installation and configuration
- Render Deployment - Deploy to Render platform
- Environment Variables - Configuration reference
Deployment Options
Local Development
python -m memu_mcp_server.main
Docker
docker-compose up memu-mcp-server
Render (Cloud)
Use the included render.yaml Blueprint for one-click deployment to Render.
Claude Desktop Integration
Add to your Claude Desktop configuration:
{
  "mcpServers": {
    "memu-memory": {
      "command": "python",
      "args": ["-m", "memu_mcp_server.main"],
      "env": {
        "MEMU_API_KEY": "your_api_key_here"
      }
    }
  }
}
Health Monitoring
When deployed with the Web Service component, monitoring endpoints are available:
- GET /health - Health check
- GET /status - Detailed status
- GET /metrics - Performance metrics
- GET /info - Service information
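A quick way to exercise these endpoints is a plain standard-library request. BASE_URL below is a placeholder for either the local API server started earlier (port 8080) or your deployed Render service URL.

```python
# Hit the monitoring endpoints and print their raw responses.
# BASE_URL is a placeholder; substitute your own host and port.
import urllib.request

BASE_URL = "http://localhost:8080"

for path in ("/health", "/status", "/metrics", "/info"):
    with urllib.request.urlopen(f"{BASE_URL}{path}", timeout=5) as resp:
        print(path, resp.status, resp.read().decode())
```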
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
Support
- GitHub Issues: Report bugs and feature requests
- Documentation: Check the docs/ directory
- Email: support@example.com
License
MIT License