# 🧠 Eternity MCP
Your Eternal Second Brain, Running Locally.
Eternity MCP is a lightweight, privacy-focused memory server designed to provide long-term memory for LLMs and AI agents using the Model Context Protocol (MCP).
It combines structured storage (SQLite) with semantic vector search (ChromaDB), enabling agents to persist and retrieve text, PDF documents, and chat histories across sessions using natural language queries.
Built to run fully locally, Eternity integrates seamlessly with MCP-compatible clients, LangChain, LangGraph, and custom LLM pipelines, giving agents a durable and private memory layer.
## 🚀 Why Eternity?
Building agents that "remember" is hard. Most solutions rely on expensive cloud vector databases or complex setups. Eternity solves this by being:
- 🔒 Private & Local: Runs entirely on your machine. No data leaves your network.
- ⚡ Fast & Lightweight: Built on FastAPI and ChromaDB.
- 🔌 Agent-Ready: Perfect for LangGraph, LangChain, or direct LLM integration.
- 📄 Multi-Modal: Ingests raw text and PDF documents automatically.
- 🔎 Semantic Search: Finds matches by meaning, not just keywords.

## 📦 Installation
You can install Eternity directly from PyPI (coming soon) or from source:
```bash
# From source
git clone https://github.com/danttis/eternity-mcp.git
cd eternity-mcp
pip install .  # assumes a standard pyproject/setup in the repo
```
## 🛠️ Usage
### 1. Start the Server
Run the server in a terminal. It will host the API and the Memory UI.
```bash
eternity
```
Server runs at http://localhost:8000
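
To confirm the server is up, you can hit the root endpoint, which serves the Memory UI (a minimal check, assuming the default port):

```python
import requests

# A 200 response means Eternity is running and serving the Web UI.
print(requests.get("http://localhost:8000/").status_code)  # expect 200
```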
### 2. Client Usage (Python)
You can interact with Eternity using simple HTTP requests.
```python
import requests

ETERNITY_URL = "http://localhost:8000"

# 💾 Store a memory
requests.post(f"{ETERNITY_URL}/add", data={
    "content": "The project deadline is next Friday.",
    "tags": "work,deadline"
})

# 🔍 Search memory
response = requests.get(f"{ETERNITY_URL}/search", params={"q": "When is the deadline?"})
print(response.json())
```
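
The `/add` endpoint also accepts PDF uploads through its `file` parameter (see the API table below). A short sketch, assuming the multipart field is named `file` as in that table, with a hypothetical `notes.pdf`:

```python
# 📄 Store a PDF document (Eternity ingests PDFs automatically)
with open("notes.pdf", "rb") as f:
    requests.post(f"{ETERNITY_URL}/add", files={"file": f}, data={"tags": "docs"})
```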
### 3. Integration with LangGraph/AI Agents
Eternity shines when connected to an LLM. Here is a simple pattern for an agent with long-term memory:
- Recall: Before answering, search Eternity for context.
- Generate: Feed the retrieved context to the LLM.
- Memorize: Save the useful parts of the interaction back to Eternity.
(See langgraph_agent.py in the repo for a full, working example using Ollama/Groq).
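
A minimal sketch of that loop over plain HTTP; `call_llm` is a stand-in for whatever model client you use (Ollama, Groq, etc.) and is not part of Eternity:

```python
import requests

ETERNITY_URL = "http://localhost:8000"

def recall(query: str) -> str:
    """1. Recall: search Eternity for context relevant to the question."""
    resp = requests.get(f"{ETERNITY_URL}/search", params={"q": query})
    resp.raise_for_status()
    return str(resp.json())

def memorize(text: str, tags: str = "chat") -> None:
    """3. Memorize: save useful parts of the interaction back to Eternity."""
    requests.post(f"{ETERNITY_URL}/add", data={"content": text, "tags": tags})

def call_llm(prompt: str) -> str:
    """Stand-in: replace with your own Ollama/Groq/OpenAI call."""
    raise NotImplementedError("plug in your LLM client here")

def answer(question: str) -> str:
    context = recall(question)
    reply = call_llm(f"Context:\n{context}\n\nQuestion: {question}")  # 2. Generate
    memorize(f"Q: {question}\nA: {reply}")
    return reply
```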
## 🔌 API Endpoints
| Method | Endpoint | Description |
|---|---|---|
| GET | `/` | Web UI to view recent memories. |
| POST | `/add` | Add text or a file (PDF). Params: `content`, `tags`, `file`. |
| GET | `/search` | Semantic search. Params: `q` (query text). |
## 🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## 📜 License
This project is licensed under the MIT License - see the LICENSE file for details.
## 🌟 Inspiration
This project was inspired by Supermemory. We admire their vision for a second brain and their open-source spirit.
Created by Junior Dantas with a little help from AI :)