# Memory-IA MCP Server

Model Context Protocol (MCP) server - an AI agent with persistent memory for VS Code, Gemini-CLI, Cursor, and other tools.
## 🚀 Features

- **Persistent-Memory Chat** - AI agent with SQLite-backed context
- **Ollama Integration** - Support for local models (llama3.2, qwen, etc.)
- **JSON-RPC Protocol** - Standard MCP communication
- **Auto-Restart** - systemd service with automatic restart
- **Multi-Client** - Works with VS Code, Gemini-CLI, the terminal, and more
## 📋 Stack

- Python 3.12 with FastAPI
- LangGraph + LangChain for the agents
- SQLite for persistent memory
- Ollama for local LLM inference
- systemd for service management
## 🔧 Installation

**1. Clone the repository**

```bash
cd ~
git clone https://github.com/seu-usuario/memory-ia-mcp.git
cd memory-ia-mcp
```

**2. Create a virtual environment**

```bash
python3 -m venv memorivenv
source memorivenv/bin/activate
```

**3. Install the dependencies**

```bash
pip install -r requirements.txt
```

**4. Run the MCP server**

```bash
./run_mcp.sh
```
## 🎯 Quick Start

### Terminal

```bash
echo '{"jsonrpc":"2.0","method":"tools/list","id":1}' | python src/mcp_server.py
```
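Individual tools are invoked with the standard MCP `tools/call` method. A hypothetical call to `memory_chat` (the argument schema is defined by the server, so the `"message"` key here is an assumption):

```bash
echo '{"jsonrpc":"2.0","method":"tools/call","params":{"name":"memory_chat","arguments":{"message":"Hello, do you remember me?"}},"id":2}' | python src/mcp_server.py
```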
### VS Code

- Configuration lives at `~/.config/Code/User/mcp.json`
- Open the Command Palette: `Ctrl+Shift+P`
- Search for `MCP: List Servers`
- Select `memory-ia-agent`
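For reference, a minimal entry in `~/.config/Code/User/mcp.json` might look like the sketch below. The repository ships a sample under `config/mcp.json`; the absolute paths here are assumptions based on the install steps above:

```json
{
  "servers": {
    "memory-ia-agent": {
      "type": "stdio",
      "command": "/home/helcio/memory-ia-mcp/memorivenv/bin/python",
      "args": ["/home/helcio/memory-ia-mcp/src/mcp_server.py"]
    }
  }
}
```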
### Gemini-CLI

```bash
gemini-cli --mcp-server /home/helcio/memory-ia-mcp/src/mcp_server.py
```
## 📡 Available Tools

| Tool | Description |
|---|---|
| `memory_chat` | Chat with persistent memory |
| `run_ollama` | Run an Ollama model directly |
| `agent_health` | Agent status |
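All three follow the `tools/call` pattern shown in the Quick Start. For example, a health check (assuming `agent_health` takes no arguments):

```bash
echo '{"jsonrpc":"2.0","method":"tools/call","params":{"name":"agent_health","arguments":{}},"id":3}' | python src/mcp_server.py
```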
## 🛠️ systemd Service

### Status

```bash
sudo systemctl status memory-ia-mcp.service
```

### Logs

```bash
sudo journalctl -u memory-ia-mcp -f
```

### Control

```bash
sudo systemctl restart memory-ia-mcp
sudo systemctl stop memory-ia-mcp
sudo systemctl start memory-ia-mcp
```
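The unit file itself is not shown in this README; a minimal sketch of what `/etc/systemd/system/memory-ia-mcp.service` could look like (the user, paths, and venv name are assumptions taken from the examples above):

```ini
# Hypothetical unit file - adjust User, WorkingDirectory, and ExecStart to your setup
[Unit]
Description=Memory-IA MCP Server
After=network.target

[Service]
User=helcio
WorkingDirectory=/home/helcio/memory-ia-mcp
ExecStart=/home/helcio/memory-ia-mcp/memorivenv/bin/python src/mcp_server.py
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
```

After editing the unit, run `sudo systemctl daemon-reload` and `sudo systemctl enable --now memory-ia-mcp` to activate it.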
## 📂 Structure

```
memory-ia-mcp/
├── src/
│   ├── mcp_server.py
│   ├── agente_langgraph.py
│   ├── agente_persistente.py
│   └── api_agente.py
├── config/
│   └── mcp.json
├── docs/
│   └── DEVELOPMENT.md
├── tests/
│   └── test_mcp.py
├── run_mcp.sh
├── requirements.txt
└── README.md
```
## 🔐 Configuration

Create a `.env` file:

```env
OLLAMA_URL=http://localhost:11434
AGENT_PORT=8000
DEBUG=False
```
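A sketch of how the server might read these values at startup, assuming `python-dotenv` is available (the actual handling in `src/mcp_server.py` may differ):

```python
# Hypothetical startup snippet - the real server code may handle this differently.
import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # reads the .env file created above

OLLAMA_URL = os.getenv("OLLAMA_URL", "http://localhost:11434")
AGENT_PORT = int(os.getenv("AGENT_PORT", "8000"))
DEBUG = os.getenv("DEBUG", "False").lower() == "true"

print(f"Ollama at {OLLAMA_URL}, agent port {AGENT_PORT}, debug={DEBUG}")
```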
## 📖 Documentation

See `docs/DEVELOPMENT.md` for development notes.

## 🤝 Contributing

Feel free to open issues and PRs!

Developed with ❤️

*Last updated: Nov 28, 2025*