# Kronvex — EU-Native Memory API for AI Agents

Persistent, semantically searchable memory for AI agents. Three endpoints. GDPR-compliant, data stays in Europe. Works with Claude, Cursor, Cline, and any MCP-compatible client.
## Why Kronvex?
Every time a user opens a new session with your AI agent, it starts from scratch. No context, no history, no user preferences. You end up injecting entire conversation histories into every prompt — expensive, slow, and context-window-limited.
Kronvex gives your agent persistent, semantically searchable memory across sessions. Store interactions, recall relevant context by meaning, inject a ready-to-use context block before each LLM call — and keep all data in Europe.
## Performance

| Endpoint | p50 | p99 |
|---|---|---|
| `/remember` | <30ms | <180ms |
| `/recall` | <45ms | <280ms |
| `/inject-context` | <55ms | <320ms |
99.9% uptime · EU Frankfurt · GDPR-compliant · pgvector cosine similarity · 1536-dim embeddings
## Quick Start

### 1. Get a free API key
```bash
curl -X POST https://api.kronvex.io/auth/demo \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Alice",
    "email": "alice@company.com",
    "usecase": "Customer support bot with memory"
  }'
```

Response:

```json
{
  "full_key": "kv-xxxxxxxxxxxxxxxx",
  "agent_id": "uuid-of-your-first-agent",
  "memory_limit": 100,
  "message": "Ready! Your API key and first agent are set up."
}
```
### 2. Store a memory

```bash
curl -X POST https://api.kronvex.io/api/v1/agents/{agent_id}/remember \
  -H "X-API-Key: kv-xxxxxxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{"content": "Alice is a Premium customer since January 2023."}'
```
### 3. Inject context before each LLM call

```bash
curl -X POST https://api.kronvex.io/api/v1/agents/{agent_id}/inject-context \
  -H "X-API-Key: kv-xxxxxxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{"message": "I still have that billing issue"}'
```

Response:

```json
{
  "context_block": "[KRONVEX CONTEXT]\n- Alice is a Premium customer since Jan 2023 (similarity: 0.94)",
  "memories_used": 1
}
```
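The `context_block` is designed to be prepended to your prompt before the LLM call. A minimal sketch of the client-side wiring, assuming a standard chat-completion message format (the `with_context` helper is illustrative, not part of any SDK):

```python
def with_context(context_block: str, user_message: str) -> list[dict]:
    """Build a chat-completion message list that front-loads Kronvex context."""
    messages = []
    if context_block:
        # Surface recalled memories to the model as a system message
        messages.append({"role": "system", "content": context_block})
    messages.append({"role": "user", "content": user_message})
    return messages

msgs = with_context(
    "[KRONVEX CONTEXT]\n- Alice is a Premium customer since Jan 2023 (similarity: 0.94)",
    "I still have that billing issue",
)
# msgs: one system message carrying the memories, then the user's turn
```

Passing the block as a system message keeps recalled memories separate from the user's words, so the model treats them as background rather than part of the question.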
## SDKs

### Python

```bash
pip install kronvex
```

```python
from kronvex import Kronvex

kx = Kronvex("kv-your-api-key")
agent = kx.agent("your-agent-id")

# Inside an async function (the SDK calls are awaitable)
await agent.remember("User prefers concise answers")
context = await agent.inject_context("How should I format this?")
```
### Node.js / TypeScript

```bash
npm install kronvex
```

```typescript
import { Kronvex } from "kronvex";

const kx = new Kronvex("kv-your-api-key");
const agent = kx.agent("your-agent-id");

await agent.remember("User prefers concise answers");
const context = await agent.injectContext("How should I format this?");
```
### MCP (Claude Desktop)

```json
{
  "mcpServers": {
    "kronvex": {
      "command": "npx",
      "args": ["kronvex-mcp"],
      "env": { "KRONVEX_API_KEY": "kv-your-api-key" }
    }
  }
}
```
→ Python SDK on PyPI · Node SDK on npm
## How It Works

Memories are ranked by a composite confidence score:

```
confidence = similarity × 0.6 + recency × 0.2 + frequency × 0.2
```
- Similarity: pgvector cosine similarity on 1536-dim OpenAI embeddings
- Recency: sigmoid with 30-day inflection point
- Frequency: log-scaled access count
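The scoring above can be sketched as follows. Note that the sigmoid steepness `k` and the frequency normalization cap are illustrative assumptions; only the 30-day inflection point, the log scaling, and the 0.6/0.2/0.2 weights come from the description above.

```python
import math

def confidence(similarity: float, days_since_access: float, access_count: int) -> float:
    """Composite confidence: similarity x 0.6 + recency x 0.2 + frequency x 0.2."""
    # Recency: sigmoid with its inflection point at 30 days;
    # k (steepness) is an assumed constant, not specified in the docs.
    k = 0.15
    recency = 1.0 / (1.0 + math.exp(k * (days_since_access - 30.0)))
    # Frequency: log-scaled access count, normalized to [0, 1]
    # (the cap at 100 accesses is an assumed normalization).
    frequency = min(1.0, math.log1p(access_count) / math.log1p(100))
    return similarity * 0.6 + recency * 0.2 + frequency * 0.2
```

With these weights, a memory recently accessed (recency near 1) scores higher than an identical one untouched for months, while a perfect similarity match alone caps out at 0.6 plus whatever recency and frequency contribute.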
## Self-Hosting

```bash
# Requires Docker
cp .env.example .env
# Edit .env with your OPENAI_API_KEY and DATABASE_URL
docker-compose up --build
```
API available at http://localhost:8000 · Docs at http://localhost:8000/docs
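A minimal `.env` might look like the fragment below. The two variable names come from the comment above; the values are placeholders, and your `.env.example` may list additional settings.

```shell
# .env — values are placeholders, replace with your own
OPENAI_API_KEY=sk-your-openai-key
DATABASE_URL=postgresql://user:password@localhost:5432/kronvex
```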
## Endpoints

| Method | Endpoint | Description |
|---|---|---|
| POST | `/auth/demo` | Get a free API key |
| POST | `/api/v1/agents` | Create an agent |
| GET | `/api/v1/agents` | List your agents |
| POST | `/api/v1/agents/{id}/remember` | Store a memory |
| POST | `/api/v1/agents/{id}/recall` | Semantic search over memories |
| POST | `/api/v1/agents/{id}/inject-context` | Get a context block |
| DELETE | `/api/v1/agents/{id}/memories/{mid}` | Delete a memory |
| GET | `/health` | Health check |
Full interactive docs: api.kronvex.io/docs
## Pricing
| Plan | Price | Agents | Memories |
|---|---|---|---|
| Free | Free | 1 | 100 |
| Builder | €29/mo | 5 | 20,000 |
| Startup | €99/mo | 15 | 75,000 |
| Business | €349/mo | 50 | 500,000 |
| Enterprise | Custom | Unlimited | Unlimited |
## Contributing
See CONTRIBUTING.md.
Built in Paris · kronvex.io · hello@kronvex.io