<h1 align="center">GrantAi</h1>

<p align="center"> <strong>Infinite Memory for AI</strong><br> Local. Private. Secure. </p>

<p align="center"> <a href="https://solonai.com/grantai">Website</a> • <a href="https://solonai.com/grantai/download">Download</a> • <a href="https://solonai.com/pricing">Pricing</a> • <a href="https://solonai.com/help/grantai">Documentation</a> </p>


What is GrantAi?

GrantAi is the shared memory layer for AI agents.

Coordination frameworks are everywhere — CrewAI, AutoGen, LangGraph. But agents still lose everything when a session ends. Context windows reset. Knowledge evaporates. Each agent starts from zero.

GrantAi solves this:

  • Persistent Memory — Knowledge survives sessions, accumulates over time
  • Shared Across Agents — Multiple AI tools read and write to the same brain
  • 12 ms Recall — retrieval latency stays flat regardless of memory size
  • 100% Local — Your data never leaves your machine
  • AES-256 Encrypted — Secure at rest, zero data egress

Quick Start

macOS / Linux (Native)

```bash
# 1. Download from https://solonai.com/grantai/download
# 2. Extract and install
./install.sh YOUR_LICENSE_KEY

# 3. Restart your AI tool (Claude Code, Cursor, etc.)
```

Docker (All Platforms)

```bash
docker pull ghcr.io/solonai-com/grantai-memory:1.8.5
```

Add to your Claude Desktop config (~/.config/Claude/claude_desktop_config.json):

```json
{
  "mcpServers": {
    "grantai": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "--pull", "always",
               "-v", "grantai-data:/data",
               "-e", "GRANTAI_LICENSE_KEY=YOUR_KEY",
               "ghcr.io/solonai-com/grantai-memory:1.8.5"]
    }
  }
}
```

Supported Platforms

| Platform | Method | Status |
| --- | --- | --- |
| macOS (Apple Silicon) | Native | Supported |
| Linux (x64) | Native | Supported |
| Windows | Docker | Supported |
| All Platforms | Docker | Supported |

MCP Tools

GrantAi provides these tools to your AI:

| Tool | Description |
| --- | --- |
| `grantai_infer` | Query memory for relevant context |
| `grantai_teach` | Store content for future recall |
| `grantai_learn` | Import files or directories |
| `grantai_health` | Check server status |
| `grantai_summarize` | Store session summaries |
| `grantai_project` | Track project state |
| `grantai_snippet` | Store code patterns |
| `grantai_git` | Import git commit history |
| `grantai_capture` | Save conversation turns for continuity |
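A typical session might combine several of these tools: import existing material, store a pattern, and check that the server is responding. A sketch in the same call style as the examples below — the `path` parameter is an assumption; only `content`, `source`, and `input` appear elsewhere in this README:

```python
# Import a directory into memory (path parameter is assumed)
grantai_learn(path="./docs")

# Store a reusable code pattern for later recall
grantai_snippet(content="Retry HTTP calls with exponential backoff.", source="patterns")

# Confirm the server is up before relying on recall
grantai_health()
```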

Multi-Agent Memory Sharing

Multiple agents can share knowledge through GrantAi's memory layer.

Basic shared memory (no setup required)

```python
# Any agent stores
grantai_teach(
    content="API rate limit is 100 requests/minute.",
    source="api-notes"
)

# Any agent retrieves
grantai_infer(input="API rate limiting")
```

All agents read from and write to the same memory pool. No configuration needed.

With agent attribution (optional)

Use `speaker` to track which agent stored what, and `from_agents` to filter retrieval:

```python
# Store with identity
grantai_teach(
    content="API uses Bearer token auth.",
    source="api-research",
    speaker="researcher"  # optional
)

# Retrieve from specific agent
grantai_infer(
    input="API authentication",
    from_agents=["researcher"]  # optional filter
)
```

When to use speaker

| Scenario | Use `speaker`? | Why |
| --- | --- | --- |
| Shared knowledge base | No | All contributions equal, no filtering needed |
| Session continuity | No | Same context, just persist and retrieve |
| Research → Code handoff | Yes | Coder filters for researcher's findings only |
| Role-based trust | Yes | Security agent's input treated differently |

Framework integration

GrantAi works with any MCP-compatible client. Point your agents at the same GrantAi instance:

```json
{
  "mcpServers": {
    "grantai": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "--pull", "always",
               "-v", "grantai-data:/data",
               "-e", "GRANTAI_LICENSE_KEY=YOUR_KEY",
               "ghcr.io/solonai-com/grantai-memory:1.8.5"]
    }
  }
}
```

All agents using this config share the same memory through the `grantai-data` Docker volume.

Pricing

  • Free Trial — 30 days, no credit card required
  • Personal — $29/month or $299/year
  • Team — $25/seat/month

View full pricing →

Documentation

Support

License

GrantAi is proprietary software. See Terms of Service.


<p align="center"> <a href="https://solonai.com/grantai">Get Started →</a> </p>
