<p align="center"> <img src="https://www.leankg.com/icon.svg" alt="LeanKG" width="80" height="80"> </p>

LeanKG

License: MIT Rust crates.io

Lightweight Knowledge Graph for AI-Assisted Development

LeanKG is a local-first knowledge graph that gives AI coding tools accurate codebase context. It indexes your code, builds dependency graphs, and exposes an MCP server so tools like Cursor, OpenCode, and Claude Code can query the knowledge graph directly. No cloud services, no external databases.
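The MCP integration speaks standard JSON-RPC 2.0. As a rough sketch, a client call to a LeanKG tool might look like the following; the tool name `search_code` is taken from the metrics examples later in this README, but the argument schema shown here is an assumption (see docs/mcp-tools.md for the real one):

```python
import json

# Hypothetical MCP tools/call request (JSON-RPC 2.0, per the MCP spec).
# The "arguments" shape is illustrative only, not LeanKG's actual schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_code",                 # tool exposed by the LeanKG MCP server
        "arguments": {"query": "parse_config"}  # assumed parameter name
    },
}
print(json.dumps(request))
```

In practice the install script wires this up for you; AI tools send such requests over the MCP transport rather than you writing them by hand.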

Visualize your knowledge graph with force-directed layout, WebGL rendering, and community clustering.


See docs/web-ui.md for more features.


Live Demo

Try LeanKG without installing: https://leankg.onrender.com

Or run the Web UI locally:

```shell
leankg web --port 9000
```

Installation

One-Line Install (Recommended)

```shell
curl -fsSL https://raw.githubusercontent.com/FreePeak/LeanKG/main/scripts/install.sh | bash -s -- <target>
```

Supported targets:

| Target | AI Tool | Auto-Installed |
| --- | --- | --- |
| `opencode` | OpenCode AI | Binary + MCP + Plugin + Skill + AGENTS.md |
| `cursor` | Cursor AI | Binary + MCP + Skill + AGENTS.md + Session Hook |
| `claude` | Claude Code | Binary + MCP + Plugin + Skill + CLAUDE.md + Session Hook |
| `gemini` | Gemini CLI | Binary + MCP + Skill + GEMINI.md |
| `kilo` | Kilo Code | Binary + MCP + Skill + AGENTS.md |
| `antigravity` | Google Antigravity | Binary + MCP + Skill + GEMINI.md |

Examples:

```shell
curl -fsSL https://raw.githubusercontent.com/FreePeak/LeanKG/main/scripts/install.sh | bash -s -- cursor
curl -fsSL https://raw.githubusercontent.com/FreePeak/LeanKG/main/scripts/install.sh | bash -s -- claude
```

Install via Cargo or Build from Source

```shell
# Via cargo
cargo install leankg && leankg --version

# Or build from source
git clone https://github.com/FreePeak/LeanKG.git && cd LeanKG && cargo build --release
```

Quick Start

```shell
leankg init                               # Initialize LeanKG in your project
leankg index ./src                        # Index your codebase
leankg watch ./src                        # Auto-index on file changes
leankg impact src/main.rs --depth 3       # Calculate the blast radius
leankg status                             # Check index status
leankg metrics                            # View token savings
leankg web                                # Start the Web UI at http://localhost:8080
```

See docs/cli-reference.md for all commands.


How LeanKG Helps

```mermaid
graph LR
    subgraph "Without LeanKG"
        A1[AI Tool] -->|Scans entire codebase| B1[10,000+ tokens]
        B1 --> A1
    end

    subgraph "With LeanKG"
        A2[AI Tool] -->|13-42 tokens| C[LeanKG Graph]
        C -->|Targeted subgraph| A2
    end
```

Without LeanKG, an AI tool scans the entire codebase (~10,000+ tokens per request). With LeanKG, it queries the knowledge graph for targeted context (13-42 tokens), up to a 98% token reduction for impact analysis.
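The arithmetic behind the headline number is straightforward; a quick check using the figures above:

```python
# Back-of-envelope check of the savings claim, using the README's figures.
full_scan = 10_000   # tokens for a full-codebase scan (stated lower bound)
kg_query = 42        # tokens for a LeanKG query (upper end of the 13-42 range)
reduction = 1 - kg_query / full_scan
print(f"{reduction:.1%}")  # prints 99.6%
```

Even at the high end of the 13-42 token range the reduction exceeds 98%, so the stated figure is conservative.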


Highlights

  • Auto-Init -- Install script configures MCP, rules, skills, and hooks automatically
  • Auto-Trigger -- Session hooks inject LeanKG context into every AI tool session
  • Concise Context -- 13-42 tokens per query vs. 10,000+ for a full codebase scan
  • Token Savings -- Up to 98% token reduction for impact analysis
  • Impact Radius -- Compute the blast radius before making changes
  • Dependency Graph -- Build call graphs with IMPORTS, CALLS, and TESTED_BY edges
  • MCP Server -- Expose the graph via the MCP protocol for AI tool integration
  • Multi-Language -- Index Go, TypeScript, Python, Rust, Java, and Kotlin with tree-sitter

See docs/architecture.md for system design and data model details.
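The typed edges listed in the highlights are what make the `impact` command possible. As an illustration only (hypothetical file names and storage, not LeanKG's actual data model), a blast-radius query over such edges reduces to a depth-limited traversal of reverse dependencies:

```python
from collections import defaultdict, deque

# Illustrative typed dependency edges (kinds from the README). Everything
# below is a hypothetical sketch, not LeanKG internals.
reverse = defaultdict(list)  # dependency -> [(dependent, edge_kind)]

def add_edge(dependent, kind, dependency):
    """Store the edge dependent-first so impact() can walk dependents."""
    reverse[dependency].append((dependent, kind))

add_edge("main.rs", "CALLS", "api.rs")        # main.rs calls into api.rs
add_edge("api.rs", "IMPORTS", "db.rs")        # api.rs imports db.rs
add_edge("db_test.rs", "TESTED_BY", "db.rs")  # db.rs is TESTED_BY db_test.rs

def impact(node, depth):
    """Everything that may break if `node` changes, within `depth` hops."""
    seen, queue = set(), deque([(node, 0)])
    while queue:
        cur, d = queue.popleft()
        if d == depth:
            continue
        for dep, _kind in reverse[cur]:
            if dep not in seen:
                seen.add(dep)
                queue.append((dep, d + 1))
    return seen

print(sorted(impact("db.rs", 3)))  # ['api.rs', 'db_test.rs', 'main.rs']
```

Changing `db.rs` pulls in its direct importer, its test, and the transitive caller, which is the kind of targeted subgraph returned instead of the whole codebase.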


Supported AI Tools

| Tool | Auto-Setup | Session Hook | Plugin |
| --- | --- | --- | --- |
| Cursor | Yes | session-start | - |
| Claude Code | Yes | session-start | Yes |
| OpenCode | Yes | - | Yes |
| Kilo Code | Yes | - | - |
| Gemini CLI | Yes | - | - |
| Google Antigravity | Yes | - | - |
| Codex | Yes | - | - |

Note: Cursor requires per-project installation. Its AI features work per workspace, so install LeanKG in each project directory where you want AI context injection.

See docs/agentic-instructions.md for detailed setup and auto-trigger behavior.


Context Metrics

Track token savings to understand LeanKG's efficiency.

```shell
leankg metrics --json               # Output as JSON
leankg metrics --since 7d           # Filter by time window
leankg metrics --tool search_code   # Filter by MCP tool
```

See docs/metrics.md for schema and examples.


Update

```shell
# Check the current version
leankg version

# Update the LeanKG binary via the install script
curl -fsSL https://raw.githubusercontent.com/FreePeak/LeanKG/main/scripts/install.sh | bash -s -- update
```

Documentation

| Doc | Description |
| --- | --- |
| docs/cli-reference.md | All CLI commands |
| docs/mcp-tools.md | MCP tools reference |
| docs/agentic-instructions.md | AI tool setup & auto-trigger |
| docs/architecture.md | System design, data model |
| docs/web-ui.md | Web UI features |
| docs/metrics.md | Metrics schema & examples |
| docs/benchmark.md | Performance benchmarks |
| docs/roadmap.md | Feature planning |
| docs/tech-stack.md | Tech stack & structure |

Requirements

  • Rust 1.70+
  • macOS or Linux

License

MIT


Star History

<a href="https://www.star-history.com/?repos=FreePeak%2FLeanKG&type=date&legend=top-left"> <picture> <source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/chart?repos=FreePeak/LeanKG&type=date&theme=dark&legend=top-left" /> <source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/chart?repos=FreePeak/LeanKG&type=date&legend=top-left" /> <img alt="Star History Chart" src="https://api.star-history.com/chart?repos=FreePeak/LeanKG&type=date&legend=top-left" /> </picture> </a>
