CodeGraph MCP Server
Enables querying and analyzing code relationships by building a lightweight graph of TypeScript and Python symbols. Supports symbol lookup, reference tracking, impact analysis from diffs, and code snippet retrieval through natural language.
nabi-codegraph-mcp — Minimal Code Graph + MCP Server (TS + Python)
This is a self-contained starter that builds a lightweight code graph over TypeScript and Python, then exposes it via an MCP server so agentic clients (Claude Desktop, Cursor, Copilot Studio, OpenAI Agents, Azure) can query it.
Design goals: tiny, pragmatic, easy to extend. No heavyweight LSIF/SCIP indexers required to get started (though you can integrate them later).
What you get
- Ingestion (no build required):
  - TypeScript parsed using the official `typescript` compiler API.
  - Python parsed using the standard library `ast` module.
  - We extract symbols (functions, classes, methods, variables, per-file modules) and edges (`import`, `call`, `member_of`, `defines`).
- Graph format:
  - Simple JSON file at `./data/graph.json` with `symbols[]` and `edges[]`.
  - Easy to swap for SQLite or SCIP later.
- MCP Server (`code-graph`) tools (a client-side usage sketch follows this list):
  - `graph.resolve_symbol({ q })` → fuzzy lookup of symbols by name.
  - `graph.references({ id })` → inbound edges (who calls/imports this symbol).
  - `graph.related({ id, k })` → k neighbors (imports/calls).
  - `graph.impact_from_diff({ patch })` → changed files + 1-hop neighbor impact set.
  - Resource: `code://file/{path}?s=..&e=..` → stream code snippets for context windows.
- Example repo to test ingestion (`./example` with TS + Py files).
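To get a feel for what querying these tools looks like programmatically, here is a minimal client-side sketch. It assumes the MCP TypeScript SDK (`@modelcontextprotocol/sdk`) as the client library; the tool name and arguments come from the list above, while the paths and the demo query are illustrative:

```typescript
// Minimal sketch: spawn the code-graph server over stdio and call one tool.
// Assumes @modelcontextprotocol/sdk is installed; paths and the query are examples.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/mcp/server.js"],
  env: { NABI_GRAPH_JSON: "./data/graph.json" },
});

const client = new Client({ name: "code-graph-demo", version: "0.1.0" });
await client.connect(transport);

// Fuzzy lookup by name; the returned content can feed graph.references / graph.related.
const result = await client.callTool({
  name: "graph.resolve_symbol",
  arguments: { q: "greet" },
});
console.log(result.content);
```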
Prereqs
- Node.js 20+ (recommended LTS).
- Python 3.10+ (standard library only).
Tip: This repo avoids native DB bindings for maximum portability. The graph is stored in JSON and loaded into memory by the server.
Quick start (5 minutes)
# 1) Install deps
npm install
# 2) Build a graph from the example code
npm run ingest -- --target ./example
# 3) Run the MCP server (stdio)
npm run dev:server
You should see code-graph start and announce tools.
Use with Claude Desktop (macOS/Linux/Windows)
Add an entry to your Claude Desktop config to register the MCP server via stdio.
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
Use the included template, replacing the placeholder paths with the absolute path to this folder:
{
  "mcpServers": {
    "code-graph": {
      "command": "node",
      "args": ["/ABSOLUTE/PATH/TO/dist/mcp/server.js"],
      "env": {
        "NABI_GRAPH_JSON": "/ABSOLUTE/PATH/TO/data/graph.json"
      }
    }
  }
}
Restart Claude Desktop. In a new chat, ask it to connect to the code-graph MCP server and try tools like:
- `graph.resolve_symbol` with `{ "q": "greet" }`
- `graph.related` with a returned symbol `id`
- `graph.impact_from_diff` with a pasted patch
Note: Clients differ in how they surface MCP tools. In Claude, you can view connected tools/resources in the session sidebar.
Commands & scripts
# Dev server (TypeScript via tsx)
npm run dev:server
# Compile to dist/ (pure JS ESM)
npm run build
npm start # runs the built server
# Ingest (scan target directory and build ./data/graph.json)
npm run ingest -- --target ./example
npm run ingest -- --target /path/to/your/repo
# Optional: re-run impact analysis from a diff file
cat my.patch | npm run impact
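For intuition, the impact step boils down to: pull changed file paths out of the unified diff, seed the impact set with symbols defined in those files, then expand one hop along the edges. The sketch below illustrates that idea only; it is not the repo's actual implementation, and the helper names are hypothetical:

```typescript
// Illustrative sketch of "changed files + 1-hop neighbor impact set".
// Types are a subset of the Data model section below; names are hypothetical.
type Edge = { src: string; type: string; dst: string };
type Sym = { id: string; file: string };
type Graph = { symbols: Sym[]; edges: Edge[] };

function changedFilesFromPatch(patch: string): Set<string> {
  // Unified diffs mark the post-change path of each file with "+++ b/<path>".
  const files = new Set<string>();
  for (const line of patch.split("\n")) {
    const m = line.match(/^\+\+\+ b\/(.+)$/);
    if (m) files.add(m[1]);
  }
  return files;
}

function impactFromDiff(graph: Graph, patch: string): Set<string> {
  const changed = changedFilesFromPatch(patch);
  // Seed with every symbol defined in a changed file.
  const seeds = new Set(graph.symbols.filter((s) => changed.has(s.file)).map((s) => s.id));
  const impacted = new Set(seeds);
  // One hop: anything pointing at a seed, and anything a seed points at.
  for (const e of graph.edges) {
    if (seeds.has(e.dst)) impacted.add(e.src);
    if (seeds.has(e.src)) impacted.add(e.dst);
  }
  return impacted;
}
```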
How ingestion works (tl;dr)
- TypeScript: We use the TS compiler API to walk each file’s AST, collect symbols (functions/classes/methods/variables), calls, and imports. We also create a per-file module symbol as an anchor.
- Python: A small `py/ingest_py.py` uses `ast` to do the same. It prints NDJSON on stdout, which the Node orchestrator reads and merges.
- Edge resolution: Calls are matched to definitions by name with a simple heuristic, sketched below (prefer same-file symbols first, otherwise the first match). This is intentionally simple: enough to bootstrap your graph and make MCP queries useful.
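A minimal sketch of that call-resolution heuristic, assuming the symbol shape from the Data model section below (the helper name is illustrative, not the repo's actual function):

```typescript
// Name-based call resolution: prefer a definition in the caller's file,
// otherwise fall back to the first match anywhere in the graph.
type Sym = { id: string; name: string; file: string }; // subset of the Data model below

function resolveCallee(symbols: Sym[], calleeName: string, callerFile: string): Sym | undefined {
  const candidates = symbols.filter((s) => s.name === calleeName);
  return candidates.find((s) => s.file === callerFile) ?? candidates[0];
}

// Each resolved call becomes an edge: { src: callerId, type: "call", dst: resolved.id }.
```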
Later, you can plug in Tree‑sitter or SCIP for deeper, cross-repo precision.
Data model
type Range = { startLine: number; startCol: number; endLine: number; endCol: number };
type Symbol = {
id: string; kind: 'function'|'class'|'method'|'variable'|'module';
name: string; file: string; range: Range; language: 'typescript'|'python';
signature?: string; parentId?: string|null;
};
type EdgeType = 'defines'|'call'|'import'|'member_of';
type Edge = { src: string; type: EdgeType; dst: string };
type Graph = { symbols: Symbol[]; edges: Edge[] };
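For illustration, this is roughly how the server side can load the JSON graph into memory and answer a fuzzy `graph.resolve_symbol` query. It is a minimal sketch, not the actual server code; it assumes the `Graph` and `Symbol` types above are in scope, and the substring-based scoring is an assumption:

```typescript
// Minimal sketch: load ./data/graph.json and do a fuzzy name lookup.
// Assumes the Graph and Symbol types above are defined in the same module.
import { readFileSync } from "node:fs";

const graphPath = process.env.NABI_GRAPH_JSON ?? "./data/graph.json";
const graph: Graph = JSON.parse(readFileSync(graphPath, "utf8"));

function resolveSymbol(q: string, limit = 10): Symbol[] {
  const needle = q.toLowerCase();
  return graph.symbols
    .filter((s) => s.name.toLowerCase().includes(needle))
    .sort((a, b) => a.name.length - b.name.length) // shorter names first as a crude relevance proxy
    .slice(0, limit);
}

console.log(resolveSymbol("greet"));
```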
Roadmap: where to take it
- Add SCIP ingestion (scip-ts / scip-python) and merge edges alongside this AST path.
- Swap JSON storage for SQLite and add indexes for large monorepos.
- Add structural rewrite/codemod hooks and `graph.impact_from_diff` refinements (graph radius weighting, churn priors).
- Expose a search resource: `graph://symbol?q=...` that streams snippets directly.
Troubleshooting
- If the server prints “No graph loaded,” run `npm run ingest` and confirm `./data/graph.json` exists.
- Windows path issues? Use absolute paths in the Claude config and wrap them in quotes.
- Python not found? Edit `PYTHON_BIN` in `src/ingest/make_graph.ts` to your interpreter path.
License: MIT