codebase-context
AI coding agents don't know your codebase. This MCP fixes that.
Your team has internal libraries, naming conventions, and patterns that external AI models have never seen. This MCP server gives AI assistants real-time visibility into your codebase: which libraries your team actually uses, how often, and where to find canonical examples.
Quick Start
Add this to your MCP client config (Claude Desktop, VS Code, Cursor, etc.):

    {
      "mcpServers": {
        "codebase-context": {
          "command": "npx",
          "args": ["codebase-context", "/path/to/your/project"]
        }
      }
    }
If your environment prompts on first run, use `npx --yes ...` (or `npx -y ...`) to auto-confirm.
What You Get
- Internal library discovery → `@mycompany/ui-toolkit`: 847 uses vs `primeng`: 3 uses
- Pattern frequencies → `inject()`: 97%, `constructor()`: 3%
- Pattern momentum → Signals: Rising (last used 2 days ago) vs RxJS: Declining (180+ days)
- Golden file examples → Real implementations showing all patterns together
- Testing conventions → Jest: 74%, Playwright: 6%
- Framework patterns → Angular signals, standalone components, etc.
- Circular dependency detection → Find toxic import cycles between files (see the example below)
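To make that last point concrete, this is the kind of cycle circular dependency detection is meant to surface. The file and class names here are hypothetical, purely for illustration:

```typescript
// user.service.ts — imports AuthService...
import { AuthService } from './auth.service';

export class UserService {
  constructor(private auth: AuthService) {}
}

// auth.service.ts — ...which imports UserService right back,
// closing the cycle: user.service.ts -> auth.service.ts -> user.service.ts
import { UserService } from './user.service';

export class AuthService {
  constructor(private users: UserService) {}
}
```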
How It Works
When generating code, the agent checks your patterns first:
| Without MCP | With MCP |
|---|---|
| Uses `constructor(private svc: Service)` | Uses `inject()` (97% team adoption) |
| Suggests `primeng/button` directly | Uses `@mycompany/ui-toolkit` wrapper |
| Generic Jest setup | Your team's actual test utilities |
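For example, given the pattern data above, the agent would favor field-level `inject()` over constructor injection when generating an Angular component. A minimal sketch (the component and service names are placeholders):

```typescript
import { Component, inject } from '@angular/core';
import { ReportService } from './report.service'; // placeholder service for illustration

@Component({
  selector: 'app-report',
  standalone: true,
  template: `<!-- ... -->`,
})
export class ReportComponent {
  // Matches the team's dominant pattern (inject(): 97%)
  private readonly reports = inject(ReportService);

  // The minority pattern the agent would avoid (constructor(): 3%):
  // constructor(private reports: ReportService) {}
}
```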
Tip: Auto-invoke in your rules
Add this to your `.cursorrules`, `CLAUDE.md`, or `AGENTS.md`:

> When generating or reviewing code, use codebase-context tools to check team patterns first.
Now the agent checks patterns automatically instead of waiting for you to ask.
Tools
| Tool | Purpose |
|---|---|
| `search_codebase` | Semantic + keyword hybrid search |
| `get_component_usage` | Find where a library/component is used |
| `get_team_patterns` | Pattern frequencies + canonical examples |
| `get_codebase_metadata` | Project structure overview |
| `get_indexing_status` | Indexing progress + last stats |
| `get_style_guide` | Query style guide rules |
| `detect_circular_dependencies` | Find import cycles between files |
| `refresh_index` | Re-index the codebase |
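If you want to exercise these tools outside an editor (for scripting or debugging), a minimal sketch using the official MCP TypeScript SDK looks roughly like this. The tool arguments are left empty because each tool's input schema is defined by the server and not documented here:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server the same way the MCP client config does.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["codebase-context", "/path/to/your/project"],
});

const client = new Client({ name: "codebase-context-probe", version: "0.0.1" });
await client.connect(transport);

// List the tools the server exposes, then call one of them.
console.log(await client.listTools());
console.log(await client.callTool({ name: "get_codebase_metadata", arguments: {} }));

await client.close();
```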
Configuration
| Variable | Default | Description |
|---|---|---|
| `EMBEDDING_PROVIDER` | `transformers` | `openai` (fast, cloud) or `transformers` (local, private) |
| `OPENAI_API_KEY` | - | Required if provider is `openai` |
| `CODEBASE_ROOT` | - | Project root to index (CLI arg takes precedence) |
| `CODEBASE_CONTEXT_DEBUG` | - | Set to `1` to enable verbose logging (startup messages, analyzer registration) |
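The precedence note on `CODEBASE_ROOT` means a root passed on the command line wins over the environment variable. Illustratively (this is not the server's actual source, just the documented behavior):

```typescript
// Documented precedence: CLI argument over CODEBASE_ROOT.
const cliRoot = process.argv[2];               // e.g. "/path/to/your/project"
const envRoot = process.env.CODEBASE_ROOT;
const codebaseRoot = cliRoot ?? envRoot;

if (!codebaseRoot) {
  throw new Error("No codebase root: pass a path argument or set CODEBASE_ROOT.");
}
```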
Performance Note
This tool runs locally on your machine using your hardware.
- Initial Indexing: The first run works hard. It may take several minutes (e.g., ~2-5 mins for 30k files) to compute embeddings for your entire codebase.
- Caching: Subsequent queries are instant (milliseconds).
- Updates: Currently, `refresh_index` re-scans the codebase. True incremental indexing (processing only changed files) is on the roadmap.
Links
- 📄 Motivation — Why this exists, research, learnings
- 📋 Changelog — Version history
- 🤝 Contributing — How to add analyzers
License
MIT