
# codebase-context-dumper MCP Server

A Model Context Protocol (MCP) server designed to easily dump your codebase context into Large Language Models (LLMs).
## Why Use This?
Large context windows in LLMs are powerful, but manually selecting and formatting files from a large codebase is tedious. This tool automates the process by:
- Recursively scanning your project directory.
- Including text files from the specified directory tree that are not excluded by `.gitignore` rules.
- Automatically skipping binary files.
- Concatenating the content with clear file path markers.
- Supporting chunking to handle codebases larger than the LLM's context window.
- Integrating seamlessly with MCP-compatible clients.
## Usage (Recommended: npx)
The easiest way to use this tool is via `npx`, which runs the latest version without needing a local installation.
Configure your MCP client (e.g., Claude Desktop, VS Code extensions) to use the following command:
```json
{
  "mcpServers": {
    "codebase-context-dumper": {
      "command": "npx",
      "args": [
        "-y",
        "@lex-tools/codebase-context-dumper"
      ]
    }
  }
}
```
The MCP client will then be able to invoke the `dump_codebase_context` tool provided by this server.
## Features & Tool Details
### Tool: `dump_codebase_context`
Recursively reads text files from a specified directory, respecting `.gitignore` rules and skipping binary files. Concatenates content with file path headers/footers. Supports chunking the output for large codebases.
Functionality:
- Scans the directory provided in `base_path`.
- Respects `.gitignore` files at all levels (including nested ones and `.git` by default).
- Detects and skips binary files.
- Reads the content of each valid text file.
- Prepends a header (`--- START: relative/path/to/file ---`) and appends a footer (`--- END: relative/path/to/file ---`) to each file's content.
- Concatenates all processed file contents into a single string.
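Putting those markers together, the concatenated output for a project with two files would look roughly like this (the file paths and contents are hypothetical):

```
--- START: src/index.ts ---
...contents of src/index.ts...
--- END: src/index.ts ---
--- START: docs/usage.md ---
...contents of docs/usage.md...
--- END: docs/usage.md ---
```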
Input Parameters:
- `base_path` (string, required): The absolute path to the project directory to scan.
- `num_chunks` (integer, optional, default: 1): The total number of chunks to divide the output into. Must be >= 1.
- `chunk_index` (integer, optional, default: 1): The 1-based index of the chunk to return. Requires `num_chunks > 1` and `chunk_index <= num_chunks`.
Output: Returns the concatenated (and potentially chunked) text content.
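For a concrete picture of how a client drives the chunking parameters, here is a minimal sketch using the TypeScript MCP SDK (`@modelcontextprotocol/sdk`) and its stdio transport, launching the server with the same `npx` command shown in the configuration above. The project path, chunk count, and client name are illustrative, and the SDK surface may differ slightly between versions.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server the same way an MCP client configuration would.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@lex-tools/codebase-context-dumper"],
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Request the codebase in three chunks and stitch them back together.
const numChunks = 3;
let context = "";
for (let chunkIndex = 1; chunkIndex <= numChunks; chunkIndex++) {
  const result = await client.callTool({
    name: "dump_codebase_context",
    arguments: {
      base_path: "/absolute/path/to/project", // hypothetical project path
      num_chunks: numChunks,
      chunk_index: chunkIndex,
    },
  });
  // Tool results carry an array of content parts; the text parts hold the dump.
  const parts =
    (result as { content?: Array<{ type: string; text?: string }> }).content ?? [];
  for (const part of parts) {
    if (part.type === "text" && part.text) context += part.text;
  }
}

await client.close();
```

Each chunk can then be passed to the LLM separately when the full dump would exceed its context window.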
## Local Installation & Usage (Advanced)
If you prefer to run a local version (e.g., for development):
- Clone the repository:

  ```bash
  git clone git@github.com:lex-tools/codebase-context-dumper.git
  cd codebase-context-dumper
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Build the server:

  ```bash
  npm run build
  ```

- Configure your MCP client to point to the local build output:

  ```json
  {
    "mcpServers": {
      "codebase-context-dumper": {
        "command": "/path/to/your/local/codebase-context-dumper/build/index.js" // Adjust path
      }
    }
  }
  ```
## Contributing
Contributions are welcome! Please see CONTRIBUTING.md for details on development, debugging, and releasing new versions.
## License
This project is licensed under the Apache License 2.0. See the LICENSE file for details.