# opensrc-mcp

A codemode MCP server for fetching and querying dependency source code from npm, PyPI, crates.io, and GitHub. It allows agents to execute server-side JavaScript for context-efficient searching and browsing of large codebases without overwhelming the LLM's context window.
## Why?

Traditional MCP servers expose tools directly to LLMs. This server uses the codemode pattern instead: agents write JavaScript that executes server-side, and only the results return. Benefits:

- **Context efficient** - large source trees stay server-side
- **Batch operations** - one call can search and read many files
- **LLMs are better at code** - models have far more training data for JavaScript than for tool calling
## Installation

```bash
npm install -g opensrc-mcp
# or
npx opensrc-mcp
```
## OpenCode Configuration

Add to your OpenCode config (`~/.config/opencode/config.json` or a project-level `opencode.json`):

```json
{
  "mcp": {
    "opensrc": {
      "type": "local",
      "command": "npx",
      "args": ["-y", "opensrc-mcp"]
    }
  }
}
```
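Other MCP clients use a similar stdio configuration. As an illustration (not from this project's docs), the equivalent Claude Desktop entry in `claude_desktop_config.json` would look like:

```json
{
  "mcpServers": {
    "opensrc": {
      "command": "npx",
      "args": ["-y", "opensrc-mcp"]
    }
  }
}
```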
## Tool

### `execute`

A single tool exposes all operations. Agents write JavaScript that runs server-side; only the results return.

```typescript
// Available in the sandbox:
declare const opensrc: {
  // Read operations
  list(): Source[];
  has(name: string, version?: string): boolean;
  get(name: string): Source | undefined;
  files(sourceName: string, glob?: string): Promise<FileEntry[]>;
  tree(sourceName: string, options?: { depth?: number }): Promise<TreeNode>;
  grep(pattern: string, options?: {
    sources?: string[];
    include?: string;
    maxResults?: number;
  }): Promise<GrepResult[]>;
  astGrep(sourceName: string, pattern: string, options?: {
    glob?: string;
    lang?: string | string[];
    limit?: number;
  }): Promise<AstGrepMatch[]>;
  read(sourceName: string, filePath: string): Promise<string>;
  readMany(sourceName: string, paths: string[]): Promise<Record<string, string>>;
  resolve(spec: string): Promise<ParsedSpec>;

  // Mutation operations
  fetch(specs: string | string[], options?: { modify?: boolean }): Promise<FetchedSource[]>;
  remove(names: string[]): Promise<RemoveResult>;
  clean(options?: {
    packages?: boolean;
    repos?: boolean;
    npm?: boolean;
    pypi?: boolean;
    crates?: boolean;
  }): Promise<RemoveResult>;
};

declare const sources: Source[]; // All fetched sources
declare const cwd: string;       // Project directory
```
Examples:

```javascript
// List all fetched sources
async () => opensrc.list()

// Fetch an npm package (auto-detects version from lockfile)
async () => opensrc.fetch("zod")

// Fetch multiple packages
async () => opensrc.fetch(["zod", "drizzle-orm", "hono"])

// Fetch a GitHub repo at a specific ref
async () => opensrc.fetch("vercel/ai@v3.0.0")

// Fetch from other registries
async () => opensrc.fetch("pypi:requests")
async () => opensrc.fetch("crates:serde")

// Get a directory tree
async () => opensrc.tree("zod", { depth: 2 })

// Find TypeScript files
async () => opensrc.files("zod", "**/*.ts")

// Text search
async () => opensrc.grep("parse", { sources: ["zod"], include: "*.ts" })

// AST search (structural pattern matching)
async () => opensrc.astGrep("zod", "function $NAME($$$ARGS)", { glob: "**/*.ts" })

// Read a specific file
async () => opensrc.read("zod", "src/index.ts")

// Read multiple files (supports globs)
async () => opensrc.readMany("zod", ["src/index.ts", "packages/*/package.json"])

// Remove a source
async () => opensrc.remove(["zod"])

// Clean all npm packages
async () => opensrc.clean({ npm: true })
```
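Because code runs server-side, the single-purpose calls above can be combined into one round trip. A hypothetical composite payload (the `path` field on `GrepResult` is an assumption, since the exact result shape isn't shown above):

```javascript
// One execute payload: fetch, search, then read every matched file.
const payload = async () => {
  await opensrc.fetch("zod");
  const hits = await opensrc.grep("ZodError", { sources: ["zod"], maxResults: 20 });
  // Deduplicate matched paths, then pull all contents in a single readMany call.
  const paths = [...new Set(hits.map((h) => h.path))];
  return opensrc.readMany("zod", paths);
};
```

Only the returned `Record<string, string>` reaches the agent's context; the intermediate grep results stay on the server.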
## Package Formats

| Format | Example | Description |
|---|---|---|
| `<name>` | `zod` | npm (auto-detects version) |
| `<name>@<version>` | `zod@3.22.0` | npm specific version |
| `npm:<name>` | `npm:react` | explicit npm |
| `pypi:<name>` | `pypi:requests` | Python/PyPI |
| `pip:<name>` | `pip:flask` | alias for pypi |
| `crates:<name>` | `crates:serde` | Rust/crates.io |
| `cargo:<name>` | `cargo:tokio` | alias for crates |
| `owner/repo` | `vercel/ai` | GitHub repo |
| `owner/repo@ref` | `vercel/ai@v1.0.0` | GitHub at ref |
| `github:owner/repo` | `github:facebook/react` | explicit GitHub |
## Storage

Sources are stored globally at `~/.local/share/opensrc/` (XDG compliant):

```
~/.local/share/opensrc/
├── sources.json          # Index of fetched sources
├── packages/             # npm/pypi/crates packages
│   └── zod/
│       ├── src/
│       ├── package.json
│       └── ...
└── repos/                # GitHub repos
    └── github.com/
        └── vercel/
            └── ai/
```

Override with `$OPENSRC_DIR` or `$XDG_DATA_HOME`.
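For example, to keep a project-local cache instead of the XDG default (the `.opensrc` directory name is just an illustration):

```shell
# Redirect storage away from ~/.local/share/opensrc/ for this shell session
export OPENSRC_DIR="$PWD/.opensrc"

# Or relocate all XDG data instead (opensrc would then use $XDG_DATA_HOME/opensrc):
# export XDG_DATA_HOME="$HOME/data"
```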
## How It Works

1. The agent calls the `execute` tool with JS code: `async () => opensrc.fetch("zod")`
2. The code runs in a sandboxed `vm` context with the injected `opensrc` API
3. The server fetches the package via opensrc (handling registry lookup or git clone)
4. Only the result returns to the agent's context
```
┌─────────────────────────────────────────────────────────────┐
│ Agent Context                                               │
├─────────────────────────────────────────────────────────────┤
│ Tool call: execute({ code: "async () => opensrc.fetch..." })│
│                           ↓                                 │
│ Result: { success: true, source: { name: "zod", ... } }     │
└─────────────────────────────────────────────────────────────┘
                            ↕
┌─────────────────────────────────────────────────────────────┐
│ opensrc-mcp Server                                          │
├─────────────────────────────────────────────────────────────┤
│ Sandbox executes code with injected opensrc API             │
│ Full source tree stays here, never sent to agent            │
└─────────────────────────────────────────────────────────────┘
```
## License

MIT