# Shelby docs MCP

A lightweight, read-only MCP server that exposes Shelby documentation as searchable tools for MCP-compatible clients like Claude Code, Codex, Cursor, VS Code, and Gemini CLI.

This project is a docs-only server. It does not write data, talk to Shelby RPC endpoints, or modify anything in the network.
## Docs source

The server loads the full Shelby LLM docs bundle and parses it into page-level chunks using the native Shelby format:

```text
# Page Title (/path)
Page content...

# Next Page (/next-path)
Next page content...
```
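To make the chunking concrete, here is a minimal sketch of how a bundle in that format could be split into pages. The `parseBundle` function and `DocPage` shape are illustrative assumptions, not the actual identifiers in `src/shelbyDocs.ts`:

```typescript
// Hypothetical parser for a "# Title (/path)" style bundle.
interface DocPage {
  title: string;
  path: string;
  body: string;
}

function parseBundle(bundle: string): DocPage[] {
  const pages: DocPage[] = [];
  // A page heading is "# Title (/path)" on its own line.
  const heading = /^# (.+?) \((\/[^\s)]*)\)$/;
  let current: DocPage | null = null;
  for (const line of bundle.split("\n")) {
    const match = heading.exec(line);
    if (match) {
      if (current) pages.push(current);
      current = { title: match[1], path: match[2], body: "" };
    } else if (current) {
      current.body += line + "\n";
    }
  }
  if (current) pages.push(current);
  // Trim trailing newlines accumulated while collecting body lines.
  return pages.map((p) => ({ ...p, body: p.body.trim() }));
}
```

Every chunk keeps its title, path, and body together, which is what lets the search and read tools address pages individually.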
## Quickstart for Claude Code

Add the MCP server with npx:

```bash
claude mcp add --transport stdio shelby-docs -- npx -y github:Jr-kenny/shelby-mcp
```

Then start Claude Code:

```bash
claude
```

Inside Claude Code, run:

```
/mcp
```

You should see the shelby-docs server and its tool endpoints listed.
## Quickstart for Cursor (per-project)

Create `.cursor/mcp.json` and add:

```json
{
  "mcpServers": {
    "shelby-docs": {
      "command": "npx",
      "args": ["-y", "github:Jr-kenny/shelby-mcp"]
    }
  }
}
```

> [!TIP]
> If Cursor does not recognize `mcpServers` in your version, try `mcp_servers` as the top-level key instead.
## Quickstart for VS Code (per-workspace)

Add this to `.vscode/mcp.json`:

```json
{
  "servers": {
    "shelby-docs": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "github:Jr-kenny/shelby-mcp"]
    }
  },
  "inputs": []
}
```
## Quickstart for Gemini CLI

Add the MCP server globally:

```bash
gemini mcp add --scope user shelby-docs npx -y github:Jr-kenny/shelby-mcp
```

Confirm it is registered:

```bash
gemini mcp list
```
## Quickstart for Codex

Add the MCP server with the Codex CLI:

```bash
codex mcp add shelby-docs -- npx -y github:Jr-kenny/shelby-mcp
```

Confirm it is registered:

```bash
codex mcp list
```

Alternatively, add this to your Codex MCP config:

```toml
[mcp_servers.shelby-docs]
command = "npx"
args = ["-y", "github:Jr-kenny/shelby-mcp"]
```

Then restart Codex if needed so it reloads the MCP config.
## Quickstart from source

If you want to run the repository locally from source:

1. Clone the repo:

   ```bash
   git clone https://github.com/Jr-kenny/shelby-mcp
   cd shelby-mcp
   ```

2. Install dependencies and build:

   ```bash
   npm install
   npm run build
   ```

3. Run the local entrypoint:

   ```bash
   node /absolute/path/to/shelby-mcp/dist/cli.js
   ```

Then substitute that `node .../dist/cli.js` command in any MCP client config if you prefer source-based usage over npx.
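For example, a Cursor-style config pointing at the local build might look like this (the absolute path is a placeholder you replace with your clone location):

```json
{
  "mcpServers": {
    "shelby-docs": {
      "command": "node",
      "args": ["/absolute/path/to/shelby-mcp/dist/cli.js"]
    }
  }
}
```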
## Repository

GitHub repository: https://github.com/Jr-kenny/shelby-mcp
## Tool endpoints

- `search_shelby_docs`: searches the Shelby documentation bundle and returns ranked matches with IDs and snippets.
- `read_shelby_doc`: reads a page by exact path, title, URL, page ID, or fuzzy query.
- `get_shelby_doc_chunk`: reads a specific page by the exact chunk ID returned from search results.
- `list_shelby_doc_pages`: lists available parsed pages and supports filtering by path or title text.
## Project structure

| File/Folder | Purpose |
|---|---|
| `src/index.ts` | MCP server setup, tool registration, and stdio startup |
| `src/cli.ts` | CLI entry point that starts the server |
| `src/shelbyDocs.ts` | Shelby docs loading, parsing, search, and formatting helpers |
| `dist/` | Compiled JavaScript output generated by `npm run build` |
| `package.json` | Dependencies, scripts, package metadata, and CLI registration |
| `tsconfig.json` | TypeScript compiler settings |
| `README.md` | Usage and setup instructions |
## How it's built

This MCP server is a lightweight TypeScript implementation built on the official MCP SDK.

### Core components

- Built on `@modelcontextprotocol/sdk`
- Uses `StdioServerTransport` for local MCP clients
- Uses `zod` to validate tool inputs
- Fetches Shelby docs from the official `llms-full.txt` bundle at startup
- Parses the bundle into page chunks using Shelby's `# Title (/path)` format
- Uses deterministic keyword scoring over titles, paths, URLs, and body text
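Deterministic keyword scoring of this kind can be sketched as below. The field weights, tie-breaking rule, and function names here are assumptions for illustration, not the server's actual implementation:

```typescript
// Hypothetical keyword scorer over page titles, paths, and body text.
interface Page {
  title: string;
  path: string;
  body: string;
}

function scorePage(page: Page, query: string): number {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  let score = 0;
  for (const term of terms) {
    if (page.title.toLowerCase().includes(term)) score += 4; // title hits weigh most
    if (page.path.toLowerCase().includes(term)) score += 3;
    if (page.body.toLowerCase().includes(term)) score += 1;
  }
  return score;
}

function rank(pages: Page[], query: string): Page[] {
  return [...pages]
    .map((p) => ({ p, s: scorePage(p, query) }))
    .filter((x) => x.s > 0)
    // Break score ties by path so results are stable across runs.
    .sort((a, b) => b.s - a.s || a.p.path.localeCompare(b.p.path))
    .map((x) => x.p);
}
```

Because the scoring uses no randomness and ties break on a fixed key, the same query over the same bundle always returns the same ordering.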
## Configuration

Optional environment variables:

- `SHELBY_DOCS_URL`: alternate docs bundle URL
- `SHELBY_DOCS_TIMEOUT_MS`: HTTP timeout in milliseconds, default `15000`
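A minimal sketch of how these variables might be resolved, assuming invalid or missing timeout values fall back to the documented default (`resolveConfig` is a hypothetical helper, not the server's real code):

```typescript
// Hypothetical env var resolution for the documented settings.
const DEFAULT_TIMEOUT_MS = 15000;

function resolveConfig(env: Record<string, string | undefined>): {
  docsUrl?: string;
  timeoutMs: number;
} {
  const timeout = Number(env.SHELBY_DOCS_TIMEOUT_MS);
  return {
    // undefined means "use the built-in Shelby bundle URL".
    docsUrl: env.SHELBY_DOCS_URL,
    // Reject NaN, zero, and negative values in favor of the default.
    timeoutMs: Number.isFinite(timeout) && timeout > 0 ? timeout : DEFAULT_TIMEOUT_MS,
  };
}
```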
## Fork this for your own docs

This repo is a good base if you want to publish other docs-only MCP servers backed by a single `llms-full.txt`-style bundle.

1. Update `package.json`:

   ```json
   {
     "name": "your-docs-mcp",
     "description": "Docs-only MCP server for YourProduct documentation"
   }
   ```

2. Update the docs URL in `src/shelbyDocs.ts`:

   ```ts
   const DEFAULT_DOCS_URL = "https://your-domain.com/llms-full.txt";
   ```

3. Update the server name in `src/index.ts`:

   ```ts
   name: "your-docs-mcp"
   ```

4. Build and publish:

   ```bash
   npm install
   npm run build
   ```
## Local development

This section is only for working on the MCP server itself.

```bash
npm install
npm run build
npm run check
npm start
```

`npm run check` verifies that the server can fetch and parse the live Shelby docs bundle.