llms-mcp
An MCP (Model Context Protocol) server that exposes the llms.txt file from your project root as a resource for AI context enhancement.
Overview
llms-mcp is an MCP server that looks for an llms.txt file in the root of your project directory and exposes it as a resource consumable by MCP-compatible AI clients, following the proposed llms.txt standard.
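For reference, a minimal llms.txt typically looks like the example below, following the proposed standard; the project name, file paths, and URLs are purely illustrative.

```
# My Project

> One-sentence summary of the project for LLM consumption.

## Documentation

- [Architecture overview](docs/architecture.md)
- [API reference](https://example.com/docs/api)
```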
Features
- Detection: Finds the llms.txt file in your project root
- File Resource: Exposes the file via a file:// URI for direct content access
- Parsing: Extracts local file references and external URLs from the llms.txt content and exposes them as additional MCP resources (see the sketch after this list)
- Fetching: Fetches external resources on demand
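As a rough illustration of what the parsing step involves, the sketch below extracts markdown-style links and bare URLs from llms.txt content using regular expressions. It is a simplified, hypothetical version, not the server's actual implementation.

```js
// Sketch: classify references found in llms.txt text into local paths and external URLs.
// The regular expressions are simplified for illustration.
function extractReferences(llmsTxt) {
  const local = [];
  const external = [];

  // Markdown-style links: [text](target)
  for (const match of llmsTxt.matchAll(/\[[^\]]*\]\(([^)\s]+)\)/g)) {
    const target = match[1];
    if (/^https?:\/\//i.test(target)) {
      external.push(target);
    } else {
      local.push(target);
    }
  }

  // Bare HTTP/HTTPS URLs that are not part of a markdown link
  for (const match of llmsTxt.matchAll(/(?<!\()https?:\/\/[^\s)]+/g)) {
    external.push(match[0]);
  }

  return { local: [...new Set(local)], external: [...new Set(external)] };
}
```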
Installation
Prerequisites
- Node.js >= 18.0.0
Setup
- Clone or download this repository
- Install dependencies:
npm install
Usage
Running the Server
Start the MCP server:
npm start
Test Mode
Validate that the server can detect your llms.txt file:
npm test
This will scan your project directory and show if an llms.txt file is detected without starting the server.
Environment Configuration
The server uses the ProjectPath environment variable to determine the root directory to scan.

Linux/macOS (bash):
export ProjectPath="/path/to/your/project"
npm start

Windows (PowerShell):
$env:ProjectPath = "/path/to/your/project"; npm start
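Under the hood, resolving the scan root amounts to reading that variable and falling back to the current working directory. A minimal sketch, assuming this fallback behavior (the actual server may handle it differently):

```js
import path from "node:path";

// Sketch: resolve the directory to scan for llms.txt.
// Falls back to the current working directory when ProjectPath is unset.
const projectRoot = path.resolve(process.env.ProjectPath ?? process.cwd());
const llmsTxtPath = path.join(projectRoot, "llms.txt");
```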
MCP Client Configuration
Claude Desktop/Cline/Roo/Kilocode
Add this configuration to your MCP client's configuration file:
{
  "mcpServers": {
    "llms-mcp": {
      "command": "node",
      "args": ["path/to/llms-mcp/src/index.js"],
      "env": {
        "ProjectPath": "./"
      }
    }
  }
}
Referenced Resources
The server automatically parses the llms.txt content and exposes referenced files and URLs as additional resources:
Local Files:
- Markdown-style file links: [text](file.ext)

External URLs:
- Bare HTTP/HTTPS URLs: https://example.com
- URLs in markdown links: [text](https://example.com)
All local files are validated for existence before being exposed as resources. External URLs are exposed as-is and fetched on-demand when accessed.
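Conceptually, resource resolution works roughly as sketched below: local references are checked for existence before being exposed, and external URLs are only fetched when a client actually reads them. The function names here are hypothetical and for illustration only.

```js
import fs from "node:fs";
import path from "node:path";

// Sketch: keep only local references that actually exist on disk.
function validateLocalFiles(projectRoot, relativePaths) {
  return relativePaths.filter((p) => fs.existsSync(path.join(projectRoot, p)));
}

// Sketch: fetch an external URL lazily, at the moment the resource is read.
// Relies on the global fetch available in Node.js >= 18.
async function fetchExternal(url) {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Failed to fetch ${url}: ${response.status}`);
  }
  return response.text();
}
```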
License
MIT License - see LICENSE file for details.
Related Projects
- [llms.txt](https://llmstxt.org) - The llms.txt proposed standard
- [Model Context Protocol](https://modelcontextprotocol.io) - MCP specification