MCP LLMS-TXT Documentation Server

An open-source MCP server that provides applications like Cursor, Windsurf, and Claude with access to llms.txt documentation files, allowing users to control and audit context retrieval.

Overview

llms.txt is a website index for LLMs, providing background information, guidance, and links to detailed markdown files. IDEs like Cursor and Windsurf or apps like Claude Code/Desktop can use llms.txt to retrieve context for tasks. However, these apps use different built-in tools to read and process files like llms.txt. The retrieval process can be opaque, and there is not always a way to audit the tool calls or the context returned.
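
For a sense of the format, an llms.txt file is itself plain markdown: an H1 title, an optional blockquote summary, and sections of links to detailed markdown pages. A minimal illustrative sketch (the entries below are placeholders, not taken from a real file):

# Project Name

> One-sentence summary of the project.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): Getting started guide
- [Concepts](https://example.com/docs/concepts.md): Core concepts and architecture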

MCP offers a way for developers to have full control over tools used by these applications. Here, we create an open-source MCP server to provide MCP host applications (e.g., Cursor, Windsurf, Claude Code/Desktop) with (1) a user-defined list of llms.txt files and (2) a simple fetch_docs tool to read URLs within any of the provided llms.txt files. This allows the user to audit each tool call as well as the context returned.
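
Concretely, each tool invocation arrives at the server as a standard MCP tools/call request, which is what makes auditing possible. A hypothetical fetch_docs invocation might look like the following (the exact argument name is an illustrative assumption):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "fetch_docs",
    "arguments": {
      "url": "https://langchain-ai.github.io/langgraph/llms.txt"
    }
  }
}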

Quickstart

Install uv

curl -LsSf https://astral.sh/uv/install.sh | sh

Choose an llms.txt file to use.

  • For example, here's the LangGraph llms.txt file: https://langchain-ai.github.io/langgraph/llms.txt

(Optional) Test the MCP server locally with your llms.txt file of choice:

uvx --from mcpdoc mcpdoc \
    --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt \
    --transport sse \
    --port 8082 \
    --host localhost
  • This should run at: http://localhost:8082
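  • As a quick sanity check, you can confirm the server is accepting connections (this assumes the SSE endpoint is served at the default /sse path used by the MCP Python SDK):
curl http://localhost:8082/sse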

Run the MCP Inspector and connect to the running server:

npx @modelcontextprotocol/inspector

  • Here, you can test the tool calls.

Connect to Cursor

  • Open Cursor Settings and the MCP tab.
  • This will open the ~/.cursor/mcp.json file.

  • Paste the following into the file (we use the langgraph-docs-mcp name and link to the LangGraph llms.txt).
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport",
        "stdio",
        "--port",
        "8081",
        "--host",
        "localhost"
      ]
    }
  }
}
  • Confirm that the server is running in your Cursor Settings/MCP tab.
  • CMD+L (on Mac) to open chat.
  • Ensure agent is selected.

Then, try an example prompt, such as:

use the langgraph-docs-mcp server to answer any LangGraph questions -- 
+ call list_doc_sources tool to get the available llms.txt file
+ call fetch_docs tool to read it
+ reflect on the urls in llms.txt 
+ reflect on the input question 
+ call fetch_docs on any urls relevant to the question
+ use this to answer the question

what are types of memory in LangGraph?

Connect to Windsurf

  • Open Cascade with CMD+L (on Mac).
  • Click Configure MCP to open the config file, ~/.codeium/windsurf/mcp_config.json.
  • Update with langgraph-docs-mcp as noted above.

  • CMD+L (on Mac) to open Cascade and refresh MCP servers.
  • Available MCP servers will be listed, showing langgraph-docs-mcp as connected.

Then, try the example prompt:

  • It will perform your tool calls.

Connect to Claude Desktop

  • Open Settings/Developer to update ~/Library/Application\ Support/Claude/claude_desktop_config.json.
  • Update with langgraph-docs-mcp as noted above.
  • Restart Claude Desktop app.

  • You will see your tools visible in the bottom right of your chat input.

Then, try the example prompt:

  • It will ask to approve tool calls as it processes your request.

Connect to Claude Code

  • In a terminal after installing Claude Code, run this command to add the MCP server to your project:
claude mcp add-json langgraph-docs '{"type":"stdio","command":"uvx","args":["--from", "mcpdoc", "mcpdoc", "--urls", "langgraph:https://langchain-ai.github.io/langgraph/llms.txt"]}' -s local
  • You will see ~/.claude.json updated.
  • Test by launching Claude Code and running the following to view your tools:
$ claude
$ /mcp

Then, try the example prompt:

  • It will ask to approve tool calls.

Command-line Interface

The mcpdoc command provides a simple CLI for launching the documentation server.

You can specify documentation sources in three ways, and these can be combined:

  1. Using a YAML config file:
  • This will load the LangGraph Python documentation from the sample_config.yaml file in this repo.
mcpdoc --yaml sample_config.yaml
  2. Using a JSON config file:
  • This will load the LangGraph Python documentation from the sample_config.json file in this repo.
mcpdoc --json sample_config.json
  3. Directly specifying llms.txt URLs with optional names:
  • URLs can be specified either as plain URLs or with optional names using the format name:url.
  • This is how we loaded llms.txt for the MCP server above.
mcpdoc --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt

You can also combine these methods to merge documentation sources:

mcpdoc --yaml sample_config.yaml --json sample_config.json --urls https://langchain-ai.github.io/langgraph/llms.txt

Additional Options

  • --follow-redirects: Follow HTTP redirects (defaults to False)
  • --timeout SECONDS: HTTP request timeout in seconds (defaults to 10.0)

Example with additional options:

mcpdoc --yaml sample_config.yaml --follow-redirects --timeout 15

This will load the LangGraph Python documentation with a 15-second timeout and follow any HTTP redirects if necessary.

Configuration Format

Both YAML and JSON configuration files should contain a list of documentation sources.

Each source must include an llms_txt URL and can optionally include a name:

YAML Configuration Example (sample_config.yaml)

# Sample configuration for mcp-mcpdoc server
# Each entry must have a llms_txt URL and optionally a name
- name: LangGraph Python
  llms_txt: https://langchain-ai.github.io/langgraph/llms.txt

JSON Configuration Example (sample_config.json)

[
  {
    "name": "LangGraph Python",
    "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt"
  }
]

Programmatic Usage

from mcpdoc.main import create_server

# Create a server with documentation sources
server = create_server(
    [
        {
            "name": "LangGraph Python",
            "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt",
        },
        # You can add multiple documentation sources
        # {
        #     "name": "Another Documentation",
        #     "llms_txt": "https://example.com/llms.txt",
        # },
    ],
    follow_redirects=True,
    timeout=15.0,
)

# Run the server
server.run(transport="stdio")
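
If you keep your sources in a config file, you can also load them yourself before calling create_server. A minimal sketch, assuming PyYAML is installed and sample_config.yaml has the shape shown above (the CLI's --yaml flag does this parsing for you):

import yaml

from mcpdoc.main import create_server

# Load a list of {"name": ..., "llms_txt": ...} entries from the YAML file
with open("sample_config.yaml") as f:
    sources = yaml.safe_load(f)

server = create_server(sources, follow_redirects=True, timeout=15.0)
server.run(transport="stdio")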
