Fetch MCP Server

A Model Context Protocol server that provides web content fetching capabilities. This server enables LLMs to retrieve and process content from web pages, converting HTML to markdown for easier consumption.

[!CAUTION] This server can access local/internal IP addresses and may represent a security risk. Exercise caution when using this MCP server to ensure this does not expose any sensitive data.

The fetch tool truncates long responses, but the start_index argument lets you specify where content extraction should begin. This lets models read a webpage in chunks until they find the information they need (see the example after the parameter list below).

Available Tools

  • fetch - Fetches a URL from the internet and extracts its contents as markdown.
    • url (string, required): URL to fetch
    • max_length (integer, optional): Maximum number of characters to return (default: 5000)
    • start_index (integer, optional): Start content from this character index (default: 0)
    • raw (boolean, optional): Get raw content without markdown conversion (default: false)
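
For example, to read a long page in chunks, a client first calls fetch normally and then repeats the call with start_index advanced by the previous max_length (the URL below is a placeholder):

{
  "url": "https://example.com/long-article",
  "max_length": 5000
}

followed by:

{
  "url": "https://example.com/long-article",
  "max_length": 5000,
  "start_index": 5000
}

Repeating this with start_index at 10000, 15000, and so on walks through the rest of the page.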

Prompts

  • fetch
    • Fetch a URL and extract its contents as markdown
    • Arguments:
      • url (string, required): URL to fetch

Installation

Optionally, install Node.js; this will cause the fetch server to use a different, more robust HTML simplifier.

Using uv (recommended)

When using uv, no specific installation is needed. We will use uvx to run mcp-server-fetch directly.

Using pip

Alternatively, you can install mcp-server-fetch via pip:

pip install mcp-server-fetch

After installation, you can run it as a script using:

python -m mcp_server_fetch

Configuration

Configure for Claude.app

Add to your Claude settings:

<details> <summary>Using uvx</summary>

{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}

</details>

<details> <summary>Using docker</summary>

{
  "mcpServers": {
    "fetch": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "mcp/fetch"]
    }
  }
}

</details>

<details> <summary>Using pip installation</summary>

{
  "mcpServers": {
    "fetch": {
      "command": "python",
      "args": ["-m", "mcp_server_fetch"]
    }
  }
}

</details>

Configure for VS Code

For quick installation, use one of the one-click install buttons: Install with UV or Install with Docker, each available for VS Code and VS Code Insiders.

For manual installation, add the following JSON block to your User Settings (JSON) file in VS Code. You can do this by pressing Ctrl + Shift + P and typing Preferences: Open User Settings (JSON).

Optionally, you can add it to a file called .vscode/mcp.json in your workspace. This will allow you to share the configuration with others.

Note that the mcp key is needed when using the mcp.json file.

<details> <summary>Using uvx</summary>

{
  "mcp": {
    "servers": {
      "fetch": {
        "command": "uvx",
        "args": ["mcp-server-fetch"]
      }
    }
  }
}

</details>

<details> <summary>Using Docker</summary>

{
  "mcp": {
    "servers": {
      "fetch": {
        "command": "docker",
        "args": ["run", "-i", "--rm", "mcp/fetch"]
      }
    }
  }
}

</details>

Customization - robots.txt

By default, the server will obey a website's robots.txt file if the request came from the model (via a tool), but not if the request was user-initiated (via a prompt). This can be disabled by adding the argument --ignore-robots-txt to the args list in the configuration, as shown below.
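
For example, using the uvx configuration from above, the flag is appended to the args list:

{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch", "--ignore-robots-txt"]
    }
  }
}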

Customization - User-agent

By default, depending on whether the request came from the model (via a tool) or was user-initiated (via a prompt), the server will use either the user agent

ModelContextProtocol/1.0 (Autonomous; +https://github.com/modelcontextprotocol/servers)

or

ModelContextProtocol/1.0 (User-Specified; +https://github.com/modelcontextprotocol/servers)

This can be customized by adding the argument --user-agent=YourUserAgent to the args list in the configuration.

Customization - Proxy

The server can be configured to use a proxy by adding the --proxy-url argument to the args list in the configuration.
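
Combined with the user-agent option above, a configuration might look like this (the user agent string and proxy URL are placeholders):

{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": [
        "mcp-server-fetch",
        "--user-agent=YourUserAgent",
        "--proxy-url=http://localhost:8080"
      ]
    }
  }
}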

Debugging

You can use the MCP inspector to debug the server. For uvx installations:

npx @modelcontextprotocol/inspector uvx mcp-server-fetch

Or if you've installed the package in a specific directory or are developing on it:

cd path/to/servers/src/fetch
npx @modelcontextprotocol/inspector uv run mcp-server-fetch

Contributing

We encourage contributions to help expand and improve mcp-server-fetch. Whether you want to add new tools, enhance existing functionality, or improve documentation, your input is valuable.

For examples of other MCP servers and implementation patterns, see: https://github.com/modelcontextprotocol/servers

Pull requests are welcome! Feel free to contribute new ideas, bug fixes, or enhancements to make mcp-server-fetch even more powerful and useful.

License

mcp-server-fetch is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.
