URL Text Fetcher MCP Server

A modern Model Context Protocol (MCP) server that provides URL text fetching, web scraping, and web search capabilities using the FastMCP framework for use with LM Studio and other MCP-compatible clients.

The server is built using the modern FastMCP framework, which provides:

  • Clean decorator-based tool definitions
  • Automatic schema generation from type hints
  • Simplified server setup and configuration
  • Better error handling and logging

All security features and functionality have been preserved while modernizing to follow MCP best practices.

Features

This MCP server enables AI models to:

  • Fetch text content from any URL by extracting all visible text
  • Extract links from web pages to discover related resources
  • Search the web using Brave Search and automatically fetch content from top results
  • Handle errors gracefully with proper timeout and exception handling

Security Features

Enterprise-grade security implementation:

  • SSRF Protection: Blocks requests to internal networks and metadata endpoints
  • Input Sanitization: Validates and cleans all URL and query inputs
  • Memory Protection: Content size limits prevent memory exhaustion
  • Rate Limiting: Thread-safe API rate limiting with configurable thresholds
  • Error Handling: Comprehensive exception handling without information leakage
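The SSRF protection described above boils down to refusing URLs that resolve to internal addresses. A minimal sketch of such a guard (illustrative only; the `is_url_allowed` helper is not the server's actual code):

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_url_allowed(url: str) -> bool:
    """Reject URLs that resolve to private, loopback, link-local, or
    reserved addresses (e.g. cloud metadata at 169.254.169.254)."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.hostname:
        return False
    try:
        # Resolve the hostname and vet every address it maps to.
        infos = socket.getaddrinfo(parsed.hostname, None)
    except socket.gaierror:
        return False
    for info in infos:
        addr = ipaddress.ip_address(info[4][0])
        if addr.is_private or addr.is_loopback or addr.is_link_local or addr.is_reserved:
            return False
    return True
```

A production guard would also re-check the address at connection time to defend against DNS rebinding.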

Tools

The server provides three main tools:

fetch_url_text

  • Description: Downloads all visible text from a URL
  • Parameters:
    • url (string, required): The URL to fetch text from
  • Returns: Clean text content from the webpage
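The core of such a tool is stripping non-visible elements before extracting text. A sketch of that step using beautifulsoup4, one of the listed dependencies (the `extract_visible_text` helper is illustrative, not the server's actual code):

```python
from bs4 import BeautifulSoup

def extract_visible_text(html: str) -> str:
    """Drop script/style/noscript content and collapse whitespace
    in the remaining visible text."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    # A separator keeps words from adjacent tags from running together.
    return " ".join(soup.get_text(separator=" ").split())
```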

fetch_page_links

  • Description: Extracts all links from a web page
  • Parameters:
    • url (string, required): The URL to fetch links from
  • Returns: List of all href links found on the page
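Link extraction amounts to collecting every `href` attribute, typically resolving relative links against the page URL. A sketch under those assumptions (the `extract_links` helper is illustrative):

```python
from urllib.parse import urljoin
from bs4 import BeautifulSoup

def extract_links(html: str, base_url: str) -> list[str]:
    """Collect every href on the page, resolving relative links
    against the page's own URL."""
    soup = BeautifulSoup(html, "html.parser")
    return [urljoin(base_url, a["href"]) for a in soup.find_all("a", href=True)]
```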

brave_search_and_fetch

  • Description: Search the web using Brave Search and automatically fetch content from the top results
  • Parameters:
    • query (string, required): The search query
    • max_results (integer, optional): Maximum number of results to fetch content for (default: 3, max: 10)
  • Returns: Search results with full text content from each result URL
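The search call behind this tool follows Brave's public Web Search API: a GET request authenticated via the `X-Subscription-Token` header. A sketch of building that request (the `build_brave_request` helper and the clamping logic are illustrative, not the server's actual code):

```python
import os

def build_brave_request(query: str, count: int = 3) -> tuple[str, dict, dict]:
    """Assemble the endpoint, headers, and params for a Brave Web
    Search API call. The key is read from BRAVE_API_KEY, matching the
    server's configuration; count is clamped to the tool's max of 10."""
    endpoint = "https://api.search.brave.com/res/v1/web/search"
    headers = {
        "Accept": "application/json",
        "X-Subscription-Token": os.environ.get("BRAVE_API_KEY", ""),
    }
    params = {"q": query, "count": max(1, min(count, 10))}
    return endpoint, headers, params
```

The three pieces would then be passed to something like `requests.get(endpoint, headers=headers, params=params, timeout=10)`.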

Prerequisites

Brave Search API Key

To use the search functionality, you'll need a free Brave Search API key:

  1. Visit Brave Search API
  2. Sign up for a free account (2,000 queries/month, max 1 per second)
  3. Get your API key
  4. Copy .env.example to .env and add your API key:
    cp .env.example .env
    # Edit .env and set: BRAVE_API_KEY=your_actual_api_key
    

Installation

  1. Clone this repository
  2. Install dependencies:
    uv sync --dev --all-extras
    
  3. Configure your environment:
    cp .env.example .env
    # Edit .env file and set your BRAVE_API_KEY
    

Usage

With LM Studio

  1. Open LM Studio and navigate to the Integrations section
  2. Click "Install" then "Edit mcp.json"

FastMCP Implementation (Recommended)

  1. Option A: Use the configuration helper script

    ./configure_lmstudio_fastmcp.sh
    

    This will generate the correct configuration with the right paths for your system.

  2. Option B: Manual configuration - Add the server configuration:

{
  "mcpServers": {
    "url-text-fetcher-fastmcp": {
      "command": "uv",
      "args": [
        "run", 
        "url-text-fetcher-fastmcp"
      ],
      "cwd": "/Users/wallison/TechProjects/mcp-server"
    }
  }
}

Legacy Implementation (Low-Level)

For the legacy implementation:

./configure_lmstudio.sh  # Generates config for legacy server

Note: The API key will be automatically loaded from your .env file in the project directory.

  3. Save the configuration and restart LM Studio
  4. The server will appear in the Integrations section

Standalone Usage

You can also run the server directly:

# FastMCP implementation (recommended)
uv run url-text-fetcher-fastmcp

# Legacy implementation
uv run url-text-fetcher

Examples

Once configured with LM Studio, you can ask the AI to:

  • "Fetch the text content from https://example.com"
  • "Get all the links from https://news.example.com"
  • "Search for 'Python web scraping' and show me the content from the top 3 results"
  • "What's the latest news about AI? Search and get the full articles"
  • "Find information about MCP servers and fetch the detailed content"

Dependencies

  • mcp>=1.12.3 - Model Context Protocol framework
  • requests>=2.31.0 - HTTP library for web requests and Brave Search API
  • beautifulsoup4>=4.12.0 - HTML parsing and text extraction

Configuration

The server can be configured via the .env file:

# Required: Brave Search API Key
BRAVE_API_KEY=your_api_key_here

# Brave Search API Rate Limit (requests per second)
# Free tier: 1 request per second (default)
# Paid tier: 20 requests per second  
# Higher tier: 50 requests per second
# Set this to match your subscription level
BRAVE_RATE_LIMIT_RPS=1

# Optional: Request timeout in seconds (default: 10)
REQUEST_TIMEOUT=10

# Optional: Content length limit in characters (default: 5000)
CONTENT_LENGTH_LIMIT=5000

# Optional: Maximum response size in bytes (default: 10MB)
MAX_RESPONSE_SIZE=10485760
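Reading these settings in Python is the usual environment-with-defaults pattern. A sketch with the documented defaults (the `load_config` helper is illustrative, not the server's actual code):

```python
import os

def load_config() -> dict:
    """Read the server's tunables from the environment, falling back
    to the documented defaults."""
    return {
        "rate_limit_rps": float(os.environ.get("BRAVE_RATE_LIMIT_RPS", "1")),
        "request_timeout": int(os.environ.get("REQUEST_TIMEOUT", "10")),
        "content_length_limit": int(os.environ.get("CONTENT_LENGTH_LIMIT", "5000")),
        "max_response_size": int(os.environ.get("MAX_RESPONSE_SIZE", str(10 * 1024 * 1024))),
    }
```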

Brave Search Subscription Tiers

The server automatically adjusts its rate limiting based on your Brave Search subscription:

  • Free Tier: 1 request per second (BRAVE_RATE_LIMIT_RPS=1)
  • Paid Tier: 20 requests per second (BRAVE_RATE_LIMIT_RPS=20)
  • Higher Tier: 50 requests per second (BRAVE_RATE_LIMIT_RPS=50)

The server will enforce the configured rate limit across all concurrent requests to ensure you stay within your API quota.
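A minimal thread-safe limiter of the kind described might look like this (an illustrative sketch, not the server's implementation): each call reserves the next allowed slot under a lock, then sleeps outside the lock if it arrived early.

```python
import threading
import time

class RateLimiter:
    """Space calls at least 1/rps seconds apart across all threads."""

    def __init__(self, rps: float) -> None:
        self._interval = 1.0 / rps
        self._lock = threading.Lock()
        self._next_allowed = 0.0

    def acquire(self) -> None:
        # Reserve the next slot atomically, then wait outside the lock
        # so other threads can queue up their own slots.
        with self._lock:
            now = time.monotonic()
            wait = self._next_allowed - now
            self._next_allowed = max(now, self._next_allowed) + self._interval
        if wait > 0:
            time.sleep(wait)
```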

See .env.example for a template.

Development

This project uses:

  • Python 3.13+
  • uv for dependency management
  • MCP SDK for protocol implementation

To set up for development:

  1. Clone the repository
  2. Run uv sync --dev --all-extras
  3. Make your changes
  4. Test with MCP-compatible clients

Troubleshooting

LM Studio Configuration Issues

If you see errors like "Failed to spawn: url-text-fetcher" in LM Studio logs:

  1. Run the configuration helper:

    ./configure_lmstudio.sh
    
  2. Make sure you're using full paths:

    • Use the full path to uv (e.g., /Users/username/.local/bin/uv)
    • Include the cwd (current working directory) in your configuration
    • Set the BRAVE_API_KEY environment variable
  3. Test the server manually:

    uv run url-text-fetcher
    

    The server should start and wait for input (press Ctrl+C to exit).

  4. Check your API key:

    # Check if your .env file has the API key set
    grep BRAVE_API_KEY .env
    

    Or test manually:

    export BRAVE_API_KEY=your_actual_api_key
    echo $BRAVE_API_KEY  # Should show your key
    

Common Issues

  • "BRAVE_API_KEY environment variable is required": Make sure your .env file contains BRAVE_API_KEY=your_actual_api_key
  • "Network error": Check your internet connection and API key validity
  • "Content truncated": Normal behavior for very long web pages (content is limited to 5000 characters by default)

Error Handling

The server includes robust error handling for:

  • Network timeouts (10-second default)
  • Invalid URLs
  • HTTP errors (4xx, 5xx responses)
  • Parsing failures
  • Missing API keys
  • General exceptions

All errors are returned as descriptive text messages to help users understand what went wrong.
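With requests, that pattern is commonly written as one try/except ladder that converts each failure class into a message. A sketch (illustrative only; the `safe_fetch` name and message wording are not the server's actual code):

```python
import requests

def safe_fetch(url: str, timeout: int = 10) -> str:
    """Fetch a URL, returning descriptive error text instead of raising,
    mirroring the failure classes listed above."""
    try:
        resp = requests.get(url, timeout=timeout)
        resp.raise_for_status()  # turn 4xx/5xx into HTTPError
        return resp.text
    except requests.exceptions.Timeout:
        return f"Error: request to {url} timed out after {timeout}s"
    except requests.exceptions.HTTPError as exc:
        return f"Error: HTTP {exc.response.status_code} from {url}"
    except requests.exceptions.RequestException as exc:
        # Covers invalid URLs, connection failures, and other transport errors.
        return f"Error: could not fetch {url}: {exc}"
```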


Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory /Users/wallison/TechProjects/mcp-server run url-text-fetcher

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

License

MIT License - see LICENSE file for details
