Crawl4AI MCP Server

A Model Context Protocol (MCP) server implementation that integrates Crawl4AI with Cursor AI, providing web scraping and crawling capabilities as tools for LLMs in Cursor Composer's agent mode.

System Requirements

Python 3.10 or higher.

Current Features

  • Single page scraping
  • Website crawling

Installation

Basic setup instructions are also available in the official docs for the MCP Server QuickStart.

Set up your environment

First, let's install uv and set up our Python project and environment:

macOS/Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

Windows:

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

Make sure to restart your terminal afterwards to ensure that the uv command gets picked up.

After that:

  1. Clone the repository

  2. Install dependencies using UV:

# Navigate to the crawl4ai-mcp directory
cd crawl4ai-mcp

# Install dependencies (first time only)
uv venv
uv sync

# Activate the venv
source .venv/bin/activate

# Run the server
python main.py
  3. Add to Cursor's MCP servers or Claude's MCP servers:

You may need to put the full path to the uv executable in the command field. You can find it by running which uv on macOS/Linux or where uv on Windows.

{
  "mcpServers": {
    "Crawl4AI": {
      "command": "uv",
      "args": [
        "--directory",
        "/ABSOLUTE/PATH/TO/PARENT/FOLDER/crawl4ai-mcp",
        "run",
        "main.py"
      ]
    }
  }
}
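If the server does not show up in your client, a common cause is malformed JSON in the config file. A quick stdlib sanity check of the entry above (the --directory value is the same placeholder path and must be replaced with your real clone location):

```python
import json

# The mcpServers entry from above; the --directory path is a placeholder.
config = """
{
  "mcpServers": {
    "Crawl4AI": {
      "command": "uv",
      "args": [
        "--directory",
        "/ABSOLUTE/PATH/TO/PARENT/FOLDER/crawl4ai-mcp",
        "run",
        "main.py"
      ]
    }
  }
}
"""

# json.loads raises ValueError on any syntax error (trailing comma, etc.)
parsed = json.loads(config)
server = parsed["mcpServers"]["Crawl4AI"]
print(server["command"])  # uv
```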

Tools Provided

This MCP server exposes the following tools to the LLM:

  1. scrape_webpage(url: str)

    • Description: Scrapes the content and metadata from a single webpage using Crawl4AI.
    • Parameters:
      • url (string, required): The URL of the webpage to scrape.
    • Returns: A list containing a TextContent object with the scraped content (primarily markdown) as JSON.
  2. crawl_website(url: str, crawl_depth: int = 1, max_pages: int = 5)

    • Description: Crawls a website starting from the given URL up to a specified depth and page limit using Crawl4AI.
    • Parameters:
      • url (string, required): The starting URL to crawl.
      • crawl_depth (integer, optional, default: 1): The maximum depth to crawl relative to the starting URL.
      • max_pages (integer, optional, default: 5): The maximum number of pages to scrape during the crawl.
    • Returns: A list containing a TextContent object with a JSON array of results for the crawled pages (including URL, success status, markdown content, or error).
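Since both tools return their payload as JSON inside a TextContent object, a client can post-process results with a plain JSON parse. A hypothetical sketch, using stdlib only; the field names (url, success, markdown, error) follow the description above but should be verified against the actual server output:

```python
import json

# Hypothetical crawl_website output; field names are assumptions
# based on the tool description (URL, success status, markdown, error).
raw = json.dumps([
    {"url": "https://example.com", "success": True, "markdown": "# Example Domain"},
    {"url": "https://example.com/missing", "success": False, "error": "HTTP 404"},
])

results = json.loads(raw)
# Split pages into successful scrapes and failures for reporting.
succeeded = [r["url"] for r in results if r.get("success")]
failed = [(r["url"], r.get("error")) for r in results if not r.get("success")]
print(succeeded)  # ['https://example.com']
print(failed)     # [('https://example.com/missing', 'HTTP 404')]
```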
