MCP Server Firecrawl
A server that provides web scraping and intelligent content searching capabilities using the Firecrawl API, enabling AI agents to extract structured data from websites and perform content searches.
Tools
- crawl: Crawls a website starting from a base URL
- map: Maps a website's structure
- extract: Extracts structured data from URLs
- scrape_url: Scrapes content from a URL using the Firecrawl API
- search_content: Searches content using the Firecrawl API
README
Firecrawl MCP Server
A Model Context Protocol (MCP) server for web scraping, content searching, site crawling, and data extraction using the Firecrawl API.
Features
- Web Scraping: Extract content from any webpage with customizable options
  - Mobile device emulation
  - Ad and popup blocking
  - Content filtering
  - Structured data extraction
  - Multiple output formats
- Content Search: Intelligent search capabilities
  - Multi-language support
  - Location-based results
  - Customizable result limits
  - Structured output formats
- Site Crawling: Advanced web crawling functionality
  - Depth control
  - Path filtering
  - Rate limiting
  - Progress tracking
  - Sitemap integration
- Site Mapping: Generate site structure maps
  - Subdomain support
  - Search filtering
  - Link analysis
  - Visual hierarchy
- Data Extraction: Extract structured data from multiple URLs
  - Schema validation
  - Batch processing
  - Web search enrichment
  - Custom extraction prompts
Installation
# Global installation
npm install -g @modelcontextprotocol/mcp-server-firecrawl
# Local project installation
npm install @modelcontextprotocol/mcp-server-firecrawl
Quick Start
1. Get your Firecrawl API key from the developer portal.

2. Set your API key:

Unix/Linux/macOS (bash/zsh):
export FIRECRAWL_API_KEY=your-api-key

Windows (Command Prompt):
set FIRECRAWL_API_KEY=your-api-key

Windows (PowerShell):
$env:FIRECRAWL_API_KEY = "your-api-key"

Alternative: using a .env file (recommended for development):
# Install dotenv
npm install dotenv
# Create .env file
echo "FIRECRAWL_API_KEY=your-api-key" > .env

Then in your code:
import dotenv from 'dotenv';
dotenv.config();

3. Run the server:
mcp-server-firecrawl
Integration
Claude Desktop App
Add to your MCP settings:
{
  "firecrawl": {
    "command": "mcp-server-firecrawl",
    "env": {
      "FIRECRAWL_API_KEY": "your-api-key"
    }
  }
}
Claude VSCode Extension
Add to your MCP configuration:
{
  "mcpServers": {
    "firecrawl": {
      "command": "mcp-server-firecrawl",
      "env": {
        "FIRECRAWL_API_KEY": "your-api-key"
      }
    }
  }
}
Usage Examples
Web Scraping
// Basic scraping
{
  name: "scrape_url",
  arguments: {
    url: "https://example.com",
    formats: ["markdown"],
    onlyMainContent: true
  }
}

// Advanced extraction
{
  name: "scrape_url",
  arguments: {
    url: "https://example.com/blog",
    jsonOptions: {
      prompt: "Extract article content",
      schema: {
        title: "string",
        content: "string"
      }
    },
    mobile: true,
    blockAds: true
  }
}
Site Crawling
// Basic crawling
{
  name: "crawl",
  arguments: {
    url: "https://example.com",
    maxDepth: 2,
    limit: 100
  }
}

// Advanced crawling
{
  name: "crawl",
  arguments: {
    url: "https://example.com",
    maxDepth: 3,
    includePaths: ["/blog", "/products"],
    excludePaths: ["/admin"],
    ignoreQueryParameters: true
  }
}
Site Mapping
// Generate site map
{
  name: "map",
  arguments: {
    url: "https://example.com",
    includeSubdomains: true,
    limit: 1000
  }
}
Data Extraction
// Extract structured data
{
  name: "extract",
  arguments: {
    urls: ["https://example.com/product1", "https://example.com/product2"],
    prompt: "Extract product details",
    schema: {
      name: "string",
      price: "number",
      description: "string"
    }
  }
}
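Content Search
The search_content tool is listed above but has no example in this README, and its exact argument names are not documented here; the fields below (query, limit, lang, country) are assumptions based on the search features listed earlier, shown in the same call shape as the other examples:
// Search content (argument names are illustrative assumptions)
{
  name: "search_content",
  arguments: {
    query: "firecrawl mcp server",
    limit: 5,
    lang: "en",
    country: "us"
  }
}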
Configuration
See the configuration guide for detailed setup options.
API Documentation
See the API documentation for detailed endpoint specifications.
Development
# Install dependencies
npm install
# Build
npm run build
# Run tests
npm test
# Start in development mode
npm run dev
Examples
Check the examples directory for more usage examples:
- Basic scraping: scrape.ts
- Crawling and mapping: crawl-and-map.ts
Error Handling
The server implements robust error handling:
- Rate limiting with exponential backoff
- Automatic retries
- Detailed error messages
- Debug logging
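The retry behavior above is described only at a high level; the following is a minimal sketch of rate-limit handling with exponential backoff, assuming a plain fetch call against the Firecrawl API (the helper name, delay values, and status checks are illustrative, not the server's actual implementation):
// Illustrative sketch: retry a request with exponential backoff on 429/5xx responses.
// requestWithBackoff, the delays, and the retry limit are assumptions for illustration.
async function requestWithBackoff(
  url: string,
  init: RequestInit,
  maxRetries = 3,
  baseDelayMs = 500
): Promise<Response> {
  for (let attempt = 0; ; attempt++) {
    const response = await fetch(url, init);
    // Succeed on anything that is not rate limiting (429) or a transient server error (5xx).
    if (response.status !== 429 && response.status < 500) {
      return response;
    }
    if (attempt >= maxRetries) {
      throw new Error(`Request failed after ${maxRetries} retries (status ${response.status})`);
    }
    // Exponential backoff with a little jitter: ~500 ms, ~1 s, ~2 s, ...
    const delayMs = baseDelayMs * 2 ** attempt + Math.random() * 100;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
}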
Security
- API key protection
- Request validation
- Domain allowlisting
- Rate limiting
- Safe error messages
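Domain allowlisting and request validation are listed without configuration details; a minimal sketch of what such a check could look like, where ALLOWED_DOMAINS and assertAllowedUrl are hypothetical names used only for illustration:
// Illustrative sketch: validate a requested URL against a domain allowlist.
// ALLOWED_DOMAINS and assertAllowedUrl are hypothetical, not the server's actual configuration.
const ALLOWED_DOMAINS = ["example.com", "docs.example.com"];

function assertAllowedUrl(rawUrl: string): URL {
  const url = new URL(rawUrl); // throws on malformed input (basic request validation)
  const allowed = ALLOWED_DOMAINS.some(
    (domain) => url.hostname === domain || url.hostname.endsWith(`.${domain}`)
  );
  if (!allowed) {
    throw new Error(`Domain not allowlisted: ${url.hostname}`);
  }
  return url;
}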
Contributing
See CONTRIBUTING.md for contribution guidelines.
License
MIT License - see LICENSE for details.