Open Deep Research MCP Server
An AI-powered research assistant that performs deep, iterative research on any topic. It combines search engines, web scraping, and AI to explore topics in depth and generate comprehensive reports. Available as a Model Context Protocol (MCP) tool or standalone CLI. Look at exampleout.md to see what a report might look like.
Quick Start
- Clone and install:
```bash
git clone https://github.com/Ozamatash/deep-research
cd deep-research
npm install
```
- Set up the environment in `.env.local`:
```bash
# Copy the example environment file
cp .env.example .env.local
# Then fill in the required API keys (see .env.example for the variable names)
```
- Build:
```bash
# Build the server
npm run build
```
- Run the CLI version:
```bash
npm run start "Your research query here"
```
- Test the MCP server with Claude Desktop:
Follow the guide at the bottom of the MCP server quickstart to add the server to Claude Desktop (see the example config below):
https://modelcontextprotocol.io/quickstart/server
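For reference, registering the server in Claude Desktop's `claude_desktop_config.json` usually looks something like the sketch below. The server name and the path to the built entry point are assumptions here, not values from this repo; point `args` at whatever file `npm run build` actually produces for the MCP server.

```json
{
  "mcpServers": {
    "deep-research": {
      "command": "node",
      "args": ["/absolute/path/to/deep-research/dist/mcp-server.js"]
    }
  }
}
```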
Features
- Performs deep, iterative research by generating targeted search queries
- Controls research scope with depth (how deep) and breadth (how wide) parameters
- Evaluates source reliability with detailed scoring (0-1) and reasoning
- Prioritizes high-reliability sources (≥0.7) and verifies less reliable information (illustrated in the sketch after this list)
- Generates follow-up questions to better understand research needs
- Produces detailed markdown reports with findings, sources, and reliability assessments
- Available as a Model Context Protocol (MCP) tool for AI agents
- For now, the MCP version doesn't ask follow-up questions
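To make the reliability rule concrete, here is a minimal TypeScript sketch of how such a threshold could be applied. This is an illustration only; the interface and function names are assumptions, not this repository's actual code:

```typescript
// Illustrative sketch only; the types and names below are assumptions, not the repo's API.
interface SourcedLearning {
  learning: string;
  sourceUrl: string;
  reliability: number; // 0-1 score assigned during source evaluation
  reasoning: string;   // why the source received that score
}

const RELIABILITY_THRESHOLD = 0.7;

// Split findings into those that can be used as-is and those that should be
// cross-checked against higher-reliability sources before reaching the report.
function triageLearnings(learnings: SourcedLearning[]) {
  const trusted = learnings.filter((l) => l.reliability >= RELIABILITY_THRESHOLD);
  const needsVerification = learnings.filter((l) => l.reliability < RELIABILITY_THRESHOLD);
  return { trusted, needsVerification };
}
```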
How It Works
```mermaid
flowchart TB
    subgraph Input
        Q[User Query]
        B[Breadth Parameter]
        D[Depth Parameter]
        FQ[Feedback Questions]
    end

    subgraph Research[Deep Research]
        direction TB
        SQ[Generate SERP Queries]
        SR[Search]
        RE[Source Reliability Evaluation]
        PR[Process Results]
    end

    subgraph Results[Research Output]
        direction TB
        L((Learnings with
        Reliability Scores))
        SM((Source Metadata))
        ND((Next Directions:
        Prior Goals,
        New Questions))
    end

    %% Main Flow
    Q & FQ --> CQ[Combined Query]
    CQ & B & D --> SQ
    SQ --> SR
    SR --> RE
    RE --> PR

    %% Results Flow
    PR --> L
    PR --> SM
    PR --> ND

    %% Depth Decision and Recursion
    L & ND --> DP{depth > 0?}
    DP -->|Yes| SQ

    %% Final Output
    DP -->|No| MR[Markdown Report]

    %% Styling
    classDef input fill:#7bed9f,stroke:#2ed573,color:black
    classDef process fill:#70a1ff,stroke:#1e90ff,color:black
    classDef output fill:#ff4757,stroke:#ff6b81,color:black
    classDef results fill:#a8e6cf,stroke:#3b7a57,color:black,width:150px,height:150px

    class Q,B,D,FQ input
    class SQ,SR,RE,PR process
    class MR output
    class L,SM,ND results
```
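In TypeScript-flavoured pseudocode, the loop in the diagram is roughly the following. This is a simplified sketch rather than the actual implementation; the `declare`d helpers stand in for the real search, scraping, and LLM calls, and the breadth-halving on recursion is an assumption about how scope narrows at each level:

```typescript
// Simplified sketch of the flow above; helper names are assumptions, not the repo's API.
interface ProcessedResults {
  learnings: string[];      // learnings with reliability scores attached
  sources: string[];        // source metadata (URLs, titles, scores)
  nextQuestions: string[];  // follow-up directions for the next round
}

declare function generateSerpQueries(query: string, breadth: number): Promise<string[]>;
declare function searchAndEvaluate(serpQuery: string): Promise<ProcessedResults>;

async function deepResearch(
  query: string,
  breadth: number,
  depth: number,
  acc: ProcessedResults = { learnings: [], sources: [], nextQuestions: [] }
): Promise<ProcessedResults> {
  // Breadth controls how many SERP queries are generated per round.
  const serpQueries = await generateSerpQueries(query, breadth);

  for (const serpQuery of serpQueries) {
    // Search, scrape, score source reliability, and extract learnings.
    const processed = await searchAndEvaluate(serpQuery);
    acc.learnings.push(...processed.learnings);
    acc.sources.push(...processed.sources);
    acc.nextQuestions.push(...processed.nextQuestions);
  }

  // Depth controls how many times the loop recurses on the new directions.
  if (depth > 0 && acc.nextQuestions.length > 0) {
    return deepResearch(acc.nextQuestions.join("\n"), Math.ceil(breadth / 2), depth - 1, acc);
  }
  return acc; // the accumulated learnings and sources feed the markdown report
}
```

Each recursion consumes the follow-up questions from the previous round and decrements depth until it reaches zero, at which point the accumulated learnings and sources are turned into the markdown report.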
Advanced Setup
Using Local Firecrawl (Free Option)
Instead of using the Firecrawl API, you can run a local instance. You can use the official repo or my fork, which uses SearXNG as the search backend so you don't need a search API key:
- Set up local Firecrawl:
```bash
git clone https://github.com/Ozamatash/localfirecrawl
cd localfirecrawl
# Follow setup in localfirecrawl README
```
- Update `.env.local`:
```bash
FIRECRAWL_BASE_URL="http://localhost:3002"
```
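The base URL simply points the Firecrawl client at the local instance instead of the hosted API. As a rough sketch using the `@mendable/firecrawl-js` SDK (the `FIRECRAWL_KEY` variable name and this exact wiring are assumptions; only `FIRECRAWL_BASE_URL` is defined above):

```typescript
import FirecrawlApp from "@mendable/firecrawl-js";

// Sketch: point the client at the local instance when FIRECRAWL_BASE_URL is set.
// FIRECRAWL_KEY is an assumed variable name for the hosted-API key; a local
// instance generally does not need a real key.
const firecrawl = new FirecrawlApp({
  apiKey: process.env.FIRECRAWL_KEY ?? "",
  apiUrl: process.env.FIRECRAWL_BASE_URL, // e.g. "http://localhost:3002"
});
```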
Optional: Observability
Add observability to track research flows, queries, and results using Langfuse:
```bash
# Add to .env.local
LANGFUSE_PUBLIC_KEY="your_langfuse_public_key"
LANGFUSE_SECRET_KEY="your_langfuse_secret_key"
```
The app works normally without observability if no Langfuse keys are provided.
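Conceptually, the two keys just configure a Langfuse client that records traces for each research run. A minimal sketch with the `langfuse` Node SDK (not necessarily how this repository wires it up; the trace name and input are illustrative):

```typescript
import { Langfuse } from "langfuse";

// Only create the client when both keys are present, so the app still runs
// without observability configured (as noted above).
const langfuse =
  process.env.LANGFUSE_PUBLIC_KEY && process.env.LANGFUSE_SECRET_KEY
    ? new Langfuse({
        publicKey: process.env.LANGFUSE_PUBLIC_KEY,
        secretKey: process.env.LANGFUSE_SECRET_KEY,
      })
    : undefined;

// Illustrative trace for a single research run.
langfuse?.trace({ name: "deep-research-run", input: { query: "example query" } });
```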
License
MIT License