Tavily Search MCP Server
An MCP server implementation that integrates the Tavily Search API, providing optimized search capabilities for LLMs.
<a href="https://glama.ai/mcp/servers/0kmdibf9t1"><img width="380" height="200" src="https://glama.ai/mcp/servers/0kmdibf9t1/badge" alt="tavily-search-mcp-server MCP server" /></a>
Features
- Web Search: Perform web searches optimized for LLMs, with control over search depth, topic, and time range.
- Content Extraction: Extracts the most relevant content from search results, optimizing for quality and size.
- Optional Features: Include images, image descriptions, short LLM-generated answers, and raw HTML content.
- Domain Filtering: Include or exclude specific domains in search results.
Tools
- tavily_search
  - Execute web searches using the Tavily Search API, optimized for LLMs. Use this for broad information gathering, recent events, or when you need diverse web sources. Supports search depth, topic selection, time range filtering, and domain inclusion/exclusion.
  - Inputs:
    - `query` (string, required): The search query.
    - `search_depth` (string, optional): "basic" or "advanced" (default: "basic").
    - `topic` (string, optional): "general" or "news" (default: "general").
    - `days` (number, optional): Number of days back for news search (default: 3).
    - `time_range` (string, optional): Time range filter ("day", "week", "month", "year" or "d", "w", "m", "y").
    - `max_results` (number, optional): Maximum number of results (default: 5).
    - `include_images` (boolean, optional): Include related images (default: false).
    - `include_image_descriptions` (boolean, optional): Include descriptions for images (default: false).
    - `include_answer` (boolean, optional): Include a short LLM-generated answer (default: false).
    - `include_raw_content` (boolean, optional): Include raw HTML content (default: false).
    - `include_domains` (string[], optional): Domains to include.
    - `exclude_domains` (string[], optional): Domains to exclude.
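For reference, the arguments an MCP client might pass when calling `tavily_search` could look like the following. This is an illustrative sketch only; the query value is made up, and the exact request envelope depends on your MCP client.

```json
{
  "name": "tavily_search",
  "arguments": {
    "query": "latest developments in fusion energy",
    "search_depth": "advanced",
    "topic": "news",
    "days": 7,
    "max_results": 5,
    "include_answer": true,
    "exclude_domains": ["pinterest.com"]
  }
}
```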
Setup Guide 🚀
1. Prerequisites
- Claude Desktop installed on your computer.
- A Tavily API key:
  a. Sign up for a Tavily API account.
  b. Choose a plan (Free tier available).
  c. Generate your API key from the Tavily dashboard.
2. Installation
- Clone this repository somewhere on your computer:

  ```bash
  git clone https://github.com/apappascs/tavily-search-mcp-server.git
  ```

- Install dependencies & build the project:

  ```bash
  cd tavily-search-mcp-server
  npm install
  npm run build
  ```
3. Integration with Claude Desktop
- Open your Claude Desktop configuration file:

  ```bash
  # On Mac:
  ~/Library/Application\ Support/Claude/claude_desktop_config.json

  # On Windows:
  %APPDATA%\Claude\claude_desktop_config.json
  ```
- Add one of the following to the `mcpServers` object in your config, depending on whether you want to run the server using `npm` or `docker`:

  Option A: Using NPM (stdio transport)

  ```json
  {
    "mcpServers": {
      "tavily-search-server": {
        "command": "node",
        "args": [
          "/Users/<username>/<FULL_PATH...>/tavily-search-mcp-server/dist/index.js"
        ],
        "env": {
          "TAVILY_API_KEY": "your_api_key_here"
        }
      }
    }
  }
  ```

  Option B: Using NPM (SSE transport)

  ```json
  {
    "mcpServers": {
      "tavily-search-server": {
        "command": "node",
        "args": [
          "/Users/<username>/<FULL_PATH...>/tavily-search-mcp-server/dist/sse.js"
        ],
        "env": {
          "TAVILY_API_KEY": "your_api_key_here"
        },
        "port": 3001
      }
    }
  }
  ```

  Option C: Using Docker

  ```json
  {
    "mcpServers": {
      "tavily-search-server": {
        "command": "docker",
        "args": [
          "run", "-i", "--rm",
          "-e", "TAVILY_API_KEY",
          "-v", "/Users/<username>/<FULL_PATH...>/tavily-search-mcp-server:/app",
          "tavily-search-mcp-server"
        ],
        "env": {
          "TAVILY_API_KEY": "your_api_key_here"
        }
      }
    }
  }
  ```
- Important Steps:
  - Replace `/Users/<username>/<FULL_PATH...>/tavily-search-mcp-server` with the actual full path to where you cloned the repository.
  - Add your Tavily API key in the `env` section. It is always better to keep secrets like API keys in environment variables.
  - Make sure to use forward slashes (`/`) in the path, even on Windows.
  - If you are using Docker, make sure you build the image first using `docker build -t tavily-search-mcp-server:latest .`
- Restart Claude Desktop for the changes to take effect.
Installing via Smithery
To install Tavily Search for Claude Desktop automatically via Smithery:
```bash
npx -y @smithery/cli install @apappascs/tavily-search-mcp-server --client claude
```
Environment Setup (for npm)
- Copy `.env.example` to `.env`:

  ```bash
  cp .env.example .env
  ```

- Update the `.env` file with your actual Tavily API key:

  ```
  TAVILY_API_KEY=your_api_key_here
  ```

  Note: Never commit your actual API key to version control. The `.env` file is ignored by git for security reasons.
Running with NPM
Start the server using Node.js:
```bash
node dist/index.js
```

For SSE transport:

```bash
node dist/sse.js
```
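To sanity-check the SSE transport, you can probe the server with curl. The port matches the `port: 3001` value from the config above; the `/sse` path is an assumption based on common MCP SSE setups, so verify the actual endpoint in `dist/sse.js`.

```bash
# Assumes the server listens on port 3001 and streams events at /sse
# (hypothetical path -- check dist/sse.js for the real one).
curl -N http://localhost:3001/sse
```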
Running with Docker
- Build the Docker image (if you haven't already):

  ```bash
  docker build -t tavily-search-mcp-server:latest .
  ```
- Run the Docker container with:

  For stdio transport:

  ```bash
  docker run -it --rm -e TAVILY_API_KEY="your_api_key_here" tavily-search-mcp-server:latest
  ```

  For SSE transport:

  ```bash
  docker run -it --rm -p 3001:3001 -e TAVILY_API_KEY="your_api_key_here" -e TRANSPORT="sse" tavily-search-mcp-server:latest
  ```

  You can also leverage your shell's environment variables directly, which is a more secure practice:

  ```bash
  docker run -it --rm -p 3001:3001 -e TAVILY_API_KEY=$TAVILY_API_KEY -e TRANSPORT="sse" tavily-search-mcp-server:latest
  ```

  Note: The last command demonstrates the recommended approach of using `-e TAVILY_API_KEY=$TAVILY_API_KEY` to pass the value of your `TAVILY_API_KEY` environment variable into the Docker container. This keeps your API key out of your command history and is generally preferred over hardcoding secrets in commands.
- Using docker compose

  Run:

  ```bash
  docker compose up -d
  ```

  To stop the server:

  ```bash
  docker compose down
  ```
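The `docker compose up -d` command expects a compose file in the repository root. Purely as an illustration of what such a file can look like for this server, here is a minimal sketch mirroring the SSE `docker run` command above; it is an assumption, not the repository's actual docker-compose.yml.

```yaml
# Illustrative sketch only -- not the repository's actual compose file.
services:
  tavily-search-mcp-server:
    build: .
    environment:
      - TAVILY_API_KEY=${TAVILY_API_KEY}   # read from your shell environment
      - TRANSPORT=sse
    ports:
      - "3001:3001"
```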
License
This MCP server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.