
MCP Perplexica

MCP server proxy for Perplexica search API.

This server allows LLMs to perform web searches through Perplexica using the Model Context Protocol (MCP).

Features

  • 🔍 Web search through Perplexica
  • 📚 Multiple focus modes (web, academic, YouTube, Reddit, etc.)
  • ⚡ Configurable optimization modes (speed, balanced, quality)
  • 🔧 Customizable model configuration
  • 📖 Source citations in responses
  • 🚀 Multiple transport modes (stdio, SSE, Streamable HTTP)

Prerequisites

  • Python 3.11+
  • UV package manager
  • Running Perplexica instance

Installation

  1. Clone the repository:
git clone https://github.com/Kaiohz/mcp-perplexica.git
cd mcp-perplexica
  2. Install dependencies with UV:
uv sync
  3. Create your environment file:
cp .env.example .env
  4. Edit .env with your configuration:
# Perplexica API
PERPLEXICA_URL=http://localhost:3000

# Transport: stdio (default), sse, or streamable-http
TRANSPORT=stdio
HOST=127.0.0.1
PORT=8000

# Model configuration
DEFAULT_CHAT_MODEL_PROVIDER_ID=your-provider-id
DEFAULT_CHAT_MODEL_KEY=anthropic/claude-sonnet-4.5
DEFAULT_EMBEDDING_MODEL_PROVIDER_ID=your-provider-id
DEFAULT_EMBEDDING_MODEL_KEY=openai/text-embedding-3-small
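These model settings map onto the request body the proxy forwards to Perplexica's POST /api/search endpoint. A minimal sketch of that payload, assuming the field names from Perplexica's search API documentation (the provider and model values are placeholders; verify the exact shape against your instance):

```python
import json

# Illustrative body for Perplexica's POST /api/search endpoint.
# Field names follow Perplexica's documented search API; the model
# identifiers are placeholders, not recommendations.
payload = {
    "query": "latest developments in AI",
    "focusMode": "webSearch",        # one of the supported focus modes
    "optimizationMode": "balanced",  # speed | balanced | quality
    "chatModel": {
        "provider": "your-provider-id",
        "name": "anthropic/claude-sonnet-4.5",
    },
    "embeddingModel": {
        "provider": "your-provider-id",
        "name": "openai/text-embedding-3-small",
    },
}

print(json.dumps(payload, indent=2))
```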

Usage

Transport Modes

The server supports three transport modes:

| Transport | Description | Use Case |
| --- | --- | --- |
| stdio | Standard input/output | CLI tools, Claude Desktop |
| sse | Server-Sent Events over HTTP | Web clients |
| streamable-http | Streamable HTTP (recommended for production) | Production deployments |
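A minimal sketch of how startup can dispatch on the TRANSPORT environment variable (the function and constant names here are illustrative, not the project's actual code; the accepted values match the three transports above):

```python
import os

# Transport names accepted by the MCP Python SDK's FastMCP.run()
VALID_TRANSPORTS = {"stdio", "sse", "streamable-http"}

def resolve_transport(default: str = "stdio") -> str:
    """Read TRANSPORT from the environment and validate it."""
    transport = os.environ.get("TRANSPORT", default).strip().lower()
    if transport not in VALID_TRANSPORTS:
        raise ValueError(f"Unsupported transport: {transport!r}")
    return transport

# The entry point would then hand this to the server,
# e.g. mcp.run(transport=resolve_transport())
print(resolve_transport())
```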

Running with Docker Compose

The easiest way to run Perplexica and MCP Perplexica together is with Docker Compose:

# Copy and configure environment files
cp .env.example .env
cp .env.perplexica.example .env.perplexica

# Edit .env with your MCP Perplexica settings
# Edit .env.perplexica with your Perplexica settings

# Start services
docker compose up -d

This starts:

  • Perplexica on http://localhost:3000
  • MCP Perplexica connected to Perplexica

Running the MCP Server (without Docker)

Stdio mode (default)

uv run python -m main

SSE mode

TRANSPORT=sse PORT=8000 uv run python -m main

Streamable HTTP mode

TRANSPORT=streamable-http PORT=8000 uv run python -m main

Claude Desktop Configuration

Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "perplexica": {
      "command": "uv",
      "args": ["run", "--directory", "/path/to/mcp-perplexica", "python", "-m", "main"],
      "env": {
        "PERPLEXICA_URL": "http://localhost:3000",
        "TRANSPORT": "stdio",
        "DEFAULT_CHAT_MODEL_PROVIDER_ID": "your-provider-id",
        "DEFAULT_CHAT_MODEL_KEY": "anthropic/claude-sonnet-4.5",
        "DEFAULT_EMBEDDING_MODEL_PROVIDER_ID": "your-provider-id",
        "DEFAULT_EMBEDDING_MODEL_KEY": "openai/text-embedding-3-small"
      }
    }
  }
}

Claude Code Configuration

For HTTP-based transports, you can add the server to Claude Code:

# Start the server with streamable-http transport
TRANSPORT=streamable-http PORT=8000 uv run python -m main

# Add to Claude Code
claude mcp add --transport http perplexica http://localhost:8000/mcp

Available Tools

search

Perform a web search using Perplexica.

Parameters:

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| query | string | Yes | The search query |
| focus_mode | string | No | Search focus: webSearch, academicSearch, writingAssistant, wolframAlphaSearch, youtubeSearch, redditSearch |
| optimization_mode | string | No | Optimization: speed, balanced, quality |
| system_instructions | string | No | Custom instructions for the AI response |
| chat_model_provider_id | string | No | Override the default chat model provider |
| chat_model_key | string | No | Override the default chat model |
| embedding_model_provider_id | string | No | Override the default embedding provider |
| embedding_model_key | string | No | Override the default embedding model |

Example:

Search for "latest developments in AI" using academic focus
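On the wire, an MCP client invokes this tool with a standard tools/call request. A sketch of the JSON-RPC message, following the MCP specification (the argument values are illustrative):

```python
import json

# JSON-RPC 2.0 request an MCP client would send to invoke the search tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search",
        "arguments": {
            "query": "latest developments in AI",
            "focus_mode": "academicSearch",
            "optimization_mode": "quality",
        },
    },
}

print(json.dumps(request, indent=2))
```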

Development

Install dev dependencies

uv sync --dev

Run tests

uv run pytest

Run linter and formatters

uv run ruff check .
uv run ruff format .
uv run black src/

Architecture

This project follows hexagonal architecture:

src/
├── main.py              # MCP server entry point
├── config.py            # Pydantic Settings
├── dependencies.py      # Dependency injection
├── domain/              # Business core (pure Python)
│   ├── entities.py      # Dataclasses
│   └── ports.py         # ABC interfaces
├── application/         # Use cases
│   ├── requests.py      # Pydantic DTOs
│   └── use_cases.py     # Business logic
└── infrastructure/      # External adapters
    └── perplexica/
        └── adapter.py   # HTTP client
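In this layout, the domain layer defines abstract ports that the infrastructure adapters implement, so the business logic never touches HTTP details directly. A minimal sketch of that pattern (the class and method names are illustrative, not the project's actual interfaces):

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field

# domain/entities.py -- plain dataclass, no framework dependencies
@dataclass
class SearchResult:
    answer: str
    sources: list[str] = field(default_factory=list)

# domain/ports.py -- abstract interface the use case depends on
class SearchPort(ABC):
    @abstractmethod
    def search(self, query: str, focus_mode: str) -> SearchResult: ...

# infrastructure/perplexica/adapter.py -- concrete adapter; a real one
# would call Perplexica's HTTP API instead of returning canned data
class PerplexicaAdapter(SearchPort):
    def search(self, query: str, focus_mode: str) -> SearchResult:
        return SearchResult(answer=f"stub answer for {query!r}", sources=[])

# application/use_cases.py -- business logic sees only the port
def run_search(port: SearchPort, query: str) -> SearchResult:
    return port.search(query, focus_mode="webSearch")
```

Swapping Perplexica for another search backend then only requires a new adapter; the domain and application layers stay unchanged.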

License

MIT
