
Perplexity Insight MCP Server
An MCP server implementation for interfacing with the Perplexity AI API, providing advanced question answering capabilities through the standardised Model Context Protocol.
Features
- Seamless integration with Perplexity AI API
- Support for different Perplexity models (sonar-reasoning, sonar-pro, sonar-deep-research)
- Customisable system prompts and user queries
- Proper error handling and response formatting
- Rate limiting protection
- Easy integration with Windsurf IDE
Requirements
- Node.js 18+
- Perplexity API key
Installation
npm install
Environment Variables
Create a .env file with the following variables:
PERPLEXITY_API_KEY=your_api_key_here
Usage
Run the server:
npm start
API Tools
The server exposes the following tools:
- perplexity_ask - Send a direct question to Perplexity AI
- perplexity_search - Perform a search query with Perplexity AI
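If you want to drive these tools programmatically rather than through an IDE, the official MCP TypeScript SDK can launch the built server over stdio and call them directly. The sketch below is illustrative only: the tool names and the dist/index.js output path come from this README, but the `query` argument name is an assumption to verify against the server's actual tool schemas.

```typescript
// Minimal client sketch. Assumption: the tools accept a "query" string
// argument — check the schemas reported by listTools() before relying on it.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the built server as a child process over stdio.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"],
    env: { PERPLEXITY_API_KEY: process.env.PERPLEXITY_API_KEY ?? "" },
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Should list perplexity_ask and perplexity_search.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Ask a direct question via perplexity_ask.
  const result = await client.callTool({
    name: "perplexity_ask",
    arguments: { query: "What are the latest advancements in quantum computing?" },
  });
  console.log(JSON.stringify(result, null, 2));

  await client.close();
}

main().catch(console.error);
```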
Changing Models
Both tools support the following Perplexity models:
- sonar-reasoning (default) - Perplexity's reasoning-focused model, best for general questions
- sonar-pro - Enhanced model with improved capabilities for professional use cases
- sonar-deep-research - Specialised for in-depth research and complex queries
To specify a model when using the tools, include the model parameter in your request:
Ask Perplexity using sonar-deep-research: What are the latest advancements in quantum computing?
You can also customise the system prompt and maximum token count:
Search with Perplexity using sonar-pro with system prompt "You are a helpful research assistant" and max tokens 2000: Latest developments in renewable energy
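In a programmatic client (continuing the sketch from the API Tools section above), the same options would be passed as tool arguments. Only the model parameter is documented here; the exact keys for the query, system prompt, and token limit (shown below as query, system_prompt, and max_tokens) are assumptions to check against the tool's input schema.

```typescript
// Hypothetical argument names other than "model" — confirm against the
// tool's input schema reported by listTools() before relying on them.
const searchResult = await client.callTool({
  name: "perplexity_search",
  arguments: {
    query: "Latest developments in renewable energy",
    model: "sonar-pro",
    system_prompt: "You are a helpful research assistant",
    max_tokens: 2000,
  },
});
```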
Tool Response Format
The server follows the MCP specification for tool responses:
{
  content: [
    {
      type: "text",
      text: "Response content from Perplexity AI"
    }
  ],
  isError: false // or true if an error occurred
}
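On the consuming side, a client can treat that shape as a small typed structure. The helper below is a sketch of one way to read it, not part of this server's code; treating isError by throwing is an assumption about how you want errors surfaced.

```typescript
// Consumer-side sketch for the content/isError shape shown above.
interface ToolTextContent {
  type: "text";
  text: string;
}

interface ToolResult {
  content: ToolTextContent[];
  isError?: boolean;
}

// Pull the text out of a tool result, surfacing errors as exceptions.
function extractText(result: ToolResult): string {
  if (result.isError) {
    throw new Error(result.content[0]?.text ?? "Perplexity tool returned an error");
  }
  return result.content
    .filter((c) => c.type === "text")
    .map((c) => c.text)
    .join("\n");
}
```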
Windsurf Integration
Setting up in Windsurf
- Build the server:
  npm run build
- Open Windsurf and navigate to Settings
- Find the "AI Settings" or "Model Context Protocol" section
- Add a new MCP server with the following details:
  - Name: Perplexity Insight
  - Type: Local Process
  - Command: Path to your Node.js executable
  - Arguments: Path to your compiled index.js file
  - Working Directory: Path to your project directory
  - Environment Variables: Make sure to include PERPLEXITY_API_KEY=your_api_key_here
- Enable the server and restart Windsurf if necessary
Example Configuration
Here's an example configuration for the mcp_config.json file:
"perplexity-ask": {
"command": "node",
"args": [
"/path/to/perplexity-insight-MCP/dist/index.js"
],
"cwd": "/path/to/perplexity-insight-MCP",
"env": {
"PERPLEXITY_API_KEY": "pplx-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
}
}
Replace /path/to/perplexity-insight-MCP with the actual path to your installation directory and use your actual Perplexity API key.
Using Perplexity in Windsurf
- Use the AI Assistant panel to ask questions that will be directed to Perplexity
- For web searches, include specific terms like "search for" in your queries
- To change models, include the model name in your query as shown in the "Changing Models" section
- Windsurf will automatically use the appropriate Perplexity tool based on your query
Development
For local development:
npm run dev
Troubleshooting
If you encounter issues with the MCP server:
- Check that your API key is valid and properly set in the .env file (a quick standalone check is sketched after this list)
- Verify that the response format matches the MCP specification
- Look for any error messages in the server logs
- Ensure Windsurf is properly configured to use the MCP server
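To rule out a key or network problem independently of the MCP layer, you can call Perplexity's chat completions API directly with Node's built-in fetch (Node 18+). This is a diagnostic sketch, not part of the server; adjust the model name if your account offers different ones.

```typescript
// Standalone API-key check against Perplexity's chat completions endpoint.
// A 401 response usually means the key is missing or invalid.
const response = await fetch("https://api.perplexity.ai/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.PERPLEXITY_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "sonar-reasoning",
    messages: [{ role: "user", content: "ping" }],
  }),
});

console.log(response.status);
console.log(await response.json());
```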
License
MIT