OpenRouter MCP Server
MCP (Model Context Protocol) server for discovering and querying 300+ AI models available on OpenRouter. It lets agents list, search, filter, compare, and get detailed information about models, including pricing, context limits, and capabilities.
Features
- List models — Browse all available models with pricing, context limits, and capabilities
- Search & filter — Find models by provider, price, context length, features (tools, vision, etc.)
- Compare models — Side-by-side comparison of multiple models
- Get details — Full metadata for any specific model
- Cached responses — 5-minute cache to reduce API calls
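The 5-minute cache can be pictured as a simple TTL wrapper around the models endpoint. This is a minimal sketch, not the package's actual implementation; `fetch_models` is a hypothetical stand-in for the HTTP call to OpenRouter:

```python
import time

CACHE_TTL = 300  # seconds: model data is cached for 5 minutes

_cache = {"models": None, "fetched_at": 0.0}

def get_models(fetch_models, now=time.monotonic):
    """Return the cached model list, refetching once the entry is stale."""
    if _cache["models"] is None or now() - _cache["fetched_at"] > CACHE_TTL:
        _cache["models"] = fetch_models()  # e.g. GET https://openrouter.ai/api/v1/models
        _cache["fetched_at"] = now()
    return _cache["models"]
```

Passing the clock in as a parameter keeps the staleness logic easy to test without real waiting.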
Installation
pip install openrouter-mcp
Usage
With OpenClaw
Add to your openclaw.json MCP servers config:
{
  "mcp": {
    "servers": {
      "openrouter-models": {
        "command": "openrouter-mcp",
        "env": {
          "OPENROUTER_API_KEY": "your-api-key"
        }
      }
    }
  }
}
Then restart the gateway. Agents can now use the MCP tools to query OpenRouter models.
Note: OPENROUTER_API_KEY is optional but recommended for higher rate limits (200 req/min vs. 20 req/min). Get your key at: https://openrouter.ai/keys
Example agent usage:
# Agent can now call MCP tools like:
list_models(sort_by="context_length")
search_models(query="claude", max_input_price=5.0)
get_model(model_id="anthropic/claude-sonnet-4.6")
compare_models(model_ids="qwen/qwen3.6-plus,anthropic/claude-sonnet-4.6")
Standalone (stdio)
export OPENROUTER_API_KEY=your-key
python -m openrouter_mcp.server
Available Tools
| Tool | Description |
|---|---|
| list_models | List all models with optional modality filter and sorting |
| get_model | Get detailed info for a specific model by ID |
| search_models | Search and filter models by query, provider, price, context, features |
| compare_models | Compare multiple models side by side |
| refresh_cache | Force refresh the model cache from OpenRouter API |
Examples
List models sorted by context length
{
  "name": "list_models",
  "arguments": {
    "modality": "text",
    "sort_by": "context_length"
  }
}
Search for Claude models under $5/1M tokens
{
  "name": "search_models",
  "arguments": {
    "query": "claude",
    "provider": "anthropic",
    "max_input_price": 5.0,
    "requires_tools": true
  }
}
Compare 3 models
{
  "name": "compare_models",
  "arguments": {
    "model_ids": "anthropic/claude-sonnet-4.6,qwen/qwen3.6-plus,openai/gpt-5.4"
  }
}
Get model details
{
  "name": "get_model",
  "arguments": {
    "model_id": "anthropic/claude-sonnet-4.6"
  }
}
API Reference
list_models(modality, sort_by)
- modality (str, default: "text"): Filter by output type. Options: text, image, audio, embeddings, all
- sort_by (str, default: "name"): Sort by: name, created, price, context_length
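The sorting side of this tool can be sketched as a key lookup over model entries. This assumes entries shaped like OpenRouter's /api/v1/models payload, where pricing values are strings in USD per token; the server's real code may differ:

```python
def sort_models(models, sort_by="name"):
    """Sort model entries by name, created timestamp, prompt price, or context length."""
    keys = {
        "name": lambda m: m.get("name", "").lower(),
        "created": lambda m: m.get("created", 0),
        "price": lambda m: float(m.get("pricing", {}).get("prompt", "0")),
        "context_length": lambda m: m.get("context_length", 0) or 0,
    }
    return sorted(models, key=keys[sort_by])
```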
get_model(model_id)
model_id (str, required): Model slug, e.g. anthropic/claude-sonnet-4.6
search_models(query, provider, max_input_price, min_context, requires_tools, requires_vision, free_only)
- query (str): Free-text search in model name/id/description
- provider (str): Filter by provider (e.g. anthropic, google, openai)
- max_input_price (float): Max input price per 1M tokens (0 = no limit)
- min_context (int): Minimum context window size
- requires_tools (bool): Only models supporting tool calling
- requires_vision (bool): Only models with vision/image input
- free_only (bool): Only free models
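Conceptually, these filters compose as a chain of predicates over the model list. The sketch below uses illustrative flattened field names (input_price in USD per 1M tokens, supports_tools, supports_vision) rather than the exact shape the server uses internally:

```python
def search_models(models, query="", provider="", max_input_price=0.0,
                  min_context=0, requires_tools=False,
                  requires_vision=False, free_only=False):
    """Return model entries passing every active filter."""
    results = []
    for m in models:
        text = f"{m.get('id', '')} {m.get('name', '')} {m.get('description', '')}".lower()
        if query and query.lower() not in text:
            continue
        if provider and not m.get("id", "").startswith(provider + "/"):
            continue
        if max_input_price and m.get("input_price", 0.0) > max_input_price:
            continue
        if m.get("context_length", 0) < min_context:
            continue
        if requires_tools and not m.get("supports_tools"):
            continue
        if requires_vision and not m.get("supports_vision"):
            continue
        if free_only and m.get("input_price", 0.0) > 0:
            continue
        results.append(m)
    return results
```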
compare_models(model_ids)
model_ids (str, required): Comma-separated list of model IDs
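Parsing the comma-separated list and assembling a side-by-side view might look like this sketch (the field names chosen for comparison here are illustrative, not necessarily the ones the server emits):

```python
def compare_models(models_by_id, model_ids):
    """Build a side-by-side view keyed by model ID from a comma-separated id list."""
    ids = [s.strip() for s in model_ids.split(",") if s.strip()]
    fields = ("context_length", "input_price", "output_price")
    return {mid: {f: models_by_id[mid].get(f) for f in fields} for mid in ids}
```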
refresh_cache()
Force refresh the model cache from OpenRouter API.
Rate Limits
- Without API key: 20 requests/minute
- With API key: 200 requests/minute
- Model data is cached for 5 minutes
Get your API key at: https://openrouter.ai/keys
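When a client does hit the limit, a common pattern is to retry with exponential backoff on HTTP 429. A minimal sketch, where `do_request` is a hypothetical callable returning a `(status, body)` pair (not part of this package's API):

```python
import random
import time

def with_backoff(do_request, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry a request on HTTP 429, doubling the delay each attempt (plus jitter)."""
    status, body = do_request()
    for attempt in range(max_retries):
        if status != 429:
            break
        sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
        status, body = do_request()
    return status, body
```

Injecting `sleep` keeps the backoff schedule observable in tests without real delays.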
License
MIT
Contributing
Contributions welcome! Please open an issue or PR on GitHub.