OpenRouter MCP Multimodal Server
An MCP (Model Context Protocol) server that provides chat and image analysis capabilities through OpenRouter.ai's diverse model ecosystem. This server combines text chat functionality with powerful image analysis capabilities.
Features
- Text Chat:
  - Direct access to all OpenRouter.ai chat models
  - Support for simple text and multimodal conversations
  - Configurable temperature and other parameters
- Image Analysis:
  - Analyze single images with custom questions
  - Process multiple images simultaneously
  - Automatic image resizing and optimization
  - Support for various image sources (local files, URLs, data URLs)
- Model Selection:
  - Search and filter available models
  - Validate model IDs
  - Get detailed model information
  - Support for default model configuration
- Performance Optimization:
  - Smart model information caching
  - Exponential backoff for retries (see the sketch after this list)
  - Automatic rate limit handling
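The retry behavior follows the standard exponential backoff pattern. The snippet below is a minimal illustrative sketch, not the server's actual code; the delay values, retry count, and helper name are assumptions:

// Illustrative sketch of exponential backoff on rate-limited or failed requests.
// Not the server's actual implementation; delays and names are assumptions.
async function requestWithBackoff(makeRequest, maxRetries = 5) {
  let delayMs = 500;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await makeRequest();
    } catch (err) {
      const status = err?.status ?? err?.response?.status;
      const retryable = status === 429 || (status >= 500 && status < 600);
      if (!retryable || attempt === maxRetries) throw err;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
      delayMs *= 2; // double the wait after each failed attempt
    }
  }
}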
What's New in 1.5.0
- Improved OS Compatibility:
  - Enhanced path handling for Windows, macOS, and Linux
  - Better support for Windows-style paths with drive letters
  - Normalized path processing for consistent behavior across platforms
- MCP Configuration Support:
  - Cursor MCP integration without requiring environment variables
  - Direct configuration via MCP parameters
  - Flexible API key and model specification options
- Robust Error Handling:
  - Improved fallback mechanisms for image processing
  - Better error reporting with specific diagnostics
  - Multiple backup strategies for file reading
- Image Processing Enhancements:
  - More reliable base64 encoding for all image types
  - Fallback options when Sharp module is unavailable (see the sketch after this list)
  - Better handling of large images with automatic optimization
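Conceptually, the optimization-with-fallback path looks like the sketch below. It is illustrative only; the function name, size cap, quality setting, and fallback MIME type are assumptions rather than the server's actual implementation:

import fs from 'fs/promises';

// Illustrative sketch: resize/compress with Sharp when available,
// otherwise fall back to base64-encoding the raw file bytes.
async function imageToDataUrl(filePath) {
  const raw = await fs.readFile(filePath);
  try {
    const sharp = (await import('sharp')).default;
    const optimized = await sharp(raw)
      .resize({ width: 1024, withoutEnlargement: true }) // cap width, never upscale
      .jpeg({ quality: 80 })
      .toBuffer();
    return `data:image/jpeg;base64,${optimized.toString('base64')}`;
  } catch {
    // Sharp unavailable or failed: encode the original bytes as-is
    // (MIME type assumed to be PNG here for simplicity).
    return `data:image/png;base64,${raw.toString('base64')}`;
  }
}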
Installation
Option 1: Install via npm
npm install -g @stabgan/openrouter-mcp-multimodal
Option 2: Run via Docker
docker run -i -e OPENROUTER_API_KEY=your-api-key-here stabgandocker/openrouter-mcp-multimodal:latest
Quick Start Configuration
Prerequisites
- Get your OpenRouter API key from the OpenRouter Keys page
- Choose a default model (optional)
MCP Configuration Options
Add one of the following configurations to your MCP settings file (e.g., cline_mcp_settings.json or claude_desktop_config.json):
Option 1: Using npx (Node.js)
{
  "mcpServers": {
    "openrouter": {
      "command": "npx",
      "args": [
        "-y",
        "@stabgan/openrouter-mcp-multimodal"
      ],
      "env": {
        "OPENROUTER_API_KEY": "your-api-key-here",
        "DEFAULT_MODEL": "qwen/qwen2.5-vl-32b-instruct:free"
      }
    }
  }
}
Option 2: Using uv (Python Package Manager)
{
  "mcpServers": {
    "openrouter": {
      "command": "uv",
      "args": [
        "run",
        "-m",
        "openrouter_mcp_multimodal"
      ],
      "env": {
        "OPENROUTER_API_KEY": "your-api-key-here",
        "DEFAULT_MODEL": "qwen/qwen2.5-vl-32b-instruct:free"
      }
    }
  }
}
Option 3: Using Docker
{
  "mcpServers": {
    "openrouter": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-e", "OPENROUTER_API_KEY=your-api-key-here",
        "-e", "DEFAULT_MODEL=qwen/qwen2.5-vl-32b-instruct:free",
        "stabgandocker/openrouter-mcp-multimodal:latest"
      ]
    }
  }
}
Option 4: Using Smithery (recommended)
{
  "mcpServers": {
    "openrouter": {
      "command": "smithery",
      "args": [
        "run",
        "stabgan/openrouter-mcp-multimodal"
      ],
      "env": {
        "OPENROUTER_API_KEY": "your-api-key-here",
        "DEFAULT_MODEL": "qwen/qwen2.5-vl-32b-instruct:free"
      }
    }
  }
}
Examples
For comprehensive examples of how to use this MCP server, check out the examples directory. We provide:
- JavaScript examples for Node.js applications
- Python examples with interactive chat capabilities
- Code snippets for integrating with various applications
Each example comes with clear documentation and step-by-step instructions.
Dependencies
This project uses the following key dependencies:
- @modelcontextprotocol/sdk: ^1.8.0 - Latest MCP SDK for tool implementation
- openai: ^4.89.1 - OpenAI-compatible API client for OpenRouter
- sharp: ^0.33.5 - Fast image processing library
- axios: ^1.8.4 - HTTP client for API requests
- node-fetch: ^3.3.2 - Modern fetch implementation
Node.js 18 or later is required. All dependencies are regularly updated to ensure compatibility and security.
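Because OpenRouter exposes an OpenAI-compatible API, the openai package is pointed at OpenRouter's endpoint rather than OpenAI's. A minimal standalone sketch of that pattern is shown below; it calls OpenRouter directly and bypasses the MCP server, and the model and prompt are placeholders:

import OpenAI from 'openai';

// Minimal sketch: using the openai client against OpenRouter's endpoint.
// Calls OpenRouter directly; the model and prompt are placeholders.
const client = new OpenAI({
  apiKey: process.env.OPENROUTER_API_KEY,
  baseURL: 'https://openrouter.ai/api/v1',
});

const completion = await client.chat.completions.create({
  model: 'qwen/qwen2.5-vl-32b-instruct:free', // same default model used in the configs above
  messages: [{ role: 'user', content: 'Hello from OpenRouter!' }],
});
console.log(completion.choices[0].message.content);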
Available Tools
mcp_openrouter_chat_completion
Send text or multimodal messages to OpenRouter models:
use_mcp_tool({
  server_name: "openrouter",
  tool_name: "mcp_openrouter_chat_completion",
  arguments: {
    model: "google/gemini-2.5-pro-exp-03-25:free", // Optional if default is set
    messages: [
      {
        role: "system",
        content: "You are a helpful assistant."
      },
      {
        role: "user",
        content: "What is the capital of France?"
      }
    ],
    temperature: 0.7 // Optional, defaults to 1.0
  }
});
For multimodal messages with images:
use_mcp_tool({
  server_name: "openrouter",
  tool_name: "mcp_openrouter_chat_completion",
  arguments: {
    model: "anthropic/claude-3.5-sonnet",
    messages: [
      {
        role: "user",
        content: [
          {
            type: "text",
            text: "What's in this image?"
          },
          {
            type: "image_url",
            image_url: {
              url: "https://example.com/image.jpg"
            }
          }
        ]
      }
    ]
  }
});
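Local files can also be supplied as data URLs (one of the image sources listed under Features). A minimal Node.js sketch of building one before passing it as the url value follows; the file path and MIME type are placeholders:

import fs from 'fs/promises';

// Read a local image and wrap it as a base64 data URL for the image_url field.
// Path and MIME type are placeholders for illustration.
const bytes = await fs.readFile('./photo.jpg');
const dataUrl = `data:image/jpeg;base64,${bytes.toString('base64')}`;

// Then use it in place of the https:// URL above:
// image_url: { url: dataUrl }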