
mcp-sage
An MCP (Model Context Protocol) server that provides tools for sending prompts to another LLM (currently only Gemini 2.5 Pro), embedding the contents of all referenced file paths (recursively for folders) in the prompt. Useful for getting second opinions or detailed code reviews from a model that can accurately handle a very large amount of context.
Rationale
I make heavy use of Claude Code. It's a great product that works well for my workflow, but its context window is limited. Newer models with very large context windows are useful for more complex codebases where more context is needed. This server lets me keep using Claude Code as my development tool while leveraging Gemini 2.5 Pro's large context window to augment Claude Code's limited context.
Inspiration
This project draws inspiration from two other open source projects:
- simonw/files-to-prompt for the approach of packing files into a prompt
- asadm/vibemode for the idea and prompt to send the entire repo to Gemini for wholesale edit suggestions
Overview
This project implements an MCP server that exposes two tools:
second-opinion
- Takes a prompt and a list of file/dir paths as input
- Packs the files into a structured XML format (see the sketch after this list)
- Checks if the combined content is within Gemini's token limit (1M tokens)
- Sends the combined prompt + context to Gemini 2.5 Pro
- Returns the model's response
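The exact markup is produced by src/pack.ts. Purely as an illustration of the kind of document-wrapper format this refers to (the tag names below are an assumption modeled on files-to-prompt's XML output, not necessarily this tool's verbatim output), a packed context might look roughly like:
<documents>
  <document index="1">
    <source>src/example.js</source>
    <document_contents>
      // file contents embedded here
    </document_contents>
  </document>
</documents>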
expert-review
- Takes an instruction for code changes and a list of file/dir paths as input
- Packs the files into a structured XML format
- Checks if the combined content is within Gemini's token limit (1M tokens)
- Creates a specialized prompt instructing the model to format responses using SEARCH/REPLACE blocks
- Sends the combined context + instruction to Gemini 2.5 Pro
- Returns edit suggestions formatted as SEARCH/REPLACE blocks for easy implementation
Prerequisites
- Node.js (v18 or later)
- A Google Gemini API key
Installation
# Clone the repository
git clone https://github.com/your-username/mcp-sage.git
cd mcp-sage
# Install dependencies
npm install
# Build the project
npm run build
Environment Variables
Set the following environment variable:
- GEMINI_API_KEY: Your Google Gemini API key
Usage
After building with npm run build, add the following to your MCP configuration:
GEMINI_API_KEY=XXX node /path/to/this/repo/dist/index.js
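For MCP clients configured with JSON (for example, Claude Code or Claude Desktop), one possible entry is sketched below; the server name "sage" and the paths are illustrative and should be adjusted for your setup:
{
  "mcpServers": {
    "sage": {
      "command": "node",
      "args": ["/path/to/this/repo/dist/index.js"],
      "env": {
        "GEMINI_API_KEY": "XXX"
      }
    }
  }
}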
Prompting
To get a second opinion on something just ask for a second opinion.
To get a code review, ask for a code review or expert review.
Both of these benefit from providing paths of files that you want included in context, but if omitted, the host LLM will likely infer what to include.
Debugging and Monitoring
The server provides detailed monitoring information via the MCP logging capability. These logs include:
- Token usage statistics (tokens used vs. token limit)
- Number of files and documents included in the request
- Request processing time metrics
- Error information when token limits are exceeded
Logs are sent via the MCP protocol's notifications/message method, ensuring they don't interfere with the JSON-RPC communication. MCP clients with logging support will display these logs appropriately.
Example log entries:
Token usage: 1,234 / 1,000,000 tokens (0.12%)
Files included: 3, Document count: 3
Sending request to Gemini with 1,234 tokens...
Received response from Gemini in 982ms
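On the wire, each of these lines is carried inside an MCP logging notification. A minimal sketch of what such a JSON-RPC message might look like (the logger name and exact params layout are assumptions based on the MCP logging spec, not captured output from this server):
{
  "jsonrpc": "2.0",
  "method": "notifications/message",
  "params": {
    "level": "info",
    "logger": "mcp-sage",
    "data": "Token usage: 1,234 / 1,000,000 tokens (0.12%)"
  }
}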
Using the Tools
second-opinion Tool
The second-opinion tool accepts the following parameters:
- prompt (string, required): The prompt to send to Gemini
- paths (array of strings, required): List of file paths to include as context
Example MCP tool call (using JSON-RPC 2.0):
{
"jsonrpc": "2.0",
"id": 1,
"method": "tools/call",
"params": {
"name": "second-opinion",
"arguments": {
"prompt": "Explain how this code works",
"paths": ["path/to/file1.js", "path/to/file2.js"]
}
}
}
expert-review Tool
The expert-review tool accepts the following parameters:
- instruction (string, required): The specific changes or improvements needed
- paths (array of strings, required): List of file paths to include as context
Example MCP tool call (using JSON-RPC 2.0):
{
"jsonrpc": "2.0",
"id": 1,
"method": "tools/call",
"params": {
"name": "expert-review",
"arguments": {
"instruction": "Add error handling to the function",
"paths": ["path/to/file1.js", "path/to/file2.js"]
}
}
}
The response will contain SEARCH/REPLACE blocks that you can use to implement the suggested changes:
<<<<<<< SEARCH
function getData() {
return fetch('/api/data')
.then(res => res.json());
}
=======
function getData() {
return fetch('/api/data')
.then(res => {
if (!res.ok) {
throw new Error(`HTTP error! Status: ${res.status}`);
}
return res.json();
})
.catch(error => {
console.error('Error fetching data:', error);
throw error;
});
}
>>>>>>> REPLACE
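If you want to apply a suggested edit programmatically rather than by hand, a minimal TypeScript sketch is shown below. This helper is not part of the project; it assumes a single well-formed block and that the SEARCH text appears verbatim in the target file:
import { readFileSync, writeFileSync } from "fs";

// Apply one SEARCH/REPLACE block to a file.
// Returns false if the SEARCH text is not found verbatim.
function applySearchReplace(filePath: string, block: string): boolean {
  const match = block.match(
    /<<<<<<< SEARCH\n([\s\S]*?)\n=======\n([\s\S]*?)\n>>>>>>> REPLACE/
  );
  if (!match) {
    throw new Error("Malformed SEARCH/REPLACE block");
  }
  const [, search, replace] = match;
  const original = readFileSync(filePath, "utf8");
  if (!original.includes(search)) {
    return false;
  }
  // Use a replacer function so `$` sequences in the replacement are taken literally.
  writeFileSync(filePath, original.replace(search, () => replace), "utf8");
  return true;
}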
Running the Tests
To test the tools:
# Test the second-opinion tool
GEMINI_API_KEY=your_api_key_here node test/run-test.js
# Test the expert-review tool
GEMINI_API_KEY=your_api_key_here node test/test-expert.js
Project Structure
- src/index.ts: The main MCP server implementation with tool definitions
- src/pack.ts: Tool for packing files into a structured XML format
- src/tokenCounter.ts: Utilities for counting tokens in a prompt
- src/gemini.ts: Gemini API client implementation
- test/run-test.js: Test for the second-opinion tool
- test/test-expert.js: Test for the expert-review tool
License
ISC