
# Gemini MCP Tool

A simple Model Context Protocol (MCP) server that allows AI assistants to interact with the Google Gemini CLI. It enables the AI to leverage Gemini's massive token window for large-scale analysis, especially of large files and codebases, using the `@` syntax to point Gemini at them.
**TLDR:**

- Install, ask Claude naturally to use Gemini, save tokens.
- Add this to your Claude config:

  ```json
  "gemini-cli": {
    "command": "npx",
    "args": ["-y", "gemini-mcp-tool"]
  }
  ```

- Run `claude mcp add-from-claude-desktop` in the directory where you want to use gemini-cli as an MCP server.
- Make sure you select the MCPs you want to import (it defaults to all).
- Then run Claude Code in the same directory; that directory will now be configured. A minimal end-to-end sequence is sketched below.
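For reference, a minimal end-to-end setup could look like the following shell session. The project path is a placeholder, and `claude mcp add-from-claude-desktop` will prompt you to choose which servers to import:

```bash
# 1. After adding the entry to your Claude Desktop config (see "Configuration" below),
#    import it into Claude Code from inside your project directory.
cd ~/projects/my-project              # placeholder path
claude mcp add-from-claude-desktop    # select "gemini-cli" when prompted

# 2. Start Claude Code in the same directory and use Gemini via natural language.
claude
# > use gemini to explain @src/main.js
```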
## Prerequisites

- Node.js (v16.0.0 or higher)
- Google Gemini CLI installed and configured on your system.
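As a quick sanity check, you can verify both prerequisites from a terminal. This assumes the Gemini CLI is available as the `gemini` command; adjust if your installation uses a different name:

```bash
node --version    # should print v16.0.0 or higher
gemini --version  # confirms the Gemini CLI is installed and on your PATH
```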
## Installation

You can use this tool without installation via `npx`, which is the recommended approach. However, if you prefer to install it globally, you can do so:

```bash
# To install globally (optional)
npm install -g gemini-mcp-tool
```
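To confirm the package resolves, you can also launch it once directly with `npx`. As an MCP server it communicates over stdio, so it will simply sit and wait for a client until you interrupt it:

```bash
npx -y gemini-mcp-tool   # starts the MCP server on stdio; press Ctrl+C to stop
```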
## Configuration

You need to register the MCP server with your MCP client (e.g., Claude Code, Claude Desktop, etc.).

### Recommended: Using npx (No Installation Needed)

Add the following to your Claude Desktop configuration file:
```json
{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "gemini-mcp-tool"]
    }
  }
}
```
### Using a Global Installation

If you installed the package globally, use this configuration:
```json
{
  "mcpServers": {
    "gemini-cli": {
      "command": "gemini-mcp"
    }
  }
}
```
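If you are working only in Claude Code, you can also register the server from the command line instead of editing a config file. The exact flags may vary between Claude Code versions, so treat this as a sketch and check `claude mcp add --help` if it does not match your version:

```bash
# Register the npx-based server with Claude Code under the name "gemini-cli".
claude mcp add gemini-cli -- npx -y gemini-mcp-tool
```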
**Configuration File Locations:**

- Claude Desktop:
  - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
  - Windows: `%APPDATA%\Claude\claude_desktop_config.json`
  - Linux: `~/.config/claude/claude_desktop_config.json`

After updating the configuration, restart your terminal session.
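For example, on macOS you could inspect and edit the file like this (the path is the Claude Desktop default from the list above; substitute your preferred editor):

```bash
# View the current Claude Desktop config.
cat "$HOME/Library/Application Support/Claude/claude_desktop_config.json"

# Open it in TextEdit to add the "gemini-cli" entry.
open -e "$HOME/Library/Application Support/Claude/claude_desktop_config.json"
```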
## Available Commands

- Use natural language: "use gemini to explain index.html", "understand the massive project using gemini", "ask gemini to search for latest news"
- In Claude Code, type `/gemini-cli` and the commands should populate.
## Usage Examples

### With File References (using @ syntax)

- "ask gemini to analyze @src/main.js and explain what it does"
- "use gemini to summarize @. the current directory"
- "analyze @package.json and tell me about dependencies"

### General Questions (without files)

- "ask gemini to search for the latest tech news"
- "use gemini to explain quantum computing"
- "ask gemini about best practices for React development related to @file_im_confused_about"

### Using Gemini CLI's Sandbox Mode (-s)

The sandbox mode allows you to safely test code changes, run scripts, or execute potentially risky operations in an isolated environment.

- "use gemini sandbox to create and run a Python script that processes data"
- "ask gemini to safely test @script.py and explain what it does"
- "use gemini sandbox to install numpy and create a data visualization"
- "test this code safely: Create a script that makes HTTP requests to an API"
## Tools (for the AI)

These tools are designed to be used by the AI assistant (an example tool call is sketched after this list).

- **ask-gemini**: Asks Google Gemini for its perspective. Can be used for general questions or complex analysis of files.
  - `prompt` (required): The analysis request. Use the `@` syntax to include file or directory references (e.g., `@src/main.js explain this code`) or ask general questions (e.g., `Please use a web search to find the latest news stories`).
  - `model` (optional): The Gemini model to use. Defaults to `gemini-2.5-flash`.
  - `sandbox` (optional): Set to `true` to run in sandbox mode for safe code execution.
- **sandbox-test**: Safely executes code or commands in Gemini's sandbox environment. Always runs in sandbox mode.
  - `prompt` (required): Code testing request (e.g., `Create and run a Python script that...` or `@script.py Run this safely`).
  - `model` (optional): The Gemini model to use.
- **Ping**: A simple test tool that echoes back a message.
- **Help**: Shows the Gemini CLI help text.
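For illustration, here is roughly what an MCP `tools/call` request for `ask-gemini` could look like from a client's point of view. The argument names match the list above; the surrounding JSON-RPC envelope is the standard MCP request shape rather than anything specific to this server:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask-gemini",
    "arguments": {
      "prompt": "@src/main.js explain this code",
      "model": "gemini-2.5-flash",
      "sandbox": false
    }
  }
}
```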
## Slash Commands (for the User)

You can use these commands directly in Claude Code's interface (other clients have not been tested).

- **/analyze**: Analyzes files or directories using Gemini, or asks general questions.
  - `prompt` (required): The analysis prompt. Use `@` syntax to include files (e.g., `/analyze prompt:@src/ summarize this directory`) or ask general questions (e.g., `/analyze prompt:Please use a web search to find the latest news stories`).
- **/sandbox**: Safely tests code or scripts in Gemini's sandbox environment.
  - `prompt` (required): Code testing request (e.g., `/sandbox prompt:Create and run a Python script that processes CSV data` or `/sandbox prompt:@script.py Test this script safely`).
- **/help**: Displays the Gemini CLI help information.
- **/ping**: Tests the connection to the server.
  - `message` (optional): A message to echo back.
## Contributing

Contributions are welcome! Please feel free to fork the repository, make your changes, and open a pull request.

## License

This project is licensed under the MIT License. See the LICENSE file for details.