
Ask Gemini MCP

<div align="center">


MCP server that connects any AI client to Google Gemini CLI

</div>

An MCP server for AI-to-AI collaboration via the Gemini CLI. Available on npm: ask-gemini-mcp. Works with Claude Code, Claude Desktop, Cursor, Warp, Copilot, and 40+ other MCP clients. Leverage Gemini's massive 1M+ token context window for large file and codebase analysis while your primary AI handles interaction and code editing.

Why?

  • Get a second opinion — Ask Gemini to review your coding approach before committing to it
  • Debate plans — Send architecture proposals to Gemini for critique and alternative suggestions
  • Review changes — Have Gemini analyze diffs or modified files to catch issues your primary AI might miss
  • Massive context — Gemini reads entire codebases (1M+ tokens) that would overflow other models

Quick Start

Claude Code

# Project scope (available in current project only)
claude mcp add gemini-cli -- npx -y ask-gemini-mcp

# User scope (available across all projects)
claude mcp add --scope user gemini-cli -- npx -y ask-gemini-mcp

Claude Desktop

Add to your config file (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}

<details> <summary>Other config file locations</summary>

  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/claude/claude_desktop_config.json

</details>

Cursor

Add to .cursor/mcp.json in your project (or ~/.cursor/mcp.json for global):

{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}

Codex CLI

Add to ~/.codex/config.toml (or .codex/config.toml in your project):

[mcp_servers.gemini-cli]
command = "npx"
args = ["-y", "ask-gemini-mcp"]

Or via CLI:

codex mcp add gemini-cli -- npx -y ask-gemini-mcp

OpenCode

Add to opencode.json in your project (or ~/.config/opencode/opencode.json for global):

{
  "mcp": {
    "gemini-cli": {
      "type": "local",
      "command": ["npx", "-y", "ask-gemini-mcp"]
    }
  }
}

Any MCP Client (STDIO Transport)

{
  "transport": {
    "type": "stdio",
    "command": "npx",
    "args": ["-y", "ask-gemini-mcp"]
  }
}
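
Once launched over stdio, the server speaks standard MCP JSON-RPC. As a rough sketch of the handshake (the `protocolVersion` and `clientInfo` values here are illustrative, not specific to this server), the first message a client sends looks like:

```json
{
  "jsonrpc": "2.0",
  "id": 0,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-06-18",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```

The server replies with its own capabilities, after which the client can issue `tools/list` and `tools/call` requests.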

Prerequisites

  • Node.js (so `npx` can fetch and run the package)
  • The Gemini CLI installed and authenticated

Tools

| Tool | Purpose |
| --- | --- |
| ask-gemini | Send prompts to Gemini CLI. Supports @ file syntax, model selection, sandbox mode, and changeMode for structured edits |
| fetch-chunk | Retrieve subsequent chunks from cached large responses |
| ping | Connection test — verify MCP setup without using Gemini tokens |
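
To sanity-check the wiring without spending Gemini tokens, a client can invoke the ping tool through a standard MCP `tools/call` request. A minimal sketch (the empty `arguments` object is an assumption; the tool's exact schema is not documented here):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ping",
    "arguments": {}
  }
}
```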

Usage Examples

File analysis (@ syntax):

  • ask gemini to analyze @src/main.js and explain what it does
  • use gemini to summarize @. (the current directory)

Code review:

  • ask gemini to review the changes in @src/auth.ts for security issues
  • use gemini to compare @old.js and @new.js

General questions:

  • ask gemini about best practices for React state management

Sandbox mode:

  • use gemini sandbox to create and run a Python script

Models

| Model | Use Case |
| --- | --- |
| gemini-3.1-pro-preview | Default — best quality reasoning |
| gemini-3-flash-preview | Faster responses, large codebases |

The server automatically falls back to Flash when Pro quota is exceeded.
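
Because ask-gemini supports model selection, a client can pin a specific model per call. The argument names below (`prompt`, `model`) are assumptions inferred from the tool description, not a confirmed schema:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "ask-gemini",
    "arguments": {
      "prompt": "Review @src/auth.ts for security issues",
      "model": "gemini-3-flash-preview"
    }
  }
}
```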

Contributing

Contributions are welcome! See open issues for things to work on.

License

MIT License. See LICENSE for details.

Disclaimer: This is an unofficial, third-party tool and is not affiliated with, endorsed, or sponsored by Google.
