Claude-Gemini Collaborative Integration
This project enables seamless collaboration between Claude Code and Gemini CLI, allowing them to work together on tasks, discuss ideas, and refine solutions in real-time. This integration leverages the Model Context Protocol (MCP) to facilitate communication and tool usage between the two AI agents.
Features
- Start Collaborative Sessions: Initiate a new discussion with Gemini on a specific topic.
- Consult Gemini: Ask questions or provide context to Gemini within an ongoing collaboration.
- Retrieve Conversation History: Access the full transcript of a collaborative session.
- Context Preservation: Gemini remembers previous interactions within a session, allowing for iterative discussions.
- Rate Limiting Handling: The collaborative server automatically manages rate limits for Gemini API calls.
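The rate-limiting logic lives inside the collaborative server and is not reproduced in this README; purely as an illustration of the idea, a sliding-window limiter like the following could sit in front of each Gemini call (class and parameter names here are hypothetical, not taken from this repository):

import time

class RateLimiter:
    """Allow at most max_calls calls per period seconds (illustrative sketch)."""
    def __init__(self, max_calls: int = 10, period: float = 60.0):
        self.max_calls = max_calls
        self.period = period
        self.calls: list[float] = []  # timestamps of recent calls

    def wait(self) -> None:
        now = time.monotonic()
        # Keep only timestamps that are still inside the window.
        self.calls = [t for t in self.calls if now - t < self.period]
        if len(self.calls) >= self.max_calls:
            # Sleep until the oldest call leaves the window.
            time.sleep(self.period - (now - self.calls[0]))
        self.calls.append(time.monotonic())

# limiter.wait() would be called before each Gemini invocation.
limiter = RateLimiter(max_calls=10, period=60.0)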
Prerequisites
Before you begin, ensure you have the following installed:
- Python 3.8+: The project is built with Python.
- pip: Python package installer.
- Claude Desktop: The application that hosts Claude Code.
- Gemini CLI: Ensure the Gemini CLI is installed and configured with your API key. You can verify its installation by running gemini --version in your terminal.
Setup Guide
Follow these steps to set up the Claude-Gemini collaborative integration:
1. Clone the Repository
First, clone this repository to your local machine:
git clone <repository_url>
cd claude-gemini-integration
(Replace <repository_url> with the actual URL of your repository)
2. Create and Activate a Python Virtual Environment
It's recommended to use a virtual environment to manage dependencies:
python3 -m venv venv
source venv/bin/activate
3. Install Dependencies
Install the required Python packages using pip:
pip install -r requirements.txt
4. Start the Collaborative Server
The collaborative server acts as an intermediary between Claude Code and Gemini. It needs to be running in the background.
# Navigate to the project directory if you're not already there
cd /Users/jamiearonson/Documents/claude-gemini-integration
# Activate your virtual environment
source venv/bin/activate
# Start the server in the background
python tools/mcp/collaborative-server.py &
You can verify the server is running by checking its health endpoint:
curl http://localhost:8080/health
You should see a response like {"status": "ok"}.
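If you would rather wait for the server from a script than poll by hand, a small standard-library check like the following (assuming the default port 8080) works:

import json
import time
import urllib.request

def wait_for_server(url="http://localhost:8080/health", timeout=15.0):
    """Poll the health endpoint until it responds or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                return json.load(resp)  # e.g. {"status": "ok"}
        except OSError:
            time.sleep(0.5)
    raise RuntimeError(f"Collaborative server not reachable at {url}")

print(wait_for_server())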
5. Configure Claude Desktop for MCP
You need to tell Claude Desktop about the new MCP server. This involves adding a configuration snippet to your claude_desktop_config.json file.
Locate your claude_desktop_config.json file:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
- Linux: ~/.config/Claude/claude_desktop_config.json
Add the following JSON snippet to the mcpServers section of your claude_desktop_config.json file. If the mcpServers section doesn't exist, you can create it.
{
  "mcpServers": {
    "gemini-collaboration": {
      "command": "python3",
      "args": ["/Users/jamiearonson/Documents/claude-gemini-integration/mcp_server.py"],
      "env": {
        "MCP_PORT": "8080"
      }
    }
  }
}
Important: Ensure the args path points to the absolute path of mcp_server.py on your system.
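After editing, a quick way to confirm the file is still valid JSON and that the new entry is wired up is a short check like this (the path shown is the macOS location; adjust it for your OS):

import json
import pathlib

# Adjust for your OS; see the locations listed above.
config_path = pathlib.Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"

config = json.loads(config_path.read_text())  # raises ValueError if the JSON is malformed
server = config["mcpServers"]["gemini-collaboration"]
print("MCP server command:", server["command"], *server["args"])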
6. Restart Claude Desktop
For the configuration changes to take effect, you must close and reopen the Claude Desktop application.
7. Verify MCP Tools are Available
After restarting Claude Desktop, in a new Claude Code session, you should see these tools available:
- start_gemini_collaboration: Start a new collaborative conversation with Gemini.
- consult_gemini: Ask Gemini a question in the current collaboration context.
- get_collaboration_history: Get the full conversation history with Gemini.
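The repository's mcp_server.py is the authoritative implementation; purely as an illustration of how tools like these can be exposed, here is a minimal sketch using the MCP Python SDK's FastMCP and forwarding requests to the local collaborative server (the /start and /consult endpoint paths are assumptions, not the project's documented API):

import json
import urllib.request

from mcp.server.fastmcp import FastMCP

BASE_URL = "http://localhost:8080"  # the collaborative server from Step 4
mcp = FastMCP("gemini-collaboration")

def _post(path: str, payload: dict) -> str:
    # Forward a JSON payload to the collaborative server and return its raw response.
    req = urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

@mcp.tool()
def start_gemini_collaboration(topic: str) -> str:
    """Start a new collaborative conversation with Gemini."""
    return _post("/start", {"topic": topic})  # endpoint path is an assumption

@mcp.tool()
def consult_gemini(question: str) -> str:
    """Ask Gemini a question in the current collaboration context."""
    return _post("/consult", {"question": question})  # endpoint path is an assumption

if __name__ == "__main__":
    mcp.run()  # stdio transport, as launched by Claude Desktop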
Usage
Once set up, you can interact with Gemini directly through Claude Code using natural language.
Starting a Collaboration
To begin a new collaborative session with Gemini, simply tell Claude Code:
"Start a collaboration with Gemini about designing a REST API."
Claude Code will automatically use the start_gemini_collaboration tool.
Having a Discussion
Once a collaboration is active, you can ask Gemini questions or provide further context:
"Ask Gemini what they think about using microservices vs monolith for this project."
Claude Code will use the consult_gemini tool to get Gemini's input, maintaining the conversation context.
Getting History
To review the entire conversation history of the current collaboration:
"Show me the collaboration history with Gemini."
This will use the get_collaboration_history tool.
Example Workflow
Here's a typical interaction flow:
- You: "I need to design a database schema for an e-commerce site. Start a collaboration with Gemini."
- Claude: (Uses the start_gemini_collaboration tool and confirms the collaboration has started)
- You: "Ask Gemini about the best approach for handling product variants (e.g., size, color)."
- Claude: (Uses the consult_gemini tool and displays Gemini's response)
- You: "What does Gemini think about our payment table design, specifically regarding PCI compliance?"
- Claude: (Continues the collaboration, building on previous context, and provides Gemini's insights)
- You: "Show me the full collaboration history."
- Claude: (Displays the entire conversation transcript)
Troubleshooting
Tools Not Available in Claude Code
- Ensure Claude Desktop was fully restarted after modifying claude_desktop_config.json.
- Verify that the claude_desktop_config.json file exists at the correct path for your operating system and that the JSON is valid.
- Double-check that the command and args paths in claude_desktop_config.json are absolute and correct.
Collaborative Server Connection Issues
- Check if the server is running: curl http://localhost:8080/health. If it's not running, restart it as described in Step 4.
- Check for port conflicts: Ensure no other application is using port 8080 (see the check below).
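To test for a port conflict (stop the collaborative server first, since it legitimately holds the port while running), a quick standard-library check is:

import socket

# If the bind fails with "Address already in use", another process holds port 8080.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    try:
        s.bind(("127.0.0.1", 8080))
        print("Port 8080 is free")
    except OSError as exc:
        print("Port 8080 is in use:", exc)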
Gemini CLI Issues
- Verify Gemini CLI installation: Run gemini --version in your terminal.
- Check API key configuration: Ensure your Gemini API key is correctly set up and accessible to the Gemini CLI.
Testing
You can run the provided test script to verify the MCP client and server functionality:
# Navigate to the project directory
cd /Users/jamiearonson/Documents/claude-gemini-integration
# Activate your virtual environment
source venv/bin/activate
# Run the tests
python test_mcp.py
Benefits of Collaboration
This integration provides significant benefits for complex software engineering tasks:
- Real-time Discussion: Engage in actual back-and-forth conversations with Gemini, not just one-off queries.
- Contextual Understanding: Gemini maintains context throughout the conversation, leading to more relevant and insightful responses.
- Iterative Problem Solving: Work through problems step-by-step, refining ideas and solutions collaboratively.
- Enhanced Problem Solving: Leverage the strengths of both Claude and Gemini to tackle challenging problems more effectively.
- Seamless Workflow: Integrate collaborative AI assistance directly into your Claude Code environment.