Gemini Streamable HTTP MCP
A Docker-based deployment that wraps the RLabs-Inc/gemini-mcp stdio MCP server with supergateway to expose its tools over Streamable HTTP, giving any MCP client a standardized, web-accessible endpoint for the Google Gemini API and its image-editing workflows. A Python test suite validates the full image-edit workflow end-to-end.
Architecture
Client (MCP SDK)
│
│ Streamable HTTP (POST/GET http://host:8000/mcp)
▼
┌────────────────────────────────────────────┐
│ Docker Container │
│ │
│ supergateway │
│ --stdio "gemini-mcp" │
│ --outputTransport streamableHttp │
│ --port 8000 │
│ │ │
│ │ stdio (stdin/stdout) │
│ ▼ │
│ @rlabs-inc/gemini-mcp │
│ (talks to Google Gemini API) │
└────────────────────────────────────────────┘
supergateway spawns gemini-mcp as a child process, speaks MCP over stdio with it, and re-exposes every tool over a single Streamable HTTP endpoint at /mcp.
Prerequisites
- Docker
- A Google Gemini API key
- Python 3.10+ (for running the tests)
Quick Start
Build the image
docker build -t gemini-mcp .
Run the container
docker run -e GEMINI_API_KEY=your_key_here -p 8000:8000 gemini-mcp
The MCP server is now available at http://localhost:8000/mcp.
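To verify the server is up without an MCP client, you can POST a JSON-RPC initialize request to the endpoint directly. This is a minimal sketch that assumes the standard Streamable HTTP request format (dual Accept header, 2025-03-26 protocol version) and that supergateway returns the session ID in the Mcp-Session-Id header; the SDK example below handles all of this for you.
# Wire-level probe of the /mcp endpoint (sketch only; the MCP SDK does this for you).
import httpx

resp = httpx.post(
    "http://localhost:8000/mcp",
    headers={
        "Content-Type": "application/json",
        # Streamable HTTP clients must accept both JSON and SSE responses.
        "Accept": "application/json, text/event-stream",
    },
    json={
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "probe", "version": "0.1"},
        },
    },
)
print(resp.status_code, resp.headers.get("mcp-session-id"))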
Connect from any MCP client
Any client that supports Streamable HTTP transport can connect. For example, with the MCP Python SDK:
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main():
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

asyncio.run(main())
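Extending the snippet, a tool call and image download look roughly like this. The tool name comes from the test suite below, but the prompt argument name and the exact content-block shape are assumptions; inspect each tool's input schema from list_tools() if your gemini-mcp version differs.
import asyncio
import base64

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main():
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "prompt" is an assumed argument name -- check the tool's inputSchema.
            result = await session.call_tool(
                "gemini-start-image-edit",
                {"prompt": "a simple red circle on a white background"},
            )
            # Image content blocks carry base64-encoded data.
            for block in result.content:
                if block.type == "image":
                    with open("generated.png", "wb") as f:
                        f.write(base64.b64decode(block.data))

asyncio.run(main())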
File Reference
| File | Purpose |
|---|---|
| `Dockerfile` | Builds the container image: installs supergateway and @rlabs-inc/gemini-mcp globally on node:22-slim, then runs supergateway on port 8000 with Streamable HTTP output. |
| `test_gemini_mcp.py` | Pytest test suite that builds the image, starts a container, connects via the MCP Python SDK, and exercises the image-edit workflow. |
| `requirements.txt` | Python dependencies for the test suite. |
Dockerfile
FROM node:22-slim
RUN npm install -g supergateway @rlabs-inc/gemini-mcp
ENV GEMINI_API_KEY=""
EXPOSE 8000
CMD ["supergateway", "--stdio", "gemini-mcp", \
"--outputTransport", "streamableHttp", "--port", "8000"]
| Layer | What it does |
|---|---|
| `node:22-slim` | Minimal Node.js 22 runtime. |
| `npm install -g` | Pre-installs both packages so startup is fast (no npx download on every boot). |
| `ENV GEMINI_API_KEY` | Declares the variable; the actual key is injected at docker run time via `-e`. |
| `EXPOSE 8000` | Documents the listening port. |
| `CMD` | Launches supergateway, which spawns gemini-mcp over stdio and bridges it to Streamable HTTP on port 8000. |
Test Suite
Dependencies
Install with:
pip install -r requirements.txt
| Package | Role |
|---|---|
| `mcp` | Official MCP Python SDK -- provides `ClientSession` and the Streamable HTTP transport client. |
| `pytest` | Test runner. |
| `pytest-asyncio` | Enables `async def` test functions. |
| `Pillow` | Image validation (format detection, integrity check via `verify()`). |
| `docker` | Python Docker SDK -- builds images, manages container lifecycle. |
Running the tests
GEMINI_API_KEY=your_key pytest test_gemini_mcp.py -v
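To run a single test, for example the cheaper tool-listing check, use pytest's node-ID syntax:
GEMINI_API_KEY=your_key pytest test_gemini_mcp.py::test_list_tools -v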
The test suite is fully self-contained: it builds the Docker image, starts the container, runs the tests, and tears the container down afterward.
Fixtures
| Fixture | Scope | Description |
|---|---|---|
| `gemini_api_key` | session | Reads `GEMINI_API_KEY` from the environment. Skips all tests if unset. |
| `docker_client` | session | Creates a Docker SDK client from the local Docker daemon. |
| `gemini_container` | session | Builds the image, removes stale containers, starts a fresh container on port 18000, polls until the TCP port is accepting connections, yields the container, and force-removes it on teardown. |
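What the gemini_container fixture does can be sketched with the Python Docker SDK roughly as follows (a simplified sketch with pytest plumbing and error handling omitted, not the fixture verbatim; it assumes the Dockerfile sits in the current directory):
import socket
import time

import docker
from docker.errors import NotFound

IMAGE_NAME = "gemini-mcp-test"
CONTAINER_NAME = "gemini-mcp-test-container"
HOST_PORT = 18000
STARTUP_TIMEOUT = 120

client = docker.from_env()

# Build the image from the local Dockerfile.
client.images.build(path=".", tag=IMAGE_NAME)

# Remove any stale container left over from a previous run.
try:
    client.containers.get(CONTAINER_NAME).remove(force=True)
except NotFound:
    pass

# Start a fresh container with the API key and the 8000 -> 18000 port mapping.
container = client.containers.run(
    IMAGE_NAME,
    name=CONTAINER_NAME,
    detach=True,
    environment={"GEMINI_API_KEY": "your_key_here"},
    ports={"8000/tcp": HOST_PORT},
)

# Poll until the TCP port accepts connections or the timeout expires.
deadline = time.time() + STARTUP_TIMEOUT
while time.time() < deadline:
    try:
        with socket.create_connection(("localhost", HOST_PORT), timeout=1):
            break
    except OSError:
        time.sleep(1)

# ... run tests against http://localhost:18000/mcp ...

container.remove(force=True)  # teardown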
Tests
test_list_tools
Connects to the MCP server and verifies that the expected image-edit tools are exposed:
- `gemini-start-image-edit`
- `gemini-continue-image-edit`
- `gemini-end-image-edit`
test_image_edit_workflow
End-to-end image editing round-trip:
- Start session -- calls `gemini-start-image-edit` with a prompt ("a simple red circle on a white background"), asserts an `ImageContent` and `TextContent` block are returned, and extracts the session ID.
- Apply edit -- calls `gemini-continue-image-edit` with the session ID and an edit instruction ("change the red circle to a blue star"), asserts a new image is returned.
- Download -- base64-decodes the image data and writes it to a temp file.
- Validate -- opens the bytes with Pillow, calls `verify()` to check integrity, asserts width/height > 0 and format is PNG/JPEG/WEBP/GIF. Also validates the file written to disk (sketched below).
- Close session -- calls `gemini-end-image-edit` to clean up the server-side session.
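The validation step condenses to the following Pillow checks (a sketch of what the test asserts; the helper name is illustrative):
import base64
import io

from PIL import Image

def validate_image(b64_data: str) -> None:
    raw = base64.b64decode(b64_data)

    # verify() checks integrity but leaves the Image object unusable,
    # so open the bytes twice: once to verify, once to inspect.
    Image.open(io.BytesIO(raw)).verify()

    img = Image.open(io.BytesIO(raw))
    assert img.width > 0 and img.height > 0
    assert img.format in ("PNG", "JPEG", "WEBP", "GIF")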
Configuration constants
| Constant | Default | Purpose |
|---|---|---|
| `IMAGE_NAME` | `gemini-mcp-test` | Docker image tag used during the test run. |
| `CONTAINER_NAME` | `gemini-mcp-test-container` | Container name (allows cleanup of stale runs). |
| `HOST_PORT` | `18000` | Host port mapped to container port 8000. |
| `MCP_URL` | `http://localhost:18000/mcp` | Full MCP endpoint URL. |
| `STARTUP_TIMEOUT` | `120` | Max seconds to wait for the container to accept TCP connections. |
| `MCP_CALL_TIMEOUT` | `180` | Read timeout (seconds) for individual MCP tool calls to the Gemini API. |
Environment Variables
| Variable | Required | Description |
|---|---|---|
| `GEMINI_API_KEY` | Yes | Google Gemini API key. Passed to the container at runtime. |
The @rlabs-inc/gemini-mcp package also supports optional variables (GEMINI_OUTPUT_DIR, GEMINI_TOOL_PRESET, GEMINI_PRO_MODEL, etc.) which can be passed through with additional -e flags on docker run.
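For example, assuming GEMINI_OUTPUT_DIR takes a directory path inside the container, generated images could be persisted to the host with an extra flag and a bind mount (the /output path here is illustrative):
docker run -e GEMINI_API_KEY=your_key_here -e GEMINI_OUTPUT_DIR=/output -v "$(pwd)/output:/output" -p 8000:8000 gemini-mcp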
License
See the upstream projects, supergateway and @rlabs-inc/gemini-mcp, for their license terms.