Greeter MCP Server
A simple demonstration MCP server that provides a greeting functionality when integrated with Gemini CLI.
A quick and simple repo to demonstrate the very basics of MCP and the Gemini CLI. Nothing more.
Quick Setup
Quick setup of a new project (using uv):
- Install uv (Linux and macOS)
curl -LsSf https://astral.sh/uv/install.sh | sh
- Clone the project and initialize the virtualenv
git clone https://github.com/jrmlhermitte/gemini-mcp-example.git
cd gemini-mcp-example
uv sync
source .venv/bin/activate
Write MCP Server And Test
- The file we'll run, gemini-mcp-example/main.py, is already defined. Take a look at it. The main components are:
# ...
mcp = FastMCP("greeter")
# ...
@mcp.tool()
def greet(name: str) -> str:
return f'Hello {name}!'
# ...
if __name__ == "__main__":
# NOTE: stdio is the default.
mcp.run(transport='stdio')
- Run the file
(Don't forget to activate your virtualenv first: source .venv/bin/activate)
python gemini-mcp-example/main.py
- Initialize communication
We're going to initialize the 2024-11-05 protocol version over stdin/stdout (the stdio transport we set up our FastMCP server to use).
Paste this exactly into your shell:
{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{"roots":{"listChanged":true},"tools":{"listChanged":true},"sampling":{},"elicitation":{}},"clientInfo":{"name":"ExampleClient","title":"ExampleClientDisplayName","version":"1.0.0"}}}
You should see:
{"jsonrpc":"2.0","id":1,"result":{"protocolVersion":"2024-11-05","capabilities":{"experimental":{},"prompts":{"listChanged":false},"resources":{"subscribe":false,"listChanged":false},"tools":{"listChanged":false}},"serverInfo":{"name":"greeter","version":"1.10.1"}}}
NOTE: The JSON commands here and below must be pasted as-is, each on a single line with no newlines inside a message. If the formatting is incorrect, the server will silently ignore your requests.
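Because each message must occupy exactly one line, it's convenient to build requests with json.dumps, which (with default arguments) never emits newlines. A minimal sketch, using the same tools/call request as below:

```python
import json

# json.dumps with default arguments produces a single line of output,
# which is exactly the one-message-per-line framing stdio MCP expects.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "greet", "arguments": {"name": "Teal'c"}},
}
line = json.dumps(request)
print("\n" in line)  # -> False
```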
Once you see that response, paste this notification to complete the handshake:
{"jsonrpc":"2.0","method":"notifications/initialized"}
Now type this to list available tools:
{"jsonrpc":"2.0","method":"tools/list","id":1}
You should see something like this (you may see additional logging):
{"jsonrpc":"2.0","id":1,"result":{"tools":[{"name":"greet","description":"","inputSchema":{"properties":{"name":{"title":"Name","type":"string"}},"required":["name"],"title":"greetArguments","type":"object"},"outputSchema":{"properties":{"result":{"title":"Result","type":"string"}},"required":["result"],"title":"greetOutput","type":"object"}}]}}
Congratulations! You have successfully started a Stdio connection with an MCP server! Now test calling your tool:
{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"greet","arguments":{"name":"Teal'c"}}}
You should then see:
{"jsonrpc":"2.0","id":1,"result":{"content":[{"type":"text","text":"Hello Teal'c!"}],"structuredContent":{"result":"Hello Teal'c!"},"isError":false}}
This is exactly how you're going to set up an MCP server with Gemini:
Gemini CLI will run your server as a child process, sending requests to its stdin and reading responses from its stdout using the stdio transport.
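The child-process mechanics can be sketched in a few lines of Python. To keep the snippet self-contained it drives a toy stand-in server (which answers every request with a canned result) rather than the real greeter; to talk to the real server, replace FAKE_SERVER with ["python", "gemini-mcp-example/main.py"]. Names like rpc and FAKE_SERVER are illustrative, not part of MCP:

```python
import json
import subprocess
import sys

# A toy stand-in for an MCP server: it answers every request that has an
# "id" with a canned result and ignores notifications -- just enough to
# show the line-delimited JSON-RPC mechanics over stdin/stdout.
FAKE_SERVER = [
    sys.executable, "-c",
    "import sys, json\n"
    "for line in sys.stdin:\n"
    "    req = json.loads(line)\n"
    "    if 'id' in req:\n"
    "        resp = {'jsonrpc': '2.0', 'id': req['id'], 'result': {'ok': True}}\n"
    "        print(json.dumps(resp), flush=True)\n",
]

def rpc(proc, message):
    """Send one JSON-RPC message; return the reply if it was a request."""
    proc.stdin.write(json.dumps(message) + "\n")   # one message per line
    proc.stdin.flush()
    if "id" in message:                            # notifications get no reply
        return json.loads(proc.stdout.readline())
    return None

proc = subprocess.Popen(FAKE_SERVER, stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE, text=True)

# The same three-step dance as above: initialize, initialized, tools/list.
init = rpc(proc, {"jsonrpc": "2.0", "id": 1, "method": "initialize",
                  "params": {"protocolVersion": "2024-11-05",
                             "capabilities": {},
                             "clientInfo": {"name": "ExampleClient",
                                            "version": "1.0.0"}}})
rpc(proc, {"jsonrpc": "2.0", "method": "notifications/initialized"})
tools = rpc(proc, {"jsonrpc": "2.0", "id": 2, "method": "tools/list"})
print(init["id"], tools["id"])

proc.stdin.close()
proc.terminate()
```

This is essentially what Gemini CLI does on your behalf: spawn the server, perform the handshake, then relay tool calls.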
Gemini CLI
Integrating with Gemini CLI.
- Install Gemini CLI
npm install -g @google/gemini-cli
- Add the Gemini extension from here (docs):
(NOTE: This should be run from the root of this GitHub repo)
mkdir -p ~/.gemini/extensions
ln -s $PWD/gemini-mcp-example ~/.gemini/extensions
- Start gemini and list MCP servers
gemini
Then type:
/mcp
You should see the greeter MCP server listed, along with its greet tool.
NOTE: You must start gemini from the repo root, because the extension runs python ./gemini-mcp-example/main.py. If you want to make this runnable from anywhere, you'll need to make sure your base Python environment contains the fastmcp library and that gemini-extension.json refers to an absolute path.
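For reference, a Gemini CLI extension manifest (gemini-extension.json) that launches a stdio server looks roughly like the sketch below. The field values here are assumptions based on this repo's layout; check the manifest actually shipped in the repo:

```json
{
  "name": "gemini-mcp-example",
  "version": "1.0.0",
  "mcpServers": {
    "greeter": {
      "command": "python",
      "args": ["./gemini-mcp-example/main.py"]
    }
  }
}
```

The relative path in "args" is why gemini must be started from the repo root; swapping in an absolute path removes that constraint.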
NOTE: If this is your first time setting up Gemini CLI, you will also see some easy to follow setup steps.
- Give it your name. It will likely try to call your tools.
Input something like:
My name is Teal'c
Gemini should figure out that it might want to call the greeting tool, given you've introduced yourself. You should get a request to call the tool; approve it, and it should call the tool and greet you.
Troubleshooting
Running into problems? Try running the MCP server yourself to see if it's able to start up:
source .venv/bin/activate
python gemini-mcp-example/main.py
(Also, don't forget to run source .venv/bin/activate before starting gemini; we're running this in a local virtual environment.)
Where to go from here?
This demonstrates how easy it is to set up an MCP server and integrate it with Gemini. You should now have enough of a basic understanding to integrate your own tools!