
🌊🔍 Lenses MCP Server 🔎🌊

This is the MCP (Model Context Protocol) server for Lenses, a self-service DataOps tool for engineers building real-time applications with different flavours of Apache Kafka across multiple clusters. Explore, transform and join data in topics from different clusters using SQL, without the need for an additional database.

Try this with the free Lenses Community Edition (limited in the number of users and lacking some enterprise features, e.g. OAuth). Requires Lenses v6+.

Table of Contents

1. Install uv and Python
2. Configure Environment Variables
3. Add Lenses API Key
4. Install Dependencies and Run the Server
5. Optional Context7 MCP Server
6. Running with Docker

1. Install uv and Python

We use uv for dependency management and project setup. If you don't have uv installed, follow the official installation guide.
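For reference, the guide's standalone installer for macOS and Linux is a one-line shell script (shown here as a convenience; check the uv documentation for the current command and for Windows or package-manager options).

curl -LsSf https://astral.sh/uv/install.sh | sh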

This project was built with Python 3.12. To make sure Python is correctly installed, run the following command to check the version.

uv run python --version
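If the reported version is not 3.12, uv can install and pin a matching interpreter for this project. This is a sketch, assuming a reasonably recent uv release that ships the uv python subcommands:

uv python install 3.12
uv python pin 3.12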

2. Configure Environment Variables

Copy the example environment file.

cp .env.example .env

Open .env and fill in the required values such as your Lenses instance details and Lenses API key.

3. Add Lenses API Key

Create a Lenses API key via an IAM Service Account. Add the API key to .env under the variable name LENSES_API_KEY.
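With both values in place, a minimal .env might look like the sketch below. The variable names should match .env.example; LENSES_URL is shown on the assumption that the example file uses it (older setups may use the legacy LENSES_API_HTTP_URL/LENSES_API_WEBSOCKET_URL variables described in the Docker section), and the placeholder must be replaced with your real key.

LENSES_URL=http://localhost:9991
LENSES_API_KEY=<YOUR_LENSES_API_KEY>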

4. Install Dependencies and Run the Server

Use uv to create a virtual environment and install the project dependencies in it, then run the MCP server (a FastMCP application) with the default stdio transport.

uv sync
uv run src/lenses_mcp/server.py

To run as a remote server, use the http transport.

uv run fastmcp run src/lenses_mcp/server.py --transport=http --port=8000
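A client that supports remote MCP servers can then be pointed at the HTTP endpoint, which is served at /mcp. The exact configuration schema varies by client; a minimal sketch using a url field (accepted by several clients) looks like this:

{
  "mcpServers": {
    "Lenses.io": {
      "url": "http://localhost:8000/mcp"
    }
  }
}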

To run in Claude Desktop, Gemini CLI, Cursor, etc., use the following JSON configuration.

{
  "mcpServers": {
    "Lenses.io": {
      "command": "uv",
      "args": [
        "run",
        "--project", "<ABSOLUTE_PATH_TO_THIS_REPO>",
        "--with", "fastmcp",
        "fastmcp",
        "run",
        "<ABSOLUTE_PATH_TO_THIS_REPO>/src/lenses_mcp/server.py"
      ],
      "env": {
        "LENSES_API_KEY": "<YOUR_LENSES_API_KEY>"
      },
      "transport": "stdio"
    }
  }
}

Note: Some clients may require the absolute path to uv in the command.
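On macOS and Linux you can find that path with:

which uv

Then substitute the printed path for the command value, e.g. "command": "/Users/you/.local/bin/uv". The path shown is only illustrative; the actual location depends on how uv was installed.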

5. Optional Context7 MCP Server

Lenses documentation is available on Context7. Using the Context7 MCP Server is optional but highly recommended: add use context7 to your prompts to ensure the documentation available to the LLM is up to date.

6. Running with Docker

The Lenses MCP server is available as a Docker image at lensesio/mcp. You can run it with different transport modes depending on your use case.

Quick Start

Run the server with stdio transport (default):

docker run \
   -e LENSES_API_KEY=<YOUR_API_KEY> \
   -e LENSES_URL=http://localhost:9991 \
   lensesio/mcp
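To launch the stdio container directly from an MCP client, the client configuration can invoke docker itself. The sketch below is an assumption rather than an official configuration: -i keeps stdin open for the stdio transport and --rm removes the container on exit.

{
  "mcpServers": {
    "Lenses.io": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "LENSES_API_KEY=<YOUR_API_KEY>",
        "-e", "LENSES_URL=http://localhost:9991",
        "lensesio/mcp"
      ]
    }
  }
}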

Run the server with HTTP transport (listens on http://0.0.0.0:8000/mcp):

docker run -p 8000:8000 \
   -e LENSES_API_KEY=<YOUR_API_KEY> \
   -e LENSES_URL=http://localhost:9991 \
   -e TRANSPORT=http \
   lensesio/mcp

Run the server with SSE transport (listens on http://0.0.0.0:8000/sse):

docker run -p 8000:8000 \
   -e LENSES_API_KEY=<YOUR_API_KEY> \
   -e LENSES_URL=http://localhost:9991 \
   -e TRANSPORT=sse \
   lensesio/mcp
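If you have already populated .env as in step 2, you can pass it to the container with Docker's --env-file flag instead of repeating -e options; set TRANSPORT=http (and optionally PORT) in the file for this to serve HTTP. A sketch:

docker run -p 8000:8000 \
   --env-file .env \
   lensesio/mcp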

Environment Variables

  • LENSES_API_KEY (required): Your Lenses API key (create via IAM Service Account)
  • LENSES_URL (optional, default http://localhost:9991): Lenses instance URL in the format [scheme]://[host]:[port]. Use https:// for secure connections (WebSocket connections then automatically use wss://)
  • TRANSPORT (optional, default stdio): Transport mode: stdio, http, or sse
  • PORT (optional, default 8000): Port to listen on (only used with the http or sse transport)

Legacy environment variables (for backward compatibility):

  • LENSES_API_HTTP_URL, LENSES_API_HTTP_PORT
  • LENSES_API_WEBSOCKET_URL, LENSES_API_WEBSOCKET_PORT

These are automatically derived from LENSES_URL but can be explicitly set to override.

Transport Endpoints

  • stdio: Standard input/output (no network endpoint)
  • http: HTTP endpoint at /mcp
  • sse: Server-Sent Events endpoint at /sse

Building the Docker Image

To build the Docker image locally:

docker build -t lensesio/mcp .
