mcp-confluent

An MCP server implementation built to interact with Confluent Kafka and Confluent Cloud REST APIs.

Demo

Goose CLI

Goose CLI Demo

Claude Desktop

Claude Desktop Demo

Table of Contents

  • Demo
  • User Guide
    • Getting Started
    • Configuration
    • Usage
    • Configuring Claude Desktop
    • Configuring Goose CLI
  • Developer Guide
    • Project Structure
    • Building and Running
    • Testing
    • Adding a New Tool
    • Generating Types
    • Contributing

User Guide

Getting Started

  1. Create a .env file: Copy the example .env file structure (shown below) into a new file named .env in the root of your project.

  2. Populate the .env file: Fill in the necessary values for your Confluent Cloud environment. See the Configuration section for details on each variable.

  3. Install Node.js (if not already installed)

    • We recommend using NVM (Node Version Manager) to manage Node.js versions
    • Install and use Node.js:
    nvm install 22
    nvm use 22
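
You can verify the toolchain afterwards (the exact patch version will vary with your installation):

node --version   # should report a v22.x release
npm --version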
    

Configuration

# .env file
BOOTSTRAP_SERVERS="pkc-v12gj.us-east4.gcp.confluent.cloud:9092"
KAFKA_API_KEY="..."
KAFKA_API_SECRET="..."
KAFKA_REST_ENDPOINT="https://pkc-v12gj.us-east4.gcp.confluent.cloud:443"
KAFKA_CLUSTER_ID=""
KAFKA_ENV_ID="env-..."
FLINK_ENV_ID="env-..."
FLINK_ORG_ID=""
FLINK_REST_ENDPOINT="https://flink.us-east4.gcp.confluent.cloud"
FLINK_ENV_NAME=""
FLINK_DATABASE_NAME=""
FLINK_API_KEY=""
FLINK_API_SECRET=""
FLINK_COMPUTE_POOL_ID="lfcp-..."
CONFLUENT_CLOUD_API_KEY=""
CONFLUENT_CLOUD_API_SECRET=""
CONFLUENT_CLOUD_REST_ENDPOINT="https://api.confluent.cloud"
SCHEMA_REGISTRY_API_KEY="..."
SCHEMA_REGISTRY_API_SECRET="..."
SCHEMA_REGISTRY_ENDPOINT="https://psrc-zv01y.northamerica-northeast2.gcp.confluent.cloud"
| Variable | Description | Default Value | Required |
|----------|-------------|---------------|----------|
| BOOTSTRAP_SERVERS | List of Kafka broker addresses in the format host1:port1,host2:port2 used to establish initial connection to the Kafka cluster (string) | | Yes |
| CONFIG_PATH | File system path to store and retrieve conversation-based configurations for session persistence (Future Implementation) (string) | | Yes |
| CONFLUENT_CLOUD_API_KEY | Master API key for Confluent Cloud platform administration, enabling management of resources across your organization (string (min: 1)) | | Yes |
| CONFLUENT_CLOUD_API_SECRET | Master API secret paired with CONFLUENT_CLOUD_API_KEY for comprehensive Confluent Cloud platform administration (string (min: 1)) | | Yes |
| FLINK_API_KEY | Authentication key for accessing Confluent Cloud's Flink services, including compute pools and SQL statement management (string (min: 1)) | | Yes |
| FLINK_API_SECRET | Secret token paired with FLINK_API_KEY for authenticated access to Confluent Cloud's Flink services (string (min: 1)) | | Yes |
| KAFKA_API_KEY | Authentication credential (username) required to establish secure connection with the Kafka cluster (string (min: 1)) | | Yes |
| KAFKA_API_SECRET | Authentication credential (password) paired with KAFKA_API_KEY for secure Kafka cluster access (string (min: 1)) | | Yes |
| SCHEMA_REGISTRY_API_KEY | Authentication key for accessing Schema Registry services to manage and validate data schemas (string (min: 1)) | | Yes |
| SCHEMA_REGISTRY_API_SECRET | Authentication secret paired with SCHEMA_REGISTRY_API_KEY for secure Schema Registry access (string (min: 1)) | | Yes |
| CONFLUENT_CLOUD_REST_ENDPOINT | Base URL for Confluent Cloud's REST API services | https://api.confluent.cloud | No |
| FLINK_COMPUTE_POOL_ID | Unique identifier for the Flink compute pool, must start with 'lfcp-' prefix (string) | | No |
| FLINK_DATABASE_NAME | Name of the associated Kafka cluster used as a database reference in Flink SQL operations (string (min: 1)) | | No |
| FLINK_ENV_ID | Unique identifier for the Flink environment, must start with 'env-' prefix (string) | | No |
| FLINK_ENV_NAME | Human-readable name for the Flink environment used for identification and display purposes (string (min: 1)) | | No |
| FLINK_ORG_ID | Organization identifier within Confluent Cloud for Flink resource management (string (min: 1)) | | No |
| FLINK_REST_ENDPOINT | Base URL for Confluent Cloud's Flink REST API endpoints used for SQL statement and compute pool management (string) | | No |
| KAFKA_CLUSTER_ID | Unique identifier for the Kafka cluster within Confluent Cloud ecosystem (string (min: 1)) | | No |
| KAFKA_ENV_ID | Environment identifier for Kafka cluster, must start with 'env-' prefix (string) | | No |
| KAFKA_REST_ENDPOINT | REST API endpoint for Kafka cluster management and administration (string) | | No |
| SCHEMA_REGISTRY_ENDPOINT | URL endpoint for accessing Schema Registry services to manage data schemas (string) | | No |
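
If you want to confirm these variables will be visible to the server process, one quick check is a throwaway script run with Node's --env-file flag (the same flag the client configurations below use). The file name check-env.mjs and the subset of variables checked here are illustrative, not part of this project:

// check-env.mjs (hypothetical helper, not part of this repository)
// Run with: node --env-file .env check-env.mjs
const requiredVars = [
  "BOOTSTRAP_SERVERS",
  "KAFKA_API_KEY",
  "KAFKA_API_SECRET",
  "CONFLUENT_CLOUD_API_KEY",
  "CONFLUENT_CLOUD_API_SECRET",
  "SCHEMA_REGISTRY_API_KEY",
  "SCHEMA_REGISTRY_API_SECRET",
];

const missing = requiredVars.filter((name) => !process.env[name]);
if (missing.length > 0) {
  console.error(`Missing required variables: ${missing.join(", ")}`);
  process.exit(1);
}
console.log("All checked variables are set.");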

Usage

This MCP server is designed to be used with various MCP clients, such as Claude Desktop or Goose CLI/Desktop. The specific configuration and interaction will depend on the client you are using. However, the general steps are:

  1. Build: Follow the instructions in the Developer Guide to build and run the server from source. This typically involves:

    • Installing dependencies (npm install)
    • Building the project (npm run build or npm run dev)
  2. Configure your MCP Client: Each client has its own way of specifying which MCP server to use and any required credentials. Configure your client (e.g., Claude Desktop, Goose) to run this server; in the examples below, the client launches the server locally with node and your .env file.

  3. Start the MCP Client: Once your client is configured, start it. On startup it will stand up a local instance of this MCP server, which is responsible for managing data schemas and interacting with Confluent Cloud on your behalf.

  4. Interact with Confluent through the Client: Once the client is connected, you can use the client's interface to interact with Confluent Cloud resources. The client will send requests to this MCP server, which will then interact with Confluent Cloud on your behalf.

Configuring Claude Desktop

See the Claude Desktop documentation for more details about installing Claude Desktop and working with MCP servers.

To configure Claude Desktop to use this MCP server:

  1. Open Claude Desktop Configuration

    • On Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
    • On Windows: %APPDATA%\Claude\claude_desktop_config.json
  2. Edit Configuration File

    • Open the config file in your preferred text editor
    • Add or modify the configuration to include the MCP server:
    {
      "mcpServers": {
        "confluent": {
          "command": "node",
          "args": [
            "--env-file",
            "/path/to/confluent-mcp-server/.env",
            "/path/to/confluent-mcp-server/dist/index.js"
          ]
        }
      }
    }
    

    Replace /path/to/confluent-mcp-server/ with the actual path where you've installed this MCP server.
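
    On Windows, the equivalent entry uses Windows-style paths; note that backslashes must be escaped inside JSON strings (the drive and folder names below are placeholders):

    {
      "mcpServers": {
        "confluent": {
          "command": "node",
          "args": [
            "--env-file",
            "C:\\path\\to\\confluent-mcp-server\\.env",
            "C:\\path\\to\\confluent-mcp-server\\dist\\index.js"
          ]
        }
      }
    }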

  3. Restart Claude Desktop

    • Close and reopen Claude Desktop for the changes to take effect
    • The MCP server will automatically start when Claude Desktop launches

Now Claude Desktop will be configured to use your local MCP server for Confluent interactions.

Claude Tools

Configuring Goose CLI

See the Goose documentation for detailed instructions on how to install the Goose CLI. Once installed, you can add this server as an extension.

Run the configuration command goose configure and select the following when prompted:

  • Add extension
  • Command-line Extension
  • mcp-confluent (the extension name)
  • node --env-file /path/to/confluent-mcp-server/.env /path/to/confluent-mcp-server/dist/index.js (the command to run)
  • No (when asked whether to add environment variables; the .env file passed via --env-file already provides them)

Replace /path/to/confluent-mcp-server/ with the actual path where you've installed this MCP server.

Goose Configure

Developer Guide

Project Structure

/
├── src/                 # Source code
│   ├── confluent/       # Code related to Confluent integration (API clients, etc.)
│   ├── tools/           # Tool implementations (each tool in a separate file)
│   ├── index.ts         # Main entry point for the server
│   └── ...              # Other server logic, utilities, etc.
├── dist/                # Compiled output (TypeScript -> JavaScript)
├── openapi.json         # OpenAPI specification for Confluent Cloud
├── .env                 # Environment variables (example - should be copied and filled)
├── README.md            # This file
└── package.json         # Node.js project metadata and dependencies

Building and Running

  1. Install Dependencies:

    npm install
    
  2. Development Mode (watch for changes):

    npm run dev
    

    This command compiles the TypeScript code to JavaScript and automatically rebuilds when changes are detected in the src/ directory.

  3. Production Build (one-time compilation):

    npm run build
    
  4. Start the Server:

    npm run start
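
If you want to launch the built server directly from a shell (outside of an MCP client), the same invocation used in the client configurations works; this assumes you run it from the project root where your .env file lives:

node --env-file .env dist/index.js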
    

Testing

MCP Inspector

For testing, you can use MCP Inspector, an interactive developer tool for testing and debugging MCP servers.

# make sure you've already built the project, either in dev mode or by running npm run build
npx @modelcontextprotocol/inspector node --env-file $PATH_TO_PROJECT/.env $PATH_TO_PROJECT/dist/index.js

Adding a New Tool

  1. Add a new value to the ToolName enum.
  2. Add your new tool to the handlers map in the ToolFactory class.
  3. Create a new file exporting a class that extends BaseToolHandler.
    1. Implement the handle method of the base class.
    2. Implement the getToolConfig method of the base class.
  4. Once satisfied, add it to the set of enabledTools in index.ts.
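
The authoritative signatures are in the source, but as a rough sketch of steps 1 through 3 (the import paths, argument types, and return shape below are assumptions for illustration, not this project's actual API):

// src/tools/hello-tool-handler.ts (illustrative sketch only; consult
// BaseToolHandler in this repository for the real method signatures)
import { BaseToolHandler } from "./base-tool-handler.js"; // assumed path
import { ToolName } from "./tool-name.js";                // assumed path

export class HelloToolHandler extends BaseToolHandler {
  // Describes the tool to MCP clients: name, description, and input schema.
  getToolConfig() {
    return {
      name: ToolName.HELLO, // assumes a HELLO value was added to the ToolName enum
      description: "Says hello to a named Confluent resource.",
      inputSchema: {
        type: "object",
        properties: { name: { type: "string" } },
        required: ["name"],
      },
    };
  }

  // Called when an MCP client invokes the tool; returns MCP-style text content.
  async handle(args: { name: string }) {
    return { content: [{ type: "text", text: `Hello, ${args.name}!` }] };
  }
}

After registering the handler in ToolFactory and enabling it in index.ts (steps 2 and 4), rebuild with npm run build so the compiled tool is picked up from dist/.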

Generating Types

# as of v7.5.2 there is a bug when using allOf w/ required https://github.com/openapi-ts/openapi-typescript/issues/1474. need --empty-objects-unknown flag to avoid it
npx openapi-typescript ./openapi.json -o ./src/confluent/openapi-schema.d.ts --empty-objects-unknown
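
The generated file exposes paths (and components) types derived from openapi.json. As a hedged example of how openapi-typescript output is typically consumed; the specific path, method, and status code below are illustrative and may not appear in the Confluent spec:

// Adjust the import specifier to match the project's tsconfig module settings.
import type { paths } from "./src/confluent/openapi-schema";

// Illustrative only: index into the generated paths type to get the JSON
// response type of a GET operation. Replace the path literal with one that
// actually appears in openapi.json.
type ListTopicsResponse =
  paths["/kafka/v3/clusters/{cluster_id}/topics"]["get"]["responses"]["200"]["content"]["application/json"];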

Contributing

<!-- Will fill in later. -->
