Keboola Explorer MCP Server

This server facilitates interaction with Keboola's Storage API, enabling users to browse and manage project buckets, tables, and components efficiently through Claude Desktop.

Tools

query_table

Executes an SQL SELECT query to retrieve data from the underlying Snowflake database.

  • When constructing the SQL SELECT query, use fully qualified table names that include the database name, schema name, and table name.
  • The fully qualified table name can be found in the table information; use the get_table_metadata tool to retrieve it.
  • Snowflake is case-sensitive, so always wrap column names in double quotes.
  • SQL queries must include the fully qualified table name, including the database name, e.g.: SELECT * FROM "db_name"."db_schema_name"."table_name";
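
For illustration, a query with quoted, case-sensitive column names might look like the sketch below; the table and column names are placeholders, and the actual fully qualified name should come from get_table_metadata:

-- Placeholder names for illustration only
SELECT "id", "name"
FROM "db_name"."db_schema_name"."table_name"
WHERE "name" IS NOT NULL
LIMIT 100;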

list_bucket_tables

List all tables in a specific bucket with their basic information.

get_table_metadata

Get detailed information about a specific table including its DB identifier and column information.

get_bucket_metadata

Get detailed information about a specific bucket.

list_bucket_info

List information about all buckets in the project.

list_components

List all available components and their configurations.

list_component_configs

List all configurations for a specific component.

README

Keboola MCP Server

A Model Context Protocol (MCP) server for interacting with Keboola Connection. This server provides tools for listing and accessing data from the Keboola Storage API.

Requirements

  • Python 3.10 or newer
  • Keboola Storage API token
  • Snowflake or BigQuery Read Only Workspace

Installation

Installing via Pip

First, create a virtual environment and then install the keboola_mcp_server package:

python3 -m venv --upgrade-deps .venv
source .venv/bin/activate

pip3 install keboola_mcp_server
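
To confirm the package installed into the active virtual environment, you can optionally run pip's standard show command:

pip3 show keboola_mcp_server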

Installing via Smithery

To install Keboola MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install keboola-mcp-server --client claude

Claude Desktop Setup

To use this server with Claude Desktop, follow these steps:

  1. Create or edit the Claude Desktop configuration file:

    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • Windows: %APPDATA%\Claude\claude_desktop_config.json
  2. Add the following configuration (adjust paths according to your setup):

{
  "mcpServers": {
    "keboola": {
      "command": "/path/to/keboola-mcp-server/.venv/bin/python",
      "args": [
        "-m",
        "keboola_mcp_server",
        "--api-url",
        "https://connection.YOUR_REGION.keboola.com"
      ],
      "env": {
        "KBC_STORAGE_TOKEN": "your-keboola-storage-token",
        "KBC_WORKSPACE_SCHEMA": "your-workspace-schema"
      }
    }
  }
}

Replace:

  • /path/to/keboola-mcp-server with your actual path to the cloned repository
  • YOUR_REGION with your Keboola region (e.g., north-europe.azure). If your project URL is simply https://connection.keboola.com, omit the region segment entirely
  • your-keboola-storage-token with your Keboola Storage API token
  • your-workspace-schema with your Snowflake schema or BigQuery dataset of your workspace

Note: If you are using a specific version of Python (e.g. 3.11, due to package compatibility issues), update the command to point at that specific interpreter, e.g. /path/to/keboola-mcp-server/.venv/bin/python3.11

Note: The workspace can be created in your Keboola project (the same project where you obtained your Storage API token). The workspace details provide all the necessary connection parameters, including the schema or dataset name.

  3. After updating the configuration:
    • Completely quit Claude Desktop (don't just close the window)
    • Restart Claude Desktop
    • Look for the hammer icon in the bottom right corner, indicating the server is connected

Troubleshooting

If you encounter connection issues:

  1. Check the logs in Claude Desktop for any error messages
  2. Verify your Keboola Storage API token is correct
  3. Ensure all paths in the configuration are absolute paths
  4. Confirm the virtual environment is properly activated and all dependencies are installed

Cursor AI Setup

To use this server with Cursor AI, you have two options for configuring the transport method: Server-Sent Events (SSE) or Standard I/O (stdio).

  1. Create or edit the Cursor AI configuration file:

    • Location: ~/.cursor/mcp.json
  2. Add one of the following configurations based on your preferred transport method:

Option 1: Using Server-Sent Events (SSE)

{
  "mcpServers": {
    "keboola": {
      "url": "http://localhost:8000/sse?storage_token=YOUR-KEBOOLA-STORAGE-TOKEN&workspace_schema=YOUR-WORKSPACE-SCHEMA"
    }
  }
}

Option 2a: Using Standard I/O (stdio)

{
  "mcpServers": {
    "keboola": {
      "command": "/path/to/keboola-mcp-server/.venv/bin/python",
      "args": [
        "-m",
        "keboola_mcp_server",
        "--transport",
        "stdio",
         "--api-url",
         "https://connection.YOUR_REGION.keboola.com"
      ],
      "env": {
        "KBC_STORAGE_TOKEN": "your-keboola-storage-token", 
        "KBC_WORKSPACE_SCHEMA": "your-workspace-schema"         
      }
    }
  }
}

Option 2b: Using WSL Standard I/O (wsl stdio)

When running the MCP server from the Windows Subsystem for Linux with Cursor AI, use the following configuration:

{
  "mcpServers": {
    "keboola": {
      "command": "wsl.exe",
      "args": [
        "bash",
        "-c",
        "'source /wsl_path/to/keboola-mcp-server/.env",
        "&&",
        "/wsl_path/to/keboola-mcp-server/.venv/bin/python -m keboola_mcp_server.cli --transport stdio'"
      ]
    }
  }
}

  • where the /wsl_path/to/keboola-mcp-server/.env file contains the environment variables:
export KBC_STORAGE_TOKEN="your-keboola-storage-token"
export KBC_WORKSPACE_SCHEMA="your-workspace-schema"

Replace:

  • /path/to/keboola-mcp-server with your actual path to the cloned repository
  • YOUR_REGION with your Keboola region (e.g., north-europe.azure). If your project URL is simply https://connection.keboola.com, omit the region segment entirely
  • your-keboola-storage-token with your Keboola Storage API token
  • your-workspace-schema with your Snowflake schema or BigQuery dataset of your workspace

After updating the configuration:

  1. Restart Cursor AI
  2. If you use the SSE transport, make sure to start your MCP server first. You can do so by running the following in the activated virtual environment where you installed the server:
    /path/to/keboola-mcp-server/.venv/bin/python -m keboola_mcp_server --transport sse --api-url https://connection.YOUR_REGION.keboola.com
    
  3. Cursor AI should automatically detect your MCP server and enable it.

BigQuery support

If your Keboola project uses the BigQuery backend, you will need to set the GOOGLE_APPLICATION_CREDENTIALS environment variable in addition to KBC_STORAGE_TOKEN and KBC_WORKSPACE_SCHEMA.

  1. Go to your Keboola BigQuery workspace and display its credentials (click the Connect button).
  2. Download the credentials file to your local disk. It is a plain JSON file.
  3. Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the full path of the downloaded JSON credentials file.

This will give your MCP server instance permissions to access your BigQuery workspace in Google Cloud.
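
For example, with the Claude Desktop configuration shown earlier, the variable can be added to the env block; the credentials path below is a placeholder for wherever you saved the downloaded file:

      "env": {
        "KBC_STORAGE_TOKEN": "your-keboola-storage-token",
        "KBC_WORKSPACE_SCHEMA": "your-workspace-schema",
        "GOOGLE_APPLICATION_CREDENTIALS": "/full/path/to/credentials.json"
      }

For the WSL setup, the equivalent would be an extra line in the .env file, e.g. export GOOGLE_APPLICATION_CREDENTIALS="/full/path/to/credentials.json".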

Available Tools

The server provides the following tools for interacting with Keboola Connection:

  • List buckets and tables
  • Get bucket and table information
  • Preview table data
  • Export table data to CSV
  • List components and configurations

Development

Run tests:

pytest

Format code:

black .
isort .

Type checking:

mypy .

License

MIT License - see LICENSE file for details.
