<h1 align="center"> <br> <a href="https://www.firebolt.io"><img src="https://cdn.prod.website-files.com/5e8a264ceaf4870394477fc7/5e8a264ceaf4879f75477fdd_logo_website.svg" alt="Firebolt" width="300"></a> <br> MCP Server <br> </h1>
<h4 align="center"> A Model Context Protocol implementation that connects your LLM to Firebolt Data Warehouse </h4>
<p align="center"> <a href="#key-features">Key Features</a> | <a href="#how-to-use">How To Use</a> | <a href="#connecting-your-llm">Connecting Your LLM</a> | <a href="#architecture">Architecture</a> | <a href="#development">Development</a> </p>
## Key Features

### LLM Integration with Firebolt

- Connect your AI assistants directly to your data warehouse
- Enable AI agents to autonomously query data and generate insights
- Provide LLMs with deep knowledge of Firebolt SQL, features, and documentation

### SQL Query Execution

- Support for multiple query types and execution modes
- Direct access to Firebolt databases

### Documentation Access

- Grant LLMs access to comprehensive Firebolt docs, SQL reference, function lists, and more

### Account Management

- Seamless authentication with Firebolt service accounts
- Connect to different engines and workspaces

### Multi-platform Support

- Runs anywhere Go binaries are supported
- Official Docker image available for easy deployment
## How To Use

Before you start, ensure you have a Firebolt service account with a client ID and client secret.

### Installing the MCP Server

You can run the Firebolt MCP Server either via Docker or by downloading the binary.

#### Option 1: Run with Docker

```sh
docker run \
  --rm \
  -e FIREBOLT_MCP_CLIENT_ID=your-client-id \
  -e FIREBOLT_MCP_CLIENT_SECRET=your-client-secret \
  ghcr.io/firebolt-db/mcp-server:0.2.0
```

#### Option 2: Run the Binary

```sh
# Download the binary for your OS from:
# https://github.com/firebolt-db/mcp-server/releases/tag/v0.2.0

./firebolt-mcp-server \
  --client-id your-client-id \
  --client-secret your-client-secret
```
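As the Claude Desktop configuration below shows, the binary also reads `FIREBOLT_MCP_CLIENT_ID` and `FIREBOLT_MCP_CLIENT_SECRET` from the environment, which keeps credentials off the command line. A minimal sketch of that variant:

```sh
# Same credentials as above, passed via environment variables instead of flags
export FIREBOLT_MCP_CLIENT_ID=your-client-id
export FIREBOLT_MCP_CLIENT_SECRET=your-client-secret

./firebolt-mcp-server
```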
## Connecting Your LLM
Once the MCP Server is installed, you can connect various LLM clients.
Below are integration examples for Claude Desktop. For other clients like VSCode Copilot Chat and Cursor, please refer to their official documentation.
### Claude Desktop

To integrate with Claude Desktop using Docker:

1. Open the Claude menu and select Settings….

2. Navigate to Developer > Edit Config.

3. Update the configuration file (`claude_desktop_config.json`) to include:

   ```json
   {
     "mcpServers": {
       "firebolt": {
         "command": "docker",
         "args": [
           "run", "-i", "--rm",
           "-e", "FIREBOLT_MCP_CLIENT_ID=your-client-id",
           "-e", "FIREBOLT_MCP_CLIENT_SECRET=your-client-secret",
           "ghcr.io/firebolt-db/mcp-server:0.2.0"
         ]
       }
     }
   }
   ```

   To use the binary instead of Docker:

   ```json
   {
     "mcpServers": {
       "firebolt": {
         "command": "/path/to/firebolt-mcp-server",
         "env": {
           "FIREBOLT_MCP_CLIENT_ID": "your-client-id",
           "FIREBOLT_MCP_CLIENT_SECRET": "your-client-secret"
         }
       }
     }
   }
   ```

4. Save the config and restart Claude Desktop.

More details: Claude MCP Quickstart Guide
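Before restarting Claude Desktop, it can help to confirm that the image referenced in the config is reachable from your machine (a quick sanity check, assuming your Docker daemon can pull from ghcr.io):

```sh
# Pre-pull the image referenced in claude_desktop_config.json
docker pull ghcr.io/firebolt-db/mcp-server:0.2.0
```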
### GitHub Copilot Chat (VSCode)
To integrate MCP with Copilot Chat in VSCode, refer to the official documentation:
👉 Extending Copilot Chat with the Model Context Protocol
### Cursor Editor
To set up MCP in Cursor, follow their guide:
👉 Cursor Documentation on Model Context Protocol
## Architecture

Firebolt MCP Server implements the Model Context Protocol, providing:

- **Tools** - Task-specific capabilities provided to the LLM (see the raw stdio sketch after this list):
  - `firebolt_docs`: Access Firebolt documentation
  - `firebolt_connect`: Establish connections to Firebolt engines and databases
  - `firebolt_query`: Execute SQL queries against Firebolt
- **Resources** - Data that can be referenced by the LLM:
  - Documentation articles
  - Lists of accounts, databases, and engines
- **Prompts** - Predefined instructions for the LLM:
  - Firebolt Expert: Prompts the model to act as a Firebolt specialist
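Because the server speaks standard MCP JSON-RPC over stdio, you can inspect the advertised tools without any LLM attached. The sketch below is illustrative rather than official: it assumes Docker, valid credentials, and a protocol version string the server accepts, then performs the `initialize` handshake and requests the tool list:

```sh
# Hand-rolled MCP handshake over stdio (normally your MCP client does this for you).
# The protocol version and client name below are illustrative assumptions.
{
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'
  echo '{"jsonrpc":"2.0","method":"notifications/initialized"}'
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/list"}'
} | docker run -i --rm \
    -e FIREBOLT_MCP_CLIENT_ID=your-client-id \
    -e FIREBOLT_MCP_CLIENT_SECRET=your-client-secret \
    ghcr.io/firebolt-db/mcp-server:0.2.0
```

If the handshake succeeds, the `tools/list` response should mention `firebolt_docs`, `firebolt_connect`, and `firebolt_query` among the advertised tools.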
## Development

To set up the development environment:

```sh
# Clone this repository
git clone https://github.com/firebolt-db/mcp-server.git

# Go into the repository
cd mcp-server

# Install Task (if you don't have it already)
go install github.com/go-task/task/v3/cmd/task@latest

# Update Go dependencies
task mod

# Build the application
task build

# Run the tests
task test
```