Supabase MCP Server


This server enables interaction with Supabase PostgreSQL databases through the MCP protocol, allowing seamless integration with Cursor and Windsurf IDEs for secure and validated database management.

alexander-zuev

Database Interaction
Data & App Analysis

Tools

get_schemas

List all database schemas with their sizes and table counts.

get_table_schema

Get detailed table structure including columns, keys, and relationships.

Returns comprehensive information about a specific table's structure:
  • Column definitions (names, types, constraints)
  • Primary key information
  • Foreign key relationships
  • Indexes
  • Constraints
  • Triggers

Parameters:
  • schema_name: Name of the schema (e.g., 'public', 'auth')
  • table: Name of the table to inspect

SAFETY: This is a low-risk read operation that can be executed in SAFE mode.

execute_postgresql

Execute PostgreSQL statements against your Supabase database.

IMPORTANT: All SQL statements must end with a semicolon (;).

OPERATION TYPES AND REQUIREMENTS:
  1. READ operations (SELECT, EXPLAIN, etc.):
    • Can be executed directly without special requirements
    • Example: SELECT * FROM public.users LIMIT 10;
  2. WRITE operations (INSERT, UPDATE, DELETE):
    • Require UNSAFE mode (use live_dangerously('database', True) first)
    • Example: INSERT INTO public.users (email) VALUES ('user@example.com');
  3. SCHEMA operations (CREATE, ALTER, DROP):
    • Require UNSAFE mode (use live_dangerously('database', True) first)
    • Destructive operations (DROP, TRUNCATE) require additional confirmation
    • Example: CREATE TABLE public.test_table (id SERIAL PRIMARY KEY, name TEXT);

MIGRATION HANDLING:
All queries that modify the database are automatically version controlled by the server. You can provide an optional migration name if you want to name the migration.
  • Respect the format verb_noun_detail. Be descriptive and concise.
  • Examples: create_users_table, add_email_to_profiles, enable_rls_on_users
  • If you don't provide a migration name, the server generates one based on the SQL statement
  • The system sanitizes your provided name to ensure compatibility with database systems
  • Migration names are prefixed with a timestamp in the format YYYYMMDDHHMMSS

SAFETY SYSTEM:
Operations are categorized by risk level:
  • LOW RISK: Read operations (SELECT, EXPLAIN) - allowed in SAFE mode
  • MEDIUM RISK: Write operations (INSERT, UPDATE, DELETE) - require UNSAFE mode
  • HIGH RISK: Schema operations (CREATE, ALTER) - require UNSAFE mode
  • EXTREME RISK: Destructive operations (DROP, TRUNCATE) - require UNSAFE mode and confirmation

TRANSACTION HANDLING:
  • DO NOT use transaction control statements (BEGIN, COMMIT, ROLLBACK)
  • The database client automatically wraps queries in transactions
  • The SQL validator will reject queries containing transaction control statements
  • This ensures atomicity and provides rollback capability for data modifications

MULTIPLE STATEMENTS:
  • You can send multiple SQL statements in a single query
  • Each statement is executed in order within the same transaction
  • Example: CREATE TABLE public.test_table (id SERIAL PRIMARY KEY, name TEXT); INSERT INTO public.test_table (name) VALUES ('test');

CONFIRMATION FLOW FOR HIGH-RISK OPERATIONS:
  • High-risk operations (DROP TABLE, TRUNCATE, etc.) are rejected with a confirmation ID
  • The error message explains what happened and provides a confirmation ID
  • Review the risks with the user before proceeding
  • Use the confirm_destructive_operation tool with the provided ID to execute the operation

IMPORTANT GUIDELINES:
  • The database client starts in SAFE mode by default for safety
  • Only enable UNSAFE mode when you need to modify data or schema
  • Never mix READ and WRITE operations in the same transaction
  • For destructive operations, be prepared to confirm with the confirm_destructive_operation tool

WHEN TO USE OTHER TOOLS INSTEAD:
  • For Auth operations (users, authentication, etc.): use call_auth_admin_method instead of direct SQL. The Auth Admin SDK provides safer, validated methods for user management.
  • For project configuration, functions, storage, etc.: use send_management_api_request. The Management API handles Supabase platform features that aren't directly in the database.

Note: This tool operates on the PostgreSQL database only. API operations use separate safety controls.
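
For illustration, here is a minimal sketch of the safe/unsafe workflow described above, driven from a script through the MCP Python SDK's stdio client. The tool names and the live_dangerously arguments come from this page; the "query" argument key and the SDK import paths are assumptions rather than part of this server's documentation.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the installed server over stdio; the SUPABASE_* variables would be set alongside QUERY_API_KEY.
    server = StdioServerParameters(command="supabase-mcp-server", env={"QUERY_API_KEY": "your-api-key"})
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Reads work in the default SAFE mode.
            await session.call_tool("execute_postgresql", {"query": "SELECT * FROM public.users LIMIT 10;"})

            # Writes require UNSAFE mode first...
            await session.call_tool("live_dangerously", {"service": "database", "enable_unsafe_mode": True})
            await session.call_tool("execute_postgresql", {"query": "INSERT INTO public.users (email) VALUES ('user@example.com');"})

            # ...and you should return to SAFE mode afterwards.
            await session.call_tool("live_dangerously", {"service": "database", "enable_unsafe_mode": False})

asyncio.run(main())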

get_tables

List all tables, foreign tables, and views in a schema with their sizes, row counts, and metadata.

Provides detailed information about all database objects in the specified schema:
  • Table/view names
  • Object types (table, view, foreign table)
  • Row counts
  • Size on disk
  • Column counts
  • Index information
  • Last vacuum/analyze times

Parameters:
  • schema_name: Name of the schema to inspect (e.g., 'public', 'auth', etc.)

SAFETY: This is a low-risk read operation that can be executed in SAFE mode.
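
A read-only inspection pass can chain the schema tools above. The sketch below assumes a session object like the one opened in the earlier execute_postgresql example and uses only the parameter names documented here (schema_name, table); the public.users table is just an illustration.

async def inspect_public_schema(session) -> None:
    # All three calls are low-risk reads and work in the default SAFE mode.
    schemas = await session.call_tool("get_schemas", {})
    tables = await session.call_tool("get_tables", {"schema_name": "public"})
    users_table = await session.call_tool("get_table_schema", {"schema_name": "public", "table": "users"})
    for result in (schemas, tables, users_table):
        print(result)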

retrieve_migrations

Retrieve a list of all migrations a user has from Supabase.

Returns a list of migrations with the following information:
  • Version (timestamp)
  • Name
  • SQL statements (if requested)
  • Statement count
  • Version type (named or numbered)

Parameters:
  • limit: Maximum number of migrations to return (default: 50, max: 100)
  • offset: Number of migrations to skip for pagination (default: 0)
  • name_pattern: Optional pattern to filter migrations by name. Uses SQL ILIKE pattern matching (case-insensitive). The pattern is automatically wrapped with '%' wildcards, so "users" will match "create_users_table", "add_email_to_users", etc. To search for an exact match, use the complete name.
  • include_full_queries: Whether to include the full SQL statements in the result (default: false)

SAFETY: This is a low-risk read operation that can be executed in SAFE mode.
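
As a sketch (assuming the same session object as in the earlier examples), a paginated lookup of migrations touching "users" might look like this; every argument name is documented above.

async def list_user_migrations(session):
    return await session.call_tool(
        "retrieve_migrations",
        {
            "limit": 20,                  # up to 100 per page
            "offset": 0,                  # increase to paginate
            "name_pattern": "users",      # ILIKE match, automatically wrapped with % wildcards
            "include_full_queries": False,
        },
    )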

send_management_api_request

Execute a Supabase Management API request. This tool allows you to make direct calls to the Supabase Management API, which provides programmatic access to manage your Supabase project settings, resources, and configurations.

REQUEST FORMATTING:
  • Use paths exactly as defined in the API specification
  • The {ref} parameter will be automatically injected from settings
  • Format request bodies according to the API specification

PARAMETERS:
  • method: HTTP method (GET, POST, PUT, PATCH, DELETE)
  • path: API path (e.g. /v1/projects/{ref}/functions)
  • path_params: Path parameters as dict (e.g. {"function_slug": "my-function"}) - use empty dict {} if not needed
  • request_params: Query parameters as dict (e.g. {"key": "value"}) - use empty dict {} if not needed
  • request_body: Request body as dict (e.g. {"name": "test"}) - use empty dict {} if not needed

PATH PARAMETERS HANDLING:
  • The {ref} placeholder (project reference) is automatically injected - you don't need to provide it
  • All other path placeholders must be provided in the path_params dictionary
  • Common placeholders include:
    • {function_slug}: For Edge Functions operations
    • {id}: For operations on specific resources (API keys, auth providers, etc.)
    • {slug}: For organization operations
    • {branch_id}: For database branch operations
    • {provider_id}: For SSO provider operations
    • {tpa_id}: For third-party auth operations

EXAMPLES:
  1. GET request with path and query parameters:
    method: "GET"
    path: "/v1/projects/{ref}/functions/{function_slug}"
    path_params: {"function_slug": "my-function"}
    request_params: {"version": "1"}
    request_body: {}
  2. POST request with body:
    method: "POST"
    path: "/v1/projects/{ref}/functions"
    path_params: {}
    request_params: {}
    request_body: {"name": "test-function", "slug": "test-function"}

SAFETY SYSTEM:
API operations are categorized by risk level:
  • LOW RISK: Read operations (GET) - allowed in SAFE mode
  • MEDIUM/HIGH RISK: Write operations (POST, PUT, PATCH, DELETE) - require UNSAFE mode
  • EXTREME RISK: Destructive operations - require UNSAFE mode and confirmation
  • BLOCKED: Some operations are completely blocked for safety reasons

SAFETY CONSIDERATIONS:
  • By default, the API client starts in SAFE mode, allowing only read operations
  • To perform write operations, first use live_dangerously(service="api", enable=True)
  • High-risk operations will be rejected with a confirmation ID
  • Use confirm_destructive_operation with the provided ID after reviewing risks
  • Some operations may be completely blocked for safety reasons

For a complete list of available API endpoints and their parameters, use the get_management_api_spec tool. For details on safety rules, use the get_management_api_safety_rules tool.

get_management_api_spec

Get the complete Supabase Management API specification.

Returns the full OpenAPI specification for the Supabase Management API, including:
  • All available endpoints and operations
  • Required and optional parameters for each operation
  • Request and response schemas
  • Authentication requirements
  • Safety information for each operation

This tool can be used in four different ways:
  1. Without parameters: Returns all domains (default)
  2. With path and method: Returns the full specification for a specific API endpoint
  3. With domain only: Returns all paths and methods within that domain
  4. With all_paths=True: Returns all paths and methods

Parameters:
  • params: Dictionary containing optional parameters:
    • path: Optional API path (e.g., "/v1/projects/{ref}/functions")
    • method: Optional HTTP method (e.g., "GET", "POST")
    • domain: Optional domain/tag name (e.g., "Auth", "Storage")
    • all_paths: Optional boolean, if True returns all paths and methods

Available domains:
  • Analytics: Analytics-related endpoints
  • Auth: Authentication and authorization endpoints
  • Database: Database management endpoints
  • Domains: Custom domain configuration endpoints
  • Edge Functions: Serverless function management endpoints
  • Environments: Environment configuration endpoints
  • OAuth: OAuth integration endpoints
  • Organizations: Organization management endpoints
  • Projects: Project management endpoints
  • Rest: RESTful API endpoints
  • Secrets: Secret management endpoints
  • Storage: Storage management endpoints

This specification is useful for understanding:
  • What operations are available through the Management API
  • How to properly format requests for each endpoint
  • Which operations require unsafe mode
  • What data structures to expect in responses

SAFETY: This is a low-risk read operation that can be executed in SAFE mode.
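
Three of the four documented query modes can be exercised as in the sketch below, assuming the same session object as in the earlier examples; the params keys are the ones documented above.

async def explore_api_spec(session):
    # 1. No parameters: list all domains.
    domains = await session.call_tool("get_management_api_spec", {"params": {}})
    # 2. Path + method: full specification for one endpoint.
    endpoint = await session.call_tool(
        "get_management_api_spec",
        {"params": {"path": "/v1/projects/{ref}/functions", "method": "POST"}},
    )
    # 3. Domain only: all paths and methods within that domain.
    auth = await session.call_tool("get_management_api_spec", {"params": {"domain": "Auth"}})
    return domains, endpoint, auth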

get_auth_admin_methods_spec

Get Python SDK methods specification for Auth Admin.

Returns a comprehensive dictionary of all Auth Admin methods available in the Supabase Python SDK, including:
  • Method names and descriptions
  • Required and optional parameters for each method
  • Parameter types and constraints
  • Return value information

This tool is useful for exploring the capabilities of the Auth Admin SDK and understanding how to properly format parameters for the call_auth_admin_method tool.

No parameters required.

call_auth_admin_method

Call an Auth Admin method from the Supabase Python SDK.

This tool provides a safe, validated interface to the Supabase Auth Admin SDK, allowing you to:
  • Manage users (create, update, delete)
  • List and search users
  • Generate authentication links
  • Manage multi-factor authentication
  • And more

IMPORTANT NOTES:
  • Request bodies must adhere to the Python SDK specification
  • Some methods may have nested parameter structures
  • The tool validates all parameters against Pydantic models
  • Extra fields not defined in the models will be rejected

AVAILABLE METHODS:
  • get_user_by_id: Retrieve a user by their ID
  • list_users: List all users with pagination
  • create_user: Create a new user
  • delete_user: Delete a user by their ID
  • invite_user_by_email: Send an invite link to a user's email
  • generate_link: Generate an email link for various authentication purposes
  • update_user_by_id: Update user attributes by ID
  • delete_factor: Delete a factor on a user

EXAMPLES:
  1. Get user by ID:
    method: "get_user_by_id"
    params: {"uid": "user-uuid-here"}
  2. Create user:
    method: "create_user"
    params: {"email": "user@example.com", "password": "secure-password"}
  3. Update user by ID:
    method: "update_user_by_id"
    params: {"uid": "user-uuid-here", "attributes": {"email": "new@email.com"}}

For complete documentation of all methods and their parameters, use the get_auth_admin_methods_spec tool.

live_dangerously

Toggle unsafe mode for either Management API or Database operations.

WHAT THIS TOOL DOES:
This tool switches between safe (default) and unsafe operation modes for either the Management API or Database operations.

SAFETY MODES EXPLAINED:
  1. Database safety modes:
    • SAFE mode (default): Only low-risk operations like SELECT queries are allowed
    • UNSAFE mode: Higher-risk operations including INSERT, UPDATE, DELETE, and schema changes are permitted
  2. API safety modes:
    • SAFE mode (default): Only low-risk operations that don't modify state are allowed
    • UNSAFE mode: Higher-risk state-changing operations are permitted (except those explicitly blocked for safety)

OPERATION RISK LEVELS:
The system categorizes operations by risk level:
  • LOW: Safe read operations with minimal impact
  • MEDIUM: Write operations that modify data but don't change structure
  • HIGH: Operations that modify database structure or important system settings
  • EXTREME: Destructive operations that could cause data loss or service disruption

WHEN TO USE THIS TOOL:
  • Use this tool BEFORE attempting write operations or schema changes
  • Enable unsafe mode only when you need to perform data modifications
  • Always return to safe mode after completing write operations

USAGE GUIDELINES:
  • Start in safe mode by default for exploration and analysis
  • Switch to unsafe mode only when you need to make changes
  • Be specific about which service you're enabling unsafe mode for
  • Consider the risks before enabling unsafe mode, especially for database operations
  • For database operations requiring schema changes, you'll need to enable unsafe mode first

Parameters:
  • service: Which service to toggle ("api" or "database")
  • enable_unsafe_mode: True to enable unsafe mode, False for safe mode (default: False)

Examples:
  1. Enable database unsafe mode: live_dangerously(service="database", enable_unsafe_mode=True)
  2. Return to safe mode after operations: live_dangerously(service="database", enable_unsafe_mode=False)
  3. Enable API unsafe mode: live_dangerously(service="api", enable_unsafe_mode=True)

Note: This tool affects ALL subsequent operations for the specified service until changed again.

confirm_destructive_operation

Execute a destructive database or API operation after confirmation. Use this only after reviewing the risks with the user.

HOW IT WORKS:
  • This tool executes a previously rejected high-risk operation using its confirmation ID
  • The operation will be exactly the same as the one that generated the ID
  • No need to retype the query or API request params - the system remembers it

STEPS:
  1. Explain the risks to the user and get their approval
  2. Use this tool with the confirmation ID from the error message
  3. The original query will be executed as-is

PARAMETERS:
  • operation_type: Type of operation ("api" or "database")
  • confirmation_id: The ID provided in the error message (required)
  • user_confirmation: Set to true to confirm execution (default: false)

NOTE: Confirmation IDs expire after 5 minutes for security
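
End to end, the confirmation flow described above looks roughly like the sketch below (same assumed session object as in the earlier examples; how the confirmation ID is extracted from the rejection message is left illustrative).

async def drop_table_with_confirmation(session, confirmation_id: str):
    # 1. Even in UNSAFE mode, a destructive statement is rejected and the
    #    error message carries a confirmation ID (valid for 5 minutes).
    await session.call_tool("execute_postgresql", {"query": "DROP TABLE public.test_table;"})
    # 2. After reviewing the risks with the user, replay the exact operation by ID.
    return await session.call_tool(
        "confirm_destructive_operation",
        {"operation_type": "database", "confirmation_id": confirmation_id, "user_confirmation": True},
    )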

README

Query | MCP server for Supabase

<p align="center"> <a href="https://thequery.dev"><img src="https://github.com/user-attachments/assets/7e9c49b5-e784-4e70-b39e-7410c22da066" alt="Control Supabase with natural language" width="800" /></a> </p>

<p align="center"> <strong>Query MCP is an open-source MCP server that lets your IDE safely run SQL, manage schema changes, call the Supabase Management API, and use Auth Admin SDK — all with built-in safety controls.</strong> </p>

<p align="center"> ⚡ Free & open-source forever.
💎 Premium features coming soon. 🧪 Early Access is live at <a href="https://thequery.dev">thequery.dev</a>. 📢 Share your feedback on GitHub issues or at feedback@thequery.dev. </p>

<p align="center"> <a href="https://pypi.org/project/supabase-mcp-server/"><img src="https://img.shields.io/pypi/v/supabase-mcp-server.svg" alt="PyPI version" /></a> <a href="https://github.com/alexander-zuev/supabase-mcp-server/actions"><img src="https://github.com/alexander-zuev/supabase-mcp-server/workflows/CI/badge.svg" alt="CI Status" /></a> <a href="https://codecov.io/gh/alexander-zuev/supabase-mcp-server"><img src="https://codecov.io/gh/alexander-zuev/supabase-mcp-server/branch/main/graph/badge.svg" alt="Code Coverage" /></a> <a href="https://www.python.org/downloads/"><img src="https://img.shields.io/badge/python-3.12%2B-blue.svg" alt="Python 3.12+" /></a> <a href="https://github.com/astral-sh/uv"><img src="https://img.shields.io/badge/uv-package%20manager-blueviolet" alt="uv package manager" /></a> <a href="https://pepy.tech/project/supabase-mcp-server"><img src="https://static.pepy.tech/badge/supabase-mcp-server" alt="PyPI Downloads" /></a> <a href="https://smithery.ai/server/@alexander-zuev/supabase-mcp-server"><img src="https://smithery.ai/badge/@alexander-zuev/supabase-mcp-server" alt="Smithery.ai Downloads" /></a> <a href="https://modelcontextprotocol.io/introduction"><img src="https://img.shields.io/badge/MCP-Server-orange" alt="MCP Server" /></a> <a href="LICENSE"><img src="https://img.shields.io/badge/license-Apache%202.0-blue.svg" alt="License" /></a> </p>

Table of contents

<p align="center"> <a href="#getting-started">Getting started</a> • <a href="#feature-overview">Feature overview</a> • <a href="#troubleshooting">Troubleshooting</a> • <a href="#changelog">Changelog</a> </p>

✨ Key features

  • 💻 Compatible with Cursor, Windsurf, Cline and other MCP clients supporting the stdio protocol
  • 🔐 Control read-only and read-write modes of SQL query execution
  • 🔍 Runtime SQL query validation with risk level assessment
  • 🛡️ Three-tier safety system for SQL operations: safe, write, and destructive
  • 🔄 Robust transaction handling for both direct and pooled database connections
  • 📝 Automatic versioning of database schema changes
  • 💻 Manage your Supabase projects with Supabase Management API
  • 🧑‍💻 Manage users with Supabase Auth Admin methods via Python SDK
  • 🔨 Pre-built tools to help Cursor & Windsurf work with MCP more effectively
  • 📦 Dead-simple install & setup via package manager (uv, pipx, etc.)

Getting Started

Prerequisites

Installing the server requires the following on your system:

  • Python 3.12+

If you plan to install via uv, ensure it's installed.

PostgreSQL Installation

PostgreSQL installation is no longer required for the MCP server itself, as it now uses asyncpg which doesn't depend on PostgreSQL development libraries.

However, you'll still need PostgreSQL if you're running a local Supabase instance:

MacOS

brew install postgresql@16

Windows

  • Download and install PostgreSQL 16+ from https://www.postgresql.org/download/windows/
  • Ensure "PostgreSQL Server" and "Command Line Tools" are selected during installation

Step 1. Installation

Since v0.2.0, the server supports installation as a package. You can use your favorite Python package manager to install it via:

# if pipx is installed (recommended)
pipx install supabase-mcp-server

# if uv is installed
uv pip install supabase-mcp-server

pipx is recommended because it creates isolated environments for each package.

You can also install the server manually by cloning the repository and running pipx install -e . from the root directory.

Installing from source

If you would like to install from source, for example for local development:

uv venv
# On Mac
source .venv/bin/activate
# On Windows
.venv\Scripts\activate
# Install package in editable mode
uv pip install -e .

Installing via Smithery.ai

You can find the full instructions on how to use Smithery.ai to connect to this MCP server here.

Step 2. Configuration

The Supabase MCP server requires configuration to connect to your Supabase database, access the Management API, and use the Auth Admin SDK. This section explains all available configuration options and how to set them up.

🔑 Important: Since v0.4, the MCP server requires an API key, which you can get for free at thequery.dev.

Environment Variables

The server uses the following environment variables:

| Variable | Required | Default | Description |
|---|---|---|---|
| SUPABASE_PROJECT_REF | Yes | 127.0.0.1:54322 | Your Supabase project reference ID (or local host:port) |
| SUPABASE_DB_PASSWORD | Yes | postgres | Your database password |
| SUPABASE_REGION | Yes* | us-east-1 | AWS region where your Supabase project is hosted |
| SUPABASE_ACCESS_TOKEN | No | None | Personal access token for Supabase Management API |
| SUPABASE_SERVICE_ROLE_KEY | No | None | Service role key for Auth Admin SDK |
| QUERY_API_KEY | Yes | None | API key from thequery.dev (required for all operations) |

Note: The default values are configured for local Supabase development. For remote Supabase projects, you must provide your own values for SUPABASE_PROJECT_REF and SUPABASE_DB_PASSWORD.

🚨 CRITICAL CONFIGURATION NOTE: For remote Supabase projects, you MUST specify the correct region where your project is hosted using SUPABASE_REGION. If you encounter a "Tenant or user not found" error, this is almost certainly because your region setting doesn't match your project's actual region. You can find your project's region in the Supabase dashboard under Project Settings.

Connection Types

Database Connection
  • The server connects to your Supabase PostgreSQL database using the transaction pooler endpoint
  • Local development uses a direct connection to 127.0.0.1:54322
  • Remote projects use the format: postgresql://postgres.[project_ref]:[password]@aws-0-[region].pooler.supabase.com:6543/postgres

⚠️ Important: Session pooling connections are not supported. The server exclusively uses transaction pooling for better compatibility with the MCP server architecture.
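
For reference, the sketch below shows how the documented connection targets can be derived from the environment variables above. It only illustrates the URL formats; the local username/database names are assumptions, and the server assembles these connection strings internally.

import os

ref = os.environ.get("SUPABASE_PROJECT_REF", "127.0.0.1:54322")
password = os.environ.get("SUPABASE_DB_PASSWORD", "postgres")
region = os.environ.get("SUPABASE_REGION", "us-east-1")

if ":" in ref:
    # Local development: direct connection to the local database (default credentials assumed).
    dsn = f"postgresql://postgres:{password}@{ref}/postgres"
else:
    # Remote project: transaction pooler endpoint on port 6543.
    dsn = f"postgresql://postgres.{ref}:{password}@aws-0-{region}.pooler.supabase.com:6543/postgres"

print(dsn)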

Management API Connection
  • Requires SUPABASE_ACCESS_TOKEN to be set
  • Connects to the Supabase Management API at https://api.supabase.com
  • Only works with remote Supabase projects (not local development)
Auth Admin SDK Connection
  • Requires SUPABASE_SERVICE_ROLE_KEY to be set
  • For local development, connects to http://127.0.0.1:54321
  • For remote projects, connects to https://[project_ref].supabase.co

Configuration Methods

The server looks for configuration in this order (highest to lowest priority):

  1. Environment Variables: Values set directly in your environment
  2. Local .env File: A .env file in your current working directory (only works when running from source)
  3. Global Config File:
    • Windows: %APPDATA%\supabase-mcp\.env
    • macOS/Linux: ~/.config/supabase-mcp/.env
  4. Default Settings: Local development defaults (if no other config is found)

⚠️ Important: When using the package installed via pipx or uv, local .env files in your project directory are not detected. You must use either environment variables or the global config file.
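
A simplified sketch of that lookup order for a packaged install (environment variables, then the global config file, then defaults) is shown below. It assumes python-dotenv for file parsing and is not the server's actual loader.

import os
from pathlib import Path

from dotenv import dotenv_values

def resolve(key: str, default: str | None = None) -> str | None:
    # 1. Environment variables win.
    if key in os.environ:
        return os.environ[key]
    # 3. Then the global config file (the local .env step only applies to source installs).
    if os.name == "nt":
        global_env = Path(os.environ["APPDATA"]) / "supabase-mcp" / ".env"
    else:
        global_env = Path.home() / ".config" / "supabase-mcp" / ".env"
    values = dotenv_values(global_env) if global_env.exists() else {}
    # 4. Finally, the local-development default.
    return values.get(key, default)

print(resolve("SUPABASE_PROJECT_REF", "127.0.0.1:54322"))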

Setting Up Configuration

Option 1: Client-Specific Configuration (Recommended)

Set environment variables directly in your MCP client configuration (see client-specific setup instructions in Step 3). Most MCP clients support this approach, which keeps your configuration with your client settings.

Option 2: Global Configuration

Create a global .env configuration file that will be used for all MCP server instances:

# Create config directory
# On macOS/Linux
mkdir -p ~/.config/supabase-mcp
# On Windows (PowerShell)
mkdir -Force "$env:APPDATA\supabase-mcp"

# Create and edit .env file
# On macOS/Linux
nano ~/.config/supabase-mcp/.env
# On Windows (PowerShell)
notepad "$env:APPDATA\supabase-mcp\.env"

Add your configuration values to the file:

QUERY_API_KEY=your-api-key
SUPABASE_PROJECT_REF=your-project-ref
SUPABASE_DB_PASSWORD=your-db-password
SUPABASE_REGION=us-east-1
SUPABASE_ACCESS_TOKEN=your-access-token
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
Option 3: Project-Specific Configuration (Source Installation Only)

If you're running the server from source (not via package), you can create a .env file in your project directory with the same format as above.

Finding Your Supabase Project Information

  • Project Reference: Found in your Supabase project URL: https://supabase.com/dashboard/project/<project-ref>
  • Database Password: Set during project creation or found in Project Settings → Database
  • Access Token: Generate at https://supabase.com/dashboard/account/tokens
  • Service Role Key: Found in Project Settings → API → Project API keys

Supported Regions

The server supports all Supabase regions:

  • us-west-1 - West US (North California)
  • us-east-1 - East US (North Virginia) - default
  • us-east-2 - East US (Ohio)
  • ca-central-1 - Canada (Central)
  • eu-west-1 - West EU (Ireland)
  • eu-west-2 - West Europe (London)
  • eu-west-3 - West EU (Paris)
  • eu-central-1 - Central EU (Frankfurt)
  • eu-central-2 - Central Europe (Zurich)
  • eu-north-1 - North EU (Stockholm)
  • ap-south-1 - South Asia (Mumbai)
  • ap-southeast-1 - Southeast Asia (Singapore)
  • ap-northeast-1 - Northeast Asia (Tokyo)
  • ap-northeast-2 - Northeast Asia (Seoul)
  • ap-southeast-2 - Oceania (Sydney)
  • sa-east-1 - South America (São Paulo)

Limitations

  • No Self-Hosted Support: The server only supports official Supabase.com hosted projects and local development
  • No Connection String Support: Custom connection strings are not supported
  • No Session Pooling: Only transaction pooling is supported for database connections
  • API and SDK Features: Management API and Auth Admin SDK features only work with remote Supabase projects, not local development

Step 3. Usage

In general, any MCP client that supports the stdio protocol should work with this MCP server. This server was explicitly tested to work with:

  • Cursor
  • Windsurf
  • Cline
  • Claude Desktop

Additionally, you can use smithery.ai to install this server for a number of clients, including the ones above.

Follow the guides below to install this MCP server in your client.

Cursor

Go to Settings -> Features -> MCP Servers and add a new server with this configuration:

# can be set to any name
name: supabase
type: command
# if you installed with pipx
command: supabase-mcp-server
# if you installed with uv
command: uv run supabase-mcp-server
# if the above doesn't work, use the full path (recommended)
command: /full/path/to/supabase-mcp-server  # Find with 'which supabase-mcp-server' (macOS/Linux) or 'where supabase-mcp-server' (Windows)

If the configuration is correct, you should see a green dot indicator and the number of tools exposed by the server. (Screenshot: a successful Cursor configuration.)

Windsurf

Go to Cascade -> Click on the hammer icon -> Configure -> Fill in the configuration:

{
    "mcpServers": {
      "supabase": {
        "command": "/Users/username/.local/bin/supabase-mcp-server",  // update path
        "env": {
          "QUERY_API_KEY": "your-api-key",  // Required - get your API key at thequery.dev
          "SUPABASE_PROJECT_REF": "your-project-ref",
          "SUPABASE_DB_PASSWORD": "your-db-password",
          "SUPABASE_REGION": "us-east-1",  // optional, defaults to us-east-1
          "SUPABASE_ACCESS_TOKEN": "your-access-token",  // optional, for management API
          "SUPABASE_SERVICE_ROLE_KEY": "your-service-role-key"  // optional, for Auth Admin SDK
        }
      }
    }
}

If the configuration is correct, you should see a green dot indicator and a clickable supabase server in the list of available servers.

(Screenshot: a successful Windsurf configuration.)

Claude Desktop

Claude Desktop also supports MCP servers through a JSON configuration. Follow these steps to set up the Supabase MCP server:

  1. Find the full path to the executable (this step is critical):

    # On macOS/Linux
    which supabase-mcp-server
    
    # On Windows
    where supabase-mcp-server
    

    Copy the full path that is returned (e.g., /Users/username/.local/bin/supabase-mcp-server).

  2. Configure the MCP server in Claude Desktop:

    • Open Claude Desktop
    • Go to Settings → Developer → Edit Config
    • Add a new configuration with the following JSON:
    {
      "mcpServers": {
        "supabase": {
          "command": "/full/path/to/supabase-mcp-server",  // Replace with the actual path from step 1
          "env": {
            "QUERY_API_KEY": "your-api-key",  // Required - get your API key at thequery.dev
            "SUPABASE_PROJECT_REF": "your-project-ref",
            "SUPABASE_DB_PASSWORD": "your-db-password",
            "SUPABASE_REGION": "us-east-1",  // optional, defaults to us-east-1
            "SUPABASE_ACCESS_TOKEN": "your-access-token",  // optional, for management API
            "SUPABASE_SERVICE_ROLE_KEY": "your-service-role-key"  // optional, for Auth Admin SDK
          }
        }
      }
    }
    

⚠️ Important: Unlike Windsurf and Cursor, Claude Desktop requires the full absolute path to the executable. Using just the command name (supabase-mcp-server) will result in a "spawn ENOENT" error.

If the configuration is correct, you should see the Supabase MCP server listed as available in Claude Desktop.

(Screenshot: a successful Claude Desktop configuration.)

Cline

Cline also supports MCP servers through a similar JSON configuration. Follow these steps to set up the Supabase MCP server:

  1. Find the full path to the executable (this step is critical):

    # On macOS/Linux
    which supabase-mcp-server
    
    # On Windows
    where supabase-mcp-server
    

    Copy the full path that is returned (e.g., /Users/username/.local/bin/supabase-mcp-server).

  2. Configure the MCP server in Cline:

    • Open Cline in VS Code
    • Click on the "MCP Servers" tab in the Cline sidebar
    • Click "Configure MCP Servers"
    • This will open the cline_mcp_settings.json file
    • Add the following configuration:
    {
      "mcpServers": {
        "supabase": {
          "command": "/full/path/to/supabase-mcp-server",  // Replace with the actual path from step 1
          "env": {
            "QUERY_API_KEY": "your-api-key",  // Required - get your API key at thequery.dev
            "SUPABASE_PROJECT_REF": "your-project-ref",
            "SUPABASE_DB_PASSWORD": "your-db-password",
            "SUPABASE_REGION": "us-east-1",  // optional, defaults to us-east-1
            "SUPABASE_ACCESS_TOKEN": "your-access-token",  // optional, for management API
            "SUPABASE_SERVICE_ROLE_KEY": "your-service-role-key"  // optional, for Auth Admin SDK
          }
        }
      }
    }
    

If the configuration is correct, you should see a green indicator next to the Supabase MCP server in the Cline MCP Servers list, and a message confirming "supabase MCP server connected" at the bottom of the panel.

(Screenshot: a successful Cline configuration.)

Troubleshooting

Here are some tips & tricks that might help you:

  • Debug installation - run supabase-mcp-server directly from the terminal to see if it works. If it doesn't, there might be an issue with the installation.
  • MCP Server configuration - if the above step works, it means the server is installed and configured correctly. As long as you provided the right command, your IDE should be able to connect. Make sure to provide the right path to the server executable.
  • "No tools found" error - If you see "Client closed - no tools available" in Cursor despite the package being installed:
    • Find the full path to the executable by running which supabase-mcp-server (macOS/Linux) or where supabase-mcp-server (Windows)
    • Use the full path in your MCP server configuration instead of just supabase-mcp-server
    • For example: /Users/username/.local/bin/supabase-mcp-server or C:\Users\username\.local\bin\supabase-mcp-server.exe
  • Environment variables - to connect to the right database, make sure you either set env variables in mcp_config.json or in .env file placed in a global config directory (~/.config/supabase-mcp/.env on macOS/Linux or %APPDATA%\supabase-mcp\.env on Windows).
  • Accessing logs - The MCP server writes detailed logs to a file:
    • Log file location:
      • macOS/Linux: ~/.local/share/supabase-mcp/mcp_server.log
      • Windows: %USERPROFILE%\.local\share\supabase-mcp\mcp_server.log
    • Logs include connection status, configuration details, and operation results
    • View logs using any text editor or terminal commands:
      # On macOS/Linux
      cat ~/.local/share/supabase-mcp/mcp_server.log
      
      # On Windows (PowerShell)
      Get-Content "$env:USERPROFILE\.local\share\supabase-mcp\mcp_server.log"
      

If you are stuck or any of the instructions above are incorrect, please raise an issue.

MCP Inspector

A super useful tool to help debug MCP server issues is MCP Inspector. If you installed from source, you can run supabase-mcp-inspector from the project repo and it will run the inspector instance. Coupled with logs, this gives you a complete overview of what's happening in the server.

📝 Running supabase-mcp-inspector from a package install doesn't work properly yet - I will validate and fix this in a coming release.

Feature Overview

Database query tools

Since v0.3+, the server provides comprehensive database management capabilities with built-in safety controls:

  • SQL Query Execution: Execute PostgreSQL queries with risk assessment

    • Three-tier safety system:
      • safe: Read-only operations (SELECT) - always allowed
      • write: Data modifications (INSERT, UPDATE, DELETE) - require unsafe mode
      • destructive: Schema changes (DROP, CREATE) - require unsafe mode + confirmation
  • SQL Parsing and Validation:

    • Uses PostgreSQL's parser (pglast) for accurate analysis and provides clear feedback on safety requirements
  • Automatic Migration Versioning:

    • Database-altering operations are automatically versioned
    • Generates descriptive names based on operation type and target
  • Safety Controls:

    • Default SAFE mode allows only read-only operations
    • All statements run in transaction mode via asyncpg
    • 2-step confirmation for high-risk operations
  • Available Tools:

    • get_schemas: Lists schemas with sizes and table counts
    • get_tables: Lists tables, foreign tables, and views with metadata
    • get_table_schema: Gets detailed table structure (columns, keys, relationships)
    • execute_postgresql: Executes SQL statements against your database
    • confirm_destructive_operation: Executes high-risk operations after confirmation
    • retrieve_migrations: Gets migrations with filtering and pagination options
    • live_dangerously: Toggles between safe and unsafe modes

Management API tools

Since v0.3.0, the server provides secure access to the Supabase Management API with built-in safety controls:

  • Available Tools:

    • send_management_api_request: Sends arbitrary requests to Supabase Management API with auto-injection of project ref
    • get_management_api_spec: Gets the enriched API specification with safety information
      • Supports multiple query modes: by domain, by specific path/method, or all paths
      • Includes risk assessment information for each endpoint
      • Provides detailed parameter requirements and response formats
      • Helps LLMs understand the full capabilities of the Supabase Management API
    • get_management_api_safety_rules: Gets all safety rules with human-readable explanations
    • live_dangerously: Toggles between safe and unsafe operation modes
  • Safety Controls:

    • Uses the same safety manager as database operations for consistent risk management
    • Operations categorized by risk level:
      • safe: Read-only operations (GET) - always allowed
      • unsafe: State-changing operations (POST, PUT, PATCH, DELETE) - require unsafe mode
      • blocked: Destructive operations (delete project, etc.) - never allowed
    • Default safe mode prevents accidental state changes
    • Path-based pattern matching for precise safety rules

Note: Management API tools only work with remote Supabase instances and are not compatible with local Supabase development setups.

Auth Admin tools

I was planning to add support for Python SDK methods to the MCP server. Upon consideration, I decided to only add support for Auth Admin methods, as I often found myself manually creating test users, which was error-prone and time-consuming. Now I can just ask Cursor to create a test user and it will be done seamlessly. Check out the full Auth Admin SDK method docs to know what it can do.

Since v0.3.6, the server supports direct access to Supabase Auth Admin methods via the Python SDK:

  • Includes the following tools:
    • get_auth_admin_methods_spec to retrieve documentation for all available Auth Admin methods
    • call_auth_admin_method to directly invoke Auth Admin methods with proper parameter handling
  • Supported methods:
    • get_user_by_id: Retrieve a user by their ID
    • list_users: List all users with pagination
    • create_user: Create a new user
    • delete_user: Delete a user by their ID
    • invite_user_by_email: Send an invite link to a user's email
    • generate_link: Generate an email link for various authentication purposes
    • update_user_by_id: Update user attributes by ID
    • delete_factor: Delete a factor on a user (currently not implemented in SDK)

Why use Auth Admin SDK instead of raw SQL queries?

The Auth Admin SDK provides several key advantages over direct SQL manipulation:

  • Functionality: Enables operations not possible with SQL alone (invites, magic links, MFA)

  • Accuracy: More reliable than creating and executing raw SQL queries on auth schemas

  • Simplicity: Offers clear methods with proper validation and error handling

    • Response format:
      • All methods return structured Python objects instead of raw dictionaries
      • Object attributes can be accessed using dot notation (e.g., user.id instead of user["id"])
    • Edge cases and limitations:
      • UUID validation: Many methods require valid UUID format for user IDs and will return specific validation errors
      • Email configuration: Methods like invite_user_by_email and generate_link require email sending to be configured in your Supabase project
      • Link types: When generating links, different link types have different requirements:
        • signup links don't require the user to exist
        • magiclink and recovery links require the user to already exist in the system
      • Error handling: The server provides detailed error messages from the Supabase API, which may differ from the dashboard interface
      • Method availability: Some methods like delete_factor are exposed in the API but not fully implemented in the SDK

Logs & Analytics

The server provides access to Supabase logs and analytics data, making it easier to monitor and troubleshoot your applications:

  • Available Tool: retrieve_logs - Access logs from any Supabase service

  • Log Collections:

    • postgres: Database server logs
    • api_gateway: API gateway requests
    • auth: Authentication events
    • postgrest: RESTful API service logs
    • pooler: Connection pooling logs
    • storage: Object storage operations
    • realtime: WebSocket subscription logs
    • edge_functions: Serverless function executions
    • cron: Scheduled job logs
    • pgbouncer: Connection pooler logs
  • Features: Filter by time, search text, apply field filters, or use custom SQL queries

Simplifies debugging across your Supabase stack without switching between interfaces or writing complex queries.

Automatic Versioning of Database Changes

"With great power comes great responsibility." While execute_postgresql tool coupled with aptly named live_dangerously tool provide a powerful and simple way to manage your Supabase database, it also means that dropping a table or modifying one is one chat message away. In order to reduce the risk of irreversible changes, since v0.3.8 the server supports:

  • automatic creation of migration scripts for all write & destructive sql operations executed on the database
  • improved safety mode of query execution, in which all queries are categorized in:
    • safe type: always allowed. Includes all read-only ops.
    • write type: requires write mode to be enabled by the user.
    • destructive type: requires write mode to be enabled by the user AND a 2-step confirmation of query execution for clients that do not execute tools automatically.

Universal Safety Mode

Since v0.3.8 Safety Mode has been standardized across all services (database, API, SDK) using a universal safety manager. This provides consistent risk management and a unified interface for controlling safety settings across the entire MCP server.

All operations (SQL queries, API requests, SDK methods) are categorized into risk levels:

  • Low risk: Read-only operations that don't modify data or structure (SELECT queries, GET API requests)
  • Medium risk: Write operations that modify data but not structure (INSERT/UPDATE/DELETE, most POST/PUT API requests)
  • High risk: Destructive operations that modify database structure or could cause data loss (DROP/TRUNCATE, DELETE API endpoints)
  • Extreme risk: Operations with severe consequences that are blocked entirely (deleting projects)

Safety controls are applied based on risk level:

  • Low risk operations are always allowed
  • Medium risk operations require unsafe mode to be enabled
  • High risk operations require unsafe mode AND explicit confirmation
  • Extreme risk operations are never allowed
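
The controls listed above reduce to a simple gate; the sketch below restates them in code (it mirrors the documented rules, not the server's actual implementation).

from enum import Enum

class Risk(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    EXTREME = 4

def is_allowed(risk: Risk, unsafe_mode: bool, confirmed: bool) -> bool:
    if risk is Risk.LOW:
        return True          # always allowed
    if risk is Risk.MEDIUM:
        return unsafe_mode   # requires unsafe mode
    if risk is Risk.HIGH:
        return unsafe_mode and confirmed  # unsafe mode AND explicit confirmation
    return False             # extreme risk: never allowed

assert is_allowed(Risk.HIGH, unsafe_mode=True, confirmed=False) is False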

How confirmation flow works

Any high-risk operation (be it a PostgreSQL query or an API request) is blocked even in unsafe mode. You will have to confirm and approve every high-risk operation explicitly in order for it to be executed; explicit approval is always required.

Changelog

  • 📦 Simplified installation via package manager - ✅ (v0.2.0)
  • 🌎 Support for different Supabase regions - ✅ (v0.2.2)
  • 🎮 Programmatic access to Supabase management API with safety controls - ✅ (v0.3.0)
  • 👷‍♂️ Read and read-write database SQL queries with safety controls - ✅ (v0.3.0)
  • 🔄 Robust transaction handling for both direct and pooled connections - ✅ (v0.3.2)
  • 🐍 Support methods and objects available in native Python SDK - ✅ (v0.3.6)
  • 🔍 Stronger SQL query validation ✅ (v0.3.8)
  • 📝 Automatic versioning of database changes ✅ (v0.3.8)
  • 📖 Radically improved knowledge and tools for the API spec ✅ (v0.3.8)
  • ✍️ Improved consistency of migration-related tools for a more organized database VCS ✅ (v0.3.10)
  • 🥳 Query MCP is released (v0.4.0)

For a more detailed roadmap, please see this discussion on GitHub.

Star History

Star History Chart


Enjoy! ☺️
