Datasette MCP
A Model Context Protocol (MCP) server that provides read-only access to Datasette instances. This server enables AI assistants to explore, query, and analyze data from Datasette databases through a standardized interface.
Features
- SQL Query Execution: Run custom SQL queries against Datasette databases
- Full-Text Search: Search within tables using Datasette's FTS capabilities
- Schema Exploration: List databases, tables, and inspect table schemas
- Multiple Instances: Connect to multiple Datasette instances simultaneously
- Authentication: Support for Bearer token authentication
- Request Throttling: Configurable courtesy delays between requests
- Multiple Transports: stdio, HTTP, and Server-Sent Events support
Installation
Prerequisites
- Python 3.10+
- uv package manager
Install as a tool
# Install directly from GitHub
uv tool install git+https://github.com/mhalle/datasette-mcp.git
# Check installation
datasette-mcp --help
Development installation
# Clone and install for development
git clone https://github.com/mhalle/datasette-mcp.git
cd datasette-mcp
uv sync
uv run datasette-mcp --help
Configuration
The server supports two configuration methods:
1. Configuration File
Create a YAML or JSON configuration file with your Datasette instances:
# ~/.config/datasette-mcp/config.yaml
datasette_instances:
  my_database:
    url: "https://my-datasette.herokuapp.com"
    description: "My production database"
    auth_token: "your-api-token-here"  # optional
  local_dev:
    url: "http://localhost:8001"
    description: "Local development database"

# Global settings (optional)
courtesy_delay_seconds: 0.5  # delay between requests
The server automatically searches for config files in:
- The path given by the $DATASETTE_MCP_CONFIG environment variable
- ~/.config/datasette-mcp/config.{yaml,yml,json}
- /etc/datasette-mcp/config.{yaml,yml,json}
2. Command Line (Single Instance)
For quick single-instance setup:
datasette-mcp \
--url https://my-datasette.herokuapp.com \
--id my_db \
--description "My database"
Usage
Basic Startup
# Use auto-discovered config file
datasette-mcp
# Use specific config file
datasette-mcp --config /path/to/config.yaml
# Single instance mode
datasette-mcp --url https://example.com --id mydb
Transport Options
# stdio (default, for MCP clients)
datasette-mcp
# HTTP server
datasette-mcp --transport streamable-http --port 8080
# Server-Sent Events
datasette-mcp --transport sse --host 0.0.0.0 --port 8080
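With an HTTP transport, MCP clients connect over the network rather than stdio. Below is a minimal client-side sketch using the FastMCP Python client; the URL and the /mcp path are assumptions about a local deployment, not settings documented by this project.
# hypothetical client-side sketch; adjust the URL and path to your deployment
import asyncio
from fastmcp import Client

async def main():
    async with Client("http://127.0.0.1:8080/mcp") as client:
        tools = await client.list_tools()
        print([tool.name for tool in tools])

asyncio.run(main())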
Development Usage
When developing or testing:
# Run from source with uv
uv run datasette-mcp --url https://example.com
# Install in development mode
uv tool install --editable .
All CLI Options
--config CONFIG Path to configuration file
--url URL Datasette instance URL for single instance mode
--id ID Instance ID (optional, derived from URL if not specified)
--description DESC Description for the instance
--courtesy-delay FLOAT Delay between requests in seconds
--transport TRANSPORT Protocol: stdio, streamable-http, sse
--host HOST Host for HTTP transports (default: 127.0.0.1)
--port PORT Port for HTTP transports (default: 8198)
--log-level LEVEL Logging level: DEBUG, INFO, WARNING, ERROR
Claude Code Integration
To use this MCP server with Claude Code:
1. Install the server
uv tool install git+https://github.com/mhalle/datasette-mcp.git
2. Add to Claude Code
claude mcp add datasette-mcp -- datasette-mcp --url https://your-datasette-instance.com
Or with a configuration file:
claude mcp add datasette-mcp -- datasette-mcp --config /path/to/config.yaml
3. Choose a scope (optional)
The -s/--scope flag controls where the server registration is stored (local, user, or project):
claude mcp add -s user datasette-mcp -- datasette-mcp --url https://analytics.example.com
Once added, Claude Code will have access to explore and query your Datasette instances directly within conversations.
Available Tools
The server provides these MCP tools for AI assistants:
list_instances()
List all configured Datasette instances and their details.
list_databases(instance)
List all databases in a Datasette instance with table counts.
describe_database(instance, database)
Get complete database schema including all table structures, columns, types, and relationships in one efficient call.
execute_sql(instance, database, sql, ...)
Execute custom SQL queries with options for:
- shape: Response format ("objects", "arrays", "array")
- json_columns: Parse specific columns as JSON
- trace: Include performance trace information
- timelimit: Query timeout in milliseconds
- size: Maximum number of results per page
- next_token: Pagination token for getting the next page
search_table(instance, database, table, search_term, ...)
Perform full-text search within a table with options for:
- search_column: Search only in a specific column
- columns: Return only specific columns to reduce tokens
- raw_mode: Enable advanced FTS operators (AND, OR, NOT)
- size: Maximum number of results per page
- next_token: Pagination token for getting the next page
Usage Examples
Exploring Data Structure
# List available instances
instances = await list_instances()
# Explore a specific instance
databases = await list_databases("my_database")
# Get complete database schema with all tables
schema = await describe_database("my_database", "main")
Querying Data
# Get recent users
users = await execute_sql(
"my_database",
"main",
"SELECT * FROM users ORDER BY created_date DESC LIMIT 10"
)
# Search for specific content with limited columns
results = await search_table(
"my_database",
"main",
"posts",
"machine learning",
columns=["title", "content", "author"],
size=20
)
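The search_column option can narrow matching to a single FTS column; a short sketch, with the column name chosen for illustration:
# Search only within the title column (column name is illustrative)
title_hits = await search_table(
    "my_database",
    "main",
    "posts",
    "machine learning",
    search_column="title",
    size=10
)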
Advanced Queries
# Complex aggregation with pagination
stats = await execute_sql(
"my_database",
"main",
"""
SELECT category, COUNT(*) as count, AVG(price) as avg_price
FROM products
WHERE created_date > '2024-01-01'
GROUP BY category
ORDER BY count DESC
""",
size=50
)
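The other execute_sql options combine in the same way. A hedged sketch that parses a JSON-encoded column and caps query time; the table and column names are illustrative:
# Parse a JSON-encoded column and limit query time (names illustrative)
tagged = await execute_sql(
    "my_database",
    "main",
    "SELECT id, tags FROM products LIMIT 100",
    shape="objects",
    json_columns=["tags"],
    timelimit=2000
)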
# Search with advanced operators
results = await search_table(
"my_database",
"main",
"articles",
"python AND (fastapi OR django)",
raw_mode=True
)
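Both execute_sql and search_table paginate through size and next_token. The sketch below assumes the paginated response carries a continuation token under a next_token key; the actual field name depends on the server's response shape:
# Fetch results page by page; the token field name in the response is assumed
page = await execute_sql(
    "my_database",
    "main",
    "SELECT * FROM products ORDER BY id",
    size=100
)
while page.get("next_token"):  # hypothetical response key
    page = await execute_sql(
        "my_database",
        "main",
        "SELECT * FROM products ORDER BY id",
        size=100,
        next_token=page["next_token"]
    )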
Security Considerations
- The server provides read-only access to Datasette instances
- Authentication tokens are passed as Bearer tokens to Datasette
- No write operations are supported
- SQL queries are subject to Datasette's built-in security restrictions
- Request throttling helps prevent overwhelming target servers
Error Handling
The server provides detailed error messages for:
- Invalid SQL queries
- Missing or inaccessible databases/tables
- Authentication failures
- Network timeouts
- Configuration errors
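On the caller's side these surface as ordinary tool errors. A minimal sketch; the exact exception type depends on the MCP client in use:
# A query against a missing table surfaces a descriptive error
try:
    await execute_sql("my_database", "main", "SELECT * FROM no_such_table")
except Exception as err:  # the exact exception class depends on the MCP client
    print(f"Datasette MCP error: {err}")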
Logging
Configure logging levels for debugging:
datasette-mcp --log-level DEBUG
Log levels: DEBUG, INFO, WARNING, ERROR
Tool Management
# List installed tools
uv tool list
# Upgrade to latest version
uv tool upgrade datasette-mcp
# Uninstall
uv tool uninstall datasette-mcp
Contributing
This server is built with FastMCP, making it easy to extend with additional tools and functionality. The codebase follows MCP best practices for server development.
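For orientation, a FastMCP tool is just a decorated function. The sketch below is illustrative only and does not reflect this server's actual modules or helpers:
# hypothetical example of registering an extra read-only tool with FastMCP
from fastmcp import FastMCP

mcp = FastMCP("datasette-mcp-example")

@mcp.tool()
async def ping(instance: str) -> str:
    """Confirm that a named instance is configured (illustrative stub only)."""
    return f"instance '{instance}' is configured"

if __name__ == "__main__":
    mcp.run()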
License
Licensed under the Apache License, Version 2.0. See LICENSE for details.
Related Projects
- Datasette - Data exploration tool
- FastMCP - Python MCP framework
- Model Context Protocol - Standard for AI tool integration