API Registry MCP Server
Enables discovery, registration, and management of external API endpoints through natural language, supporting multiple authentication methods (public, API key, bearer token) with automatic endpoint testing and documentation parsing.
🔌 API Registry MCP Server
A Databricks app that helps you discover, register, and manage external API endpoints with an AI-powered chat interface.
What is this?
An API discovery and management platform that runs on Databricks Apps:
- 🤖 AI Chat Interface: Register APIs using natural language powered by Claude
- 📊 API Registry: Database-backed catalog of external API endpoints
- 🔐 Secure Auth: Support for public APIs, API keys, and bearer tokens
- 🛠️ MCP Server: Programmatic API management tools
- 📚 Smart Discovery: Automatic endpoint testing and documentation parsing
Quick Start
Prerequisites
Required Tools (Install on your local machine)
1. Python Package Manager - uv:
# macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
# Or with Homebrew
brew install uv
# Verify installation
uv --version
2. Databricks CLI:
# Official install script (note: pip's databricks-cli package is the legacy CLI and won't meet the version check below)
curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh
# Or with Homebrew
brew tap databricks/tap
brew install databricks
# Verify installation
databricks --version # Should be v0.260.0+
3. Bun (Optional - only for frontend development):
# macOS/Linux
curl -fsSL https://bun.sh/install | bash
# Or with Homebrew
brew install oven-sh/bun/bun
Databricks Workspace Requirements
Your workspace needs:
- Databricks Apps enabled (Public Preview)
- Foundation Model API with a tool-enabled model (Claude, Llama, etc.)
- SQL Warehouse - At least one warehouse (create one)
- Unity Catalog - With a catalog and schema you can write to
📖 Detailed requirements: WORKSPACE_REQUIREMENTS.md
Step 1: Clone and Setup
Run this on your local machine (not in Databricks):
git clone https://github.com/lucamilletti99/mcp_api_registry_http.git
cd mcp_api_registry_http
./setup.sh
The setup script will prompt you for:
| Prompt | What It's For | Default | Notes |
|---|---|---|---|
| Databricks Host | Your workspace URL | (no default) | Format: `https://your-workspace.cloud.databricks.com` |
| Authentication Method | How to authenticate | 2 (PAT - Recommended) | Options: 1=OAuth, 2=PAT |
| Personal Access Token | Your Databricks PAT | (no default) | Required for PAT auth. Get your PAT here |
| SQL Warehouse ID | Warehouse for queries | Auto-detects first warehouse | Press Enter to use default |
| Unity Catalog | Target catalog | main | Press Enter to use default |
| Unity Schema | Target schema | default | Press Enter to use default |
⚠️ Important: Use Personal Access Token (PAT) authentication
- PAT is the recommended method for local development
- OAuth is experimental and may have issues
- Get your PAT: Workspace → Settings → Developer → Access Tokens → Generate New Token
- Full PAT documentation
What this does:
- Installs Python and JavaScript dependencies
- Configures Databricks CLI authentication
- Creates `.env.local` with your configuration
- Validates your workspace connection
Step 2: Create the API Registry Table
Create the Delta table that stores API metadata:
uv run python setup_table.py your_catalog your_schema
Example:
# Using the defaults from Step 1
uv run python setup_table.py main default
What this does:
- Creates the `api_http_registry` table in your specified catalog.schema
- Table stores: API name, endpoints, auth type, HTTP connection details, parameters
- Required for the app to track registered APIs
Alternative - Manual SQL:
Run the SQL from setup_api_http_registry_table.sql in Databricks SQL Editor
Note: Ensure your catalog and schema exist first. Create them in Databricks SQL Editor if needed.
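If you prefer to script the table creation yourself, here is a minimal sketch using the Databricks Python SDK's statement execution API. The column list is illustrative only - the authoritative schema lives in setup_api_http_registry_table.sql.

```python
# Illustrative sketch only: the real schema is defined in
# setup_api_http_registry_table.sql; these column names are assumptions.
import os

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up DATABRICKS_HOST / DATABRICKS_TOKEN from env

CREATE_TABLE = """
CREATE TABLE IF NOT EXISTS {catalog}.{schema}.api_http_registry (
  api_name        STRING,
  base_url        STRING,
  endpoint_path   STRING,
  auth_type       STRING,    -- 'none' | 'api_key' | 'bearer_token'
  connection_name STRING,
  parameters      STRING,    -- JSON-encoded parameter metadata
  registered_at   TIMESTAMP
) USING DELTA
"""

resp = w.statement_execution.execute_statement(
    warehouse_id=os.environ["DATABRICKS_SQL_WAREHOUSE_ID"],
    statement=CREATE_TABLE.format(catalog="main", schema="default"),
)
print(resp.status.state)  # expect SUCCEEDED
```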
Step 3: Deploy to Databricks Apps
Deploy your application code to Databricks:
# First time deployment (creates the app)
./deploy.sh --create
# Future updates (after code changes)
./deploy.sh
During deployment, you'll be prompted for:
- App name: Must start with `mcp-` (e.g., `mcp-api-registry`, `mcp-prod-api`)
What happens during deployment:
- ✅ Builds the frontend - Compiles React TypeScript to static assets
- ✅ Packages the backend - Prepares FastAPI server and MCP tools
- ✅ Creates Databricks App - Registers your app in the workspace
- ✅ Generates Service Principal - Automatically creates a service principal for your app
- ✅ Deploys code to the app - Uploads your code and automatically attaches it to the app compute
- ✅ Starts the application - Your app is now running and accessible
- ✅ Enables OAuth (OBO) - Configures On-Behalf-Of authentication automatically
⚠️ Important: No manual attachment needed!
The deploy.sh script handles the entire deployment pipeline. Your code is automatically:
- Packaged into a deployable artifact
- Uploaded to Databricks
- Attached to the app's compute environment
- Started and made accessible at the app URL
You don't need to manually connect code to compute - it's all handled by the deployment process!
Finding your deployed app:
# Get app URL and status
./app_status.sh
# Expected output:
# App: mcp-api-registry
# Status: RUNNING
# URL: https://adb-123456.10.azuredatabricks.net/apps/mcp-api-registry
# Service Principal ID: 00000000-0000-0000-0000-000000000000
Or in Databricks UI:
- Workspace → Compute → Apps → Click your app name
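If you'd rather script this check, a rough equivalent using the Databricks Python SDK's Apps API might look like the sketch below; the app name is a placeholder, and the exact response fields are assumptions about the SDK's App model.

```python
# Hedged sketch of the same check app_status.sh performs.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
app = w.apps.get(name="mcp-api-registry")  # hypothetical app name from Step 3

print("URL:", app.url)
print("Status:", app.app_status.state if app.app_status else "UNKNOWN")
print("Service Principal:", app.service_principal_client_id)
```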
🔐 On-Behalf-Of (OBO) Authentication:
Databricks Apps automatically handles OAuth authentication:
- ✅ Users log in through Databricks UI - no separate auth setup
- ✅ All operations run with the user's permissions - proper access control
- ✅ Full audit logging - track who did what
- ✅ No manual OAuth configuration needed!
The app configuration (app.yaml) specifies required scopes. When users access the app, they automatically get an OAuth token with their Databricks permissions.
📖 More details: See app.yaml in the project root
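For a sense of what OBO looks like from the backend's perspective: Databricks Apps forward the signed-in user's token in the `X-Forwarded-Access-Token` request header, which the server can hand to the SDK. A minimal FastAPI sketch (the route is illustrative, not part of this project's actual API):

```python
# Sketch: act on behalf of the calling user inside a Databricks App.
from databricks.sdk import WorkspaceClient
from fastapi import FastAPI, Header

app = FastAPI()

@app.get("/whoami")  # hypothetical route for illustration
def whoami(x_forwarded_access_token: str = Header(None)):
    # Build a client scoped to the caller's own permissions; the app
    # runtime already sets DATABRICKS_HOST in the environment.
    w = WorkspaceClient(token=x_forwarded_access_token, auth_type="pat")
    me = w.current_user.me()
    return {"user": me.user_name}
```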
Step 4: Setup Secret Scopes (For Authenticated APIs)
⚠️ Important: Do this AFTER Step 3 - You need the Service Principal ID from deployment first!
Skip if you only use public APIs with no authentication.
For APIs requiring API keys or bearer tokens:
./setup_shared_secrets.sh
When prompted, enter your app's Service Principal ID from Step 3.
Where to find your Service Principal ID:
- From terminal: Run `./app_status.sh` (shown in output)
- From UI: Databricks workspace → Compute → Apps → Click your app → "Service Principal ID"
- Format: Looks like `00000000-0000-0000-0000-000000000000`
What this script does:
- Creates `mcp_api_keys` scope - for API key authentication
- Creates `mcp_bearer_tokens` scope - for bearer token authentication
- Grants your app's service principal WRITE access to both scopes
- Verifies the permissions were set correctly
Why this is needed:
- API keys and bearer tokens must be stored securely
- Databricks Secrets provide encryption at rest
- The app's service principal manages secrets on behalf of all users
- Users never see or handle raw credentials - they're encrypted automatically
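Under the hood this maps to ordinary Databricks Secrets operations. A hedged sketch with the Python SDK - the secret key name is an assumption, since the app decides its own key naming:

```python
# Sketch of storing and retrieving a credential via Databricks Secrets.
from base64 import b64decode

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Store an API key (encrypted at rest by Databricks); "fred" is a
# hypothetical key name for illustration.
w.secrets.put_secret(scope="mcp_api_keys", key="fred", string_value="abc123")

# Read it back at request time; values are returned base64-encoded.
secret = w.secrets.get_secret(scope="mcp_api_keys", key="fred")
api_key = b64decode(secret.value).decode("utf-8")
```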
Verification:
# Check both scopes exist
databricks secrets list-scopes | grep mcp_
# Check service principal has WRITE access
databricks secrets get-acl mcp_api_keys YOUR_SPN_ID
databricks secrets get-acl mcp_bearer_tokens YOUR_SPN_ID
# Expected output: permission: WRITE
Troubleshooting:
- If scope creation fails: You may need admin permissions
- If permission grant fails: Your SPN ID may be incorrect (check `./app_status.sh`)
📖 Detailed guide: SECRETS_WORKAROUND.md
API Authentication Types
The app supports three authentication types:
| Type | When to Use | Where Credential Goes | Example APIs |
|---|---|---|---|
| none | Public APIs with no auth | N/A | Treasury, Public datasets |
| api_key | Key passed as query param | `?api_key=xxx` in URL | FRED, Alpha Vantage, NewsAPI |
| bearer_token | Token passed in header | `Authorization: Bearer xxx` | GitHub, Stripe, Shopify |
Quick Examples
Public API (no auth):
"Register the Treasury Fiscal Data API at
https://api.fiscaldata.treasury.gov/services/api/fiscal_service/v1/accounting/od/rates_of_exchange"
API Key Authentication:
"Register the FRED API at https://api.stlouisfed.org/fred/series/observations
Use API key authentication with key: YOUR_API_KEY_HERE"
Bearer Token Authentication:
"Register the GitHub API at https://api.github.com/user/repos
Use bearer token authentication with token: ghp_YOUR_TOKEN_HERE"
How It Works
API Key Auth:
- Key stored in the `mcp_api_keys` scope
- HTTP connection has an empty bearer_token
- Key retrieved from secrets and added to params at runtime
Bearer Token Auth:
- Token stored in the `mcp_bearer_tokens` scope
- HTTP connection references the secret
- Databricks automatically adds the `Authorization: Bearer <token>` header
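To make the distinction concrete, here is an illustrative sketch (not this project's actual code) of how a credential fetched from secrets ends up in the outgoing request for each auth type:

```python
# Illustrative only: shows where the credential lands per auth type.
import requests

def call_registered_api(url: str, auth_type: str, credential: str,
                        params: dict) -> dict:
    headers = {}
    if auth_type == "api_key":
        # Key from the mcp_api_keys scope goes into the query string
        params = {**params, "api_key": credential}
    elif auth_type == "bearer_token":
        # Token from the mcp_bearer_tokens scope goes into a header
        headers["Authorization"] = f"Bearer {credential}"
    resp = requests.get(url, params=params, headers=headers, timeout=30)
    resp.raise_for_status()
    return resp.json()
```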
📖 Detailed auth mechanics: See "API Authentication Types" section in SECRETS_WORKAROUND.md
Using the App
Web Interface
Open your app URL to access:
- Chat Playground - Natural language API registration and queries
- API Registry - View, edit, delete registered APIs
- Traces - Debug AI agent execution
- MCP Info - View available MCP tools
Example Workflow
You: "Register the FRED economic data API with my API key: abc123"
AI: ✅ Successfully registered "fred" with API key authentication
You: "Get GDP data from FRED, series GDPC1"
AI: [Retrieves API key from secrets, makes request]
Here's the GDP data from the last 10 observations...
Configuration
Environment Variables (.env.local)
Created automatically by ./setup.sh:
DATABRICKS_HOST=https://your-workspace.cloud.databricks.com
DATABRICKS_TOKEN=your-personal-access-token # For local dev
DATABRICKS_SQL_WAREHOUSE_ID=your-warehouse-id # Optional
# Optional: Override default secret scope names
MCP_API_KEY_SCOPE=mcp_api_keys
MCP_BEARER_TOKEN_SCOPE=mcp_bearer_tokens
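For reference, loading this configuration from Python might look like the sketch below; the python-dotenv usage is an assumption about local development, since Databricks Apps inject these values as environment variables directly.

```python
# Sketch: reading .env.local during local development (assumes the
# python-dotenv package; in a deployed app these are plain env vars).
import os

from dotenv import load_dotenv

load_dotenv(".env.local")

HOST = os.environ["DATABRICKS_HOST"]
TOKEN = os.environ.get("DATABRICKS_TOKEN")                      # local dev only
WAREHOUSE_ID = os.environ.get("DATABRICKS_SQL_WAREHOUSE_ID")    # optional
API_KEY_SCOPE = os.environ.get("MCP_API_KEY_SCOPE", "mcp_api_keys")
BEARER_SCOPE = os.environ.get("MCP_BEARER_TOKEN_SCOPE", "mcp_bearer_tokens")
```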
Authentication
The app uses On-Behalf-Of (OBO) authentication by default:
- Users authenticate with Databricks OAuth
- All operations run with the user's permissions
- Proper access control and audit logging
📖 OBO details: See app.yaml configuration in the project root
Development
Local Development
# Start dev server with hot reload
./watch.sh
# Access at:
# - Frontend: http://localhost:5173
# - Backend: http://localhost:8000
# - API Docs: http://localhost:8000/docs
Debugging
# Check app status
./app_status.sh
# Stream app logs
uv run python dba_logz.py https://your-app.databricksapps.com --duration 60
# Format code
./fix.sh
Multiple Environments
Deploy separate instances for dev/staging/prod:
./deploy.sh --app-name mcp-dev-registry --create
./deploy.sh --app-name mcp-prod-registry --create
Project Structure
├── server/ # FastAPI backend
│ ├── app.py # Main app + MCP server
│ ├── tools.py # MCP tools implementation
│ └── routers/ # API endpoints
├── client/ # React TypeScript frontend
│ └── src/pages/ # Chat, Registry, Traces pages
├── prompts/ # Agent system prompts
├── setup_table.py # DB table setup script
├── deploy.sh # Deploy to Databricks Apps
├── setup.sh # Interactive setup
└── watch.sh # Local dev server
Troubleshooting
Deployment Issues - App Created But Code Not Working
If ./deploy.sh completes successfully but your app doesn't work properly, follow these steps:
1. Check App Logs (MOST IMPORTANT):
# View live logs
databricks apps logs <your-app-name> --follow
# Or visit in browser (requires OAuth):
# https://your-app.databricksapps.com/logz
2. Verify App Status:
./app_status.sh
# Should show: Status: RUNNING
# If status is FAILED or ERROR, check logs above
3. Common Causes & Fixes:
| Issue | Check | Fix |
|---|---|---|
| Frontend build failed | `cd client && npm run build` | Fix TypeScript errors, ensure `client/node_modules` exists |
| Missing Python dependencies | `cat requirements.txt` | Run `uv run python scripts/generate_semver_requirements.py` |
| app.yaml misconfigured | `cat app.yaml` | Verify command and scopes are correct |
| Code not uploaded | `databricks workspace ls /Workspace/Users/your.email@company.com/` | Check if source path exists, redeploy with `--verbose` |
| App won't start | Check app logs | Look for Python import errors, missing env vars, port conflicts |
4. Redeploy with Verbose Output:
./deploy.sh --verbose
# Shows detailed build and deployment steps
5. Manual Verification:
# Check app exists and get details
databricks apps get <your-app-name>
# Verify service principal was created
databricks apps get <your-app-name> --output json | grep service_principal_id
# Try restarting
databricks apps restart <your-app-name>
# Last resort: Delete and recreate
databricks apps delete <your-app-name>
./deploy.sh --create
Authentication failures:
- Run `databricks current-user me` to verify CLI auth
- Check `.env.local` has the correct `DATABRICKS_HOST`
Table not found:
- Run `setup_table.py` or manually create via SQL Editor
Secret scope errors:
# Verify scopes exist:
databricks secrets list-scopes | grep mcp_
# Verify service principal has access:
databricks secrets get-acl mcp_api_keys <service-principal-id>
# Check what secrets exist:
databricks secrets list-secrets mcp_api_keys
App not accessible:
- Check deployment: `./app_status.sh`
- View logs: `https://your-app.databricksapps.com/logz`
API calls failing after registration:
- Verify secret exists: `databricks secrets list-secrets mcp_api_keys`
- Check app logs for connection creation errors
- For API key auth: Ensure the key is in the `mcp_api_keys` scope
- For bearer token auth: Ensure the token is in the `mcp_bearer_tokens` scope
📖 Detailed troubleshooting:
- WORKSPACE_REQUIREMENTS.md - Workspace setup issues
- SECRETS_WORKAROUND.md - Secret scope issues
Key Features
MCP Tools Available
The app exposes these tools via its MCP server:
- `smart_register_api` - One-step API registration with auto-discovery
- `register_api_in_registry` - Manual API registration with full control
- `check_api_http_registry` - List and search registered APIs
- `discover_endpoints_from_docs` - Extract endpoints from documentation URLs
- `test_api_endpoint` - Validate endpoints before registration
- `execute_dbsql` - Run SQL queries against warehouses
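As one way to exercise these tools outside the chat UI, here is a hedged sketch using the official MCP Python SDK over streamable HTTP; the `/mcp` mount path and the PAT-based Authorization header are assumptions about how this server is exposed.

```python
# Sketch: list and call this app's MCP tools with the official MCP SDK.
import asyncio
import os

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

APP_URL = "https://your-app.databricksapps.com/mcp"  # assumed mount path
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

async def main():
    async with streamablehttp_client(APP_URL, headers=HEADERS) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
            result = await session.call_tool("check_api_http_registry", arguments={})
            print(result.content)

asyncio.run(main())
```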
AI Agent Capabilities
The chat interface can:
- Parse API documentation to discover endpoints
- Test endpoints automatically
- Register APIs with proper authentication
- Call registered APIs to answer queries
- Combine multiple API calls for complex requests
Documentation
- WORKSPACE_REQUIREMENTS.md - Prerequisites, setup, workspace configuration
- SECRETS_WORKAROUND.md - Secret management, auth types, troubleshooting
- SECURITY.md - Security policies
- LICENSE.md - License information
License
See LICENSE.md
Security
Report vulnerabilities: See SECURITY.md