ms-fabric-mcp-server
A Model Context Protocol (MCP) server for Microsoft Fabric. Exposes Fabric operations (workspaces, notebooks, SQL, Livy, pipelines, jobs) as MCP tools that AI agents can invoke.
⚠️ Warning: This package is intended for development environments only and should not be used in production. It includes tools that can perform destructive operations (e.g., delete_notebook, delete_item) and execute arbitrary code via Livy Spark sessions. Always review AI-generated tool calls before execution.
Quick Start
The fastest way to use this MCP server is with uvx:
uvx ms-fabric-mcp-server
Installation
# Using uv (recommended)
uv pip install ms-fabric-mcp-server
# Using pip
pip install ms-fabric-mcp-server
# With SQL support (requires pyodbc)
pip install ms-fabric-mcp-server[sql]
# With OpenTelemetry tracing
pip install ms-fabric-mcp-server[sql,telemetry]
Authentication
Uses DefaultAzureCredential from azure-identity - no explicit credential configuration needed. This automatically tries multiple authentication methods:
- Environment credentials (AZURE_CLIENT_ID, AZURE_TENANT_ID, AZURE_CLIENT_SECRET)
- Managed Identity (when running on Azure)
- Azure CLI credentials (az login)
- VS Code credentials
- Azure PowerShell credentials
No Fabric-specific auth environment variables are needed - it just works if you're authenticated via any of the above methods.
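Before wiring the server into an agent, you can confirm that DefaultAzureCredential resolves on your machine by requesting a Fabric token directly. This is a minimal standalone sketch using azure-identity (the scope matches the FABRIC_SCOPES default listed under Configuration); it is not part of the package:

```python
# check_auth.py - verify DefaultAzureCredential can obtain a Fabric token.
# Minimal sketch for local troubleshooting; not part of ms-fabric-mcp-server.
from azure.identity import DefaultAzureCredential

# Same scope as the FABRIC_SCOPES default used by the server.
FABRIC_SCOPE = "https://api.fabric.microsoft.com/.default"

credential = DefaultAzureCredential()
token = credential.get_token(FABRIC_SCOPE)
print(f"Token acquired; expires at (epoch seconds): {token.expires_on}")
```

If this fails, run az login (or configure one of the other credential sources listed above) and retry before starting the MCP server.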
Usage
VS Code Integration
Add to your VS Code MCP settings (.vscode/mcp.json or User settings):
{
"servers": {
"MS Fabric MCP Server": {
"type": "stdio",
"command": "uvx",
"args": ["ms-fabric-mcp-server"]
}
}
}
Claude Desktop Integration
Add to your claude_desktop_config.json:
{
"mcpServers": {
"fabric": {
"command": "uvx",
"args": ["ms-fabric-mcp-server"]
}
}
}
Codex Integration
Add to your Codex config.toml:
[mcp_servers.ms_fabric_mcp]
command = "uvx"
args = ["ms-fabric-mcp-server"]
Running Standalone
# Using uvx (no installation needed)
uvx ms-fabric-mcp-server
# Direct execution (if installed)
ms-fabric-mcp-server
# Via Python module
python -m ms_fabric_mcp_server
# With MCP Inspector (development)
npx @modelcontextprotocol/inspector uvx ms-fabric-mcp-server
Logging & Debugging (optional)
MCP stdio servers must keep protocol traffic on stdout, so redirect stderr to capture logs.
Giving the agent read access to the log file is a powerful way to debug failures.
You can also set AZURE_LOG_LEVEL (Azure SDK) and MCP_LOG_LEVEL (server) to control verbosity.
VS Code (Bash):
{
"servers": {
"MS Fabric MCP Server": {
"type": "stdio",
"command": "bash",
"args": [
"-lc",
"LOG_DIR=\"$HOME/mcp_logs\"; LOG_FILE=\"$LOG_DIR/ms-fabric-mcp-$(date +%Y%m%d_%H%M%S).log\"; uvx ms-fabric-mcp-server 2> \"$LOG_FILE\""
],
"env": {
"AZURE_LOG_LEVEL": "info",
"MCP_LOG_LEVEL": "INFO"
}
}
}
}
VS Code (PowerShell):
{
"servers": {
"MS Fabric MCP Server": {
"type": "stdio",
"command": "powershell",
"args": [
"-NoProfile",
"-Command",
"$logDir=\"$env:USERPROFILE\\mcp_logs\"; New-Item -ItemType Directory -Force -Path $logDir | Out-Null; $ts=Get-Date -Format yyyyMMdd_HHmmss; $logFile=\"$logDir\\ms-fabric-mcp-$ts.log\"; uvx ms-fabric-mcp-server 2> $logFile"
],
"env": {
"AZURE_LOG_LEVEL": "info",
"MCP_LOG_LEVEL": "INFO"
}
}
}
}
Programmatic Usage (Library Mode)
from fastmcp import FastMCP
from ms_fabric_mcp_server import register_fabric_tools
# Create your own server
mcp = FastMCP("my-custom-server")
# Register all Fabric tools
register_fabric_tools(mcp)
# Add your own customizations...
mcp.run()
Configuration
Environment variables (all optional with sensible defaults):
| Variable | Default | Description |
|---|---|---|
| FABRIC_BASE_URL | https://api.fabric.microsoft.com/v1 | Fabric API base URL |
| FABRIC_SCOPES | https://api.fabric.microsoft.com/.default | OAuth scopes |
| FABRIC_API_CALL_TIMEOUT | 30 | API timeout (seconds) |
| FABRIC_MAX_RETRIES | 3 | Max retry attempts |
| FABRIC_RETRY_BACKOFF | 2.0 | Backoff factor |
| LIVY_API_CALL_TIMEOUT | 120 | Livy timeout (seconds) |
| LIVY_POLL_INTERVAL | 2.0 | Livy polling interval (seconds) |
| LIVY_STATEMENT_WAIT_TIMEOUT | 10 | Livy statement wait timeout (seconds) |
| LIVY_SESSION_WAIT_TIMEOUT | 240 | Livy session wait timeout (seconds) |
| MCP_SERVER_NAME | ms-fabric-mcp-server | Server name for MCP |
| MCP_LOG_LEVEL | INFO | Logging level |
| AZURE_LOG_LEVEL | info | Azure SDK logging level |
Copy .env.example to .env and customize as needed.
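If you embed the server in your own process (see Programmatic Usage above), the same variables can be set before the Fabric tools are registered. A minimal sketch, assuming the settings in the table are read from the environment at registration time; the chosen values are illustrative:

```python
import os

# Override selected defaults before importing/registering the Fabric tools,
# since the settings are assumed to be read from the environment at that point.
os.environ["FABRIC_API_CALL_TIMEOUT"] = "60"      # allow slower Fabric API calls
os.environ["LIVY_SESSION_WAIT_TIMEOUT"] = "600"   # wait longer for Spark sessions
os.environ["MCP_LOG_LEVEL"] = "DEBUG"

from fastmcp import FastMCP
from ms_fabric_mcp_server import register_fabric_tools

mcp = FastMCP("my-custom-server")
register_fabric_tools(mcp)
mcp.run()
```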
Available Tools
The server provides 35 core tools, with 3 additional SQL tools when installed with [sql] extras (38 total).
| Tool Group | Count | Tools |
|---|---|---|
| Workspace | 1 | list_workspaces |
| Item | 2 | list_items, delete_item |
| Notebook | 6 | import_notebook_to_fabric, get_notebook_content, attach_lakehouse_to_notebook, get_notebook_execution_details, list_notebook_executions, get_notebook_driver_logs |
| Job | 4 | run_on_demand_job, get_job_status, get_job_status_by_url, get_operation_result |
| Livy | 8 | livy_create_session, livy_list_sessions, livy_get_session_status, livy_close_session, livy_run_statement, livy_get_statement_status, livy_cancel_statement, livy_get_session_log |
| Pipeline | 5 | create_blank_pipeline, add_copy_activity_to_pipeline, add_notebook_activity_to_pipeline, add_dataflow_activity_to_pipeline, add_activity_to_pipeline |
| Semantic Model | 7 | create_semantic_model, add_table_to_semantic_model, add_relationship_to_semantic_model, get_semantic_model_details, get_semantic_model_definition, add_measures_to_semantic_model, delete_measures_from_semantic_model |
| Power BI | 2 | refresh_semantic_model, execute_dax_query |
| SQL (optional) | 3 | get_sql_endpoint, execute_sql_query, execute_sql_statement |
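To see how an agent-side client invokes these tools, here is a minimal sketch using the official mcp Python SDK (a separate dependency, installed with pip install mcp) to launch the server over stdio and call list_workspaces. The client code is generic MCP plumbing and is not part of this package:

```python
# Minimal MCP client sketch: start the server over stdio and call one tool.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server the same way the editor integrations above do.
    server = StdioServerParameters(command="uvx", args=["ms-fabric-mcp-server"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])
            # list_workspaces takes no arguments; other tools need parameters.
            result = await session.call_tool("list_workspaces", {})
            print(result)


asyncio.run(main())
```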
SQL Tools (Optional)
SQL tools require pyodbc and the Microsoft ODBC Driver for SQL Server:
# Install with SQL support
pip install ms-fabric-mcp-server[sql]
# On Ubuntu/Debian, install the ODBC driver first:
curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
curl https://packages.microsoft.com/config/ubuntu/$(lsb_release -rs)/prod.list | sudo tee /etc/apt/sources.list.d/mssql-release.list
sudo apt-get update
sudo ACCEPT_EULA=Y apt-get install -y msodbcsql18
If pyodbc is not available, the server starts with 35 tools (SQL tools disabled).
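To check in advance whether the SQL tools will be registered on your machine, you can probe pyodbc and the installed ODBC drivers directly. A small diagnostic sketch, not part of the package:

```python
# Quick diagnostic: will the optional SQL tools be available?
try:
    import pyodbc
except ImportError:
    print("pyodbc not installed - SQL tools will be disabled (35 tools).")
else:
    print("pyodbc installed - SQL tools will be registered (38 tools).")
    drivers = pyodbc.drivers()
    print("Installed ODBC drivers:", drivers)
    if not any("SQL Server" in d for d in drivers):
        print("Warning: no SQL Server ODBC driver found; "
              "install msodbcsql18 before running SQL queries.")
```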
Development
# Clone and install with dev dependencies
git clone https://github.com/your-org/ms-fabric-mcp-server.git
cd ms-fabric-mcp-server
pip install -e ".[dev,sql,telemetry]"
# Run tests
pytest
# Run with coverage
pytest --cov
# Format code
black src tests
isort src tests
# Type checking
mypy src
Integration tests
Integration tests run against live Fabric resources and are opt-in.
To get started locally, copy the example env file:
cp .env.integration.example .env.integration
Required environment variables:
- FABRIC_INTEGRATION_TESTS=1
- FABRIC_TEST_WORKSPACE_NAME
- FABRIC_TEST_LAKEHOUSE_NAME
- FABRIC_TEST_SQL_DATABASE
Optional pipeline copy inputs:
- FABRIC_TEST_SOURCE_CONNECTION_ID
- FABRIC_TEST_SOURCE_TYPE
- FABRIC_TEST_SOURCE_SCHEMA
- FABRIC_TEST_SOURCE_TABLE
- FABRIC_TEST_DEST_CONNECTION_ID
- FABRIC_TEST_DEST_TABLE_NAME (optional override; defaults to the source table name)
Run integration tests:
FABRIC_INTEGRATION_TESTS=1 pytest
Notes:
- SQL tests require pyodbc and a SQL Server ODBC driver.
- Tests may skip when optional dependencies or environment variables are missing.
- These tests use live Fabric resources and may incur costs or side effects.
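The opt-in gate is simply an environment check. If you add your own integration tests, a pattern like the following keeps them skipped unless FABRIC_INTEGRATION_TESTS=1 is set; this is an illustrative sketch, not necessarily how the existing suite is implemented:

```python
import os

import pytest

# Skip the whole module unless integration tests are explicitly enabled.
pytestmark = pytest.mark.skipif(
    os.getenv("FABRIC_INTEGRATION_TESTS") != "1",
    reason="Set FABRIC_INTEGRATION_TESTS=1 to run live Fabric integration tests.",
)


def test_workspace_is_configured():
    # Relies on the required variables from .env.integration.
    assert os.getenv("FABRIC_TEST_WORKSPACE_NAME"), (
        "FABRIC_TEST_WORKSPACE_NAME must be set for integration tests."
    )
```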
License
MIT