mcp-server-apache-airflow

A Model Context Protocol (MCP) server implementation for Apache Airflow, enabling seamless integration with MCP clients. This project provides a standardized way to interact with Apache Airflow through the Model Context Protocol.

<a href="https://glama.ai/mcp/servers/e99b6vx9lw"> <img width="380" height="200" src="https://glama.ai/mcp/servers/e99b6vx9lw/badge" alt="Server for Apache Airflow MCP server" /> </a>

About

This project implements a Model Context Protocol server that wraps Apache Airflow's REST API, allowing MCP clients to interact with Airflow in a standardized way. It uses the official Apache Airflow client library to ensure compatibility and maintainability.
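
For example, once installed, an MCP client can spawn the server over stdio and enumerate the Airflow tools it exposes. Below is a minimal sketch using the official mcp Python SDK; the host and credentials are placeholders (see Setup below):

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server over stdio, pointing it at an Airflow instance.
    params = StdioServerParameters(
        command="uvx",
        args=["mcp-server-apache-airflow"],
        env={
            "AIRFLOW_HOST": "http://localhost:8080",   # placeholder
            "AIRFLOW_USERNAME": "airflow",             # placeholder
            "AIRFLOW_PASSWORD": "airflow",             # placeholder
        },
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

asyncio.run(main())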

Feature Implementation Status

Feature / API Path
DAG Management
List DAGs /api/v1/dags
Get DAG Details /api/v1/dags/{dag_id}
Pause DAG /api/v1/dags/{dag_id}
Unpause DAG /api/v1/dags/{dag_id}
Update DAG /api/v1/dags/{dag_id}
Delete DAG /api/v1/dags/{dag_id}
Get DAG Source /api/v1/dagSources/{file_token}
Patch Multiple DAGs /api/v1/dags
Reparse DAG File /api/v1/dagSources/{file_token}/reparse
DAG Runs
List DAG Runs /api/v1/dags/{dag_id}/dagRuns
Create DAG Run /api/v1/dags/{dag_id}/dagRuns
Get DAG Run Details /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}
Update DAG Run /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}
Delete DAG Run /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}
Get DAG Runs Batch /api/v1/dags/~/dagRuns/list
Clear DAG Run /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/clear
Set DAG Run Note /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/setNote
Get Upstream Dataset Events /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/upstreamDatasetEvents
Tasks
List DAG Tasks /api/v1/dags/{dag_id}/tasks
Get Task Details /api/v1/dags/{dag_id}/tasks/{task_id}
Get Task Instance /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}
List Task Instances /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances
Update Task Instance /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}
Get Task Instance Log /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/logs/{task_try_number}
Clear Task Instances /api/v1/dags/{dag_id}/clearTaskInstances
Set Task Instances State /api/v1/dags/{dag_id}/updateTaskInstancesState
List Task Instance Tries /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/tries
Variables
List Variables /api/v1/variables
Create Variable /api/v1/variables
Get Variable /api/v1/variables/{variable_key}
Update Variable /api/v1/variables/{variable_key}
Delete Variable /api/v1/variables/{variable_key}
Connections
List Connections /api/v1/connections
Create Connection /api/v1/connections
Get Connection /api/v1/connections/{connection_id}
Update Connection /api/v1/connections/{connection_id}
Delete Connection /api/v1/connections/{connection_id}
Test Connection /api/v1/connections/test
Pools
List Pools /api/v1/pools
Create Pool /api/v1/pools
Get Pool /api/v1/pools/{pool_name}
Update Pool /api/v1/pools/{pool_name}
Delete Pool /api/v1/pools/{pool_name}
XComs
List XComs /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries
Get XCom Entry /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key}
Datasets
List Datasets /api/v1/datasets
Get Dataset /api/v1/datasets/{uri}
Get Dataset Events /api/v1/datasetEvents
Create Dataset Event /api/v1/datasetEvents
Get DAG Dataset Queued Event /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri}
Get DAG Dataset Queued Events /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents
Delete DAG Dataset Queued Event /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri}
Delete DAG Dataset Queued Events /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents
Get Dataset Queued Events /api/v1/datasets/{uri}/dagRuns/queued/datasetEvents
Delete Dataset Queued Events /api/v1/datasets/{uri}/dagRuns/queued/datasetEvents
Monitoring
Get Health /api/v1/health
DAG Stats
Get DAG Stats /api/v1/dags/statistics
Config
Get Config /api/v1/config
Plugins
Get Plugins /api/v1/plugins
Providers
List Providers /api/v1/providers
Event Logs
List Event Logs /api/v1/eventLogs
Get Event Log /api/v1/eventLogs/{event_log_id}
System
Get Import Errors /api/v1/importErrors
Get Import Error Details /api/v1/importErrors/{import_error_id}
Get Health Status /api/v1/health
Get Version /api/v1/version
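
Each tool maps onto one of the REST endpoints above via the official Apache Airflow client. For orientation, here is what the "Create DAG Run" call looks like against the raw API, using plain requests and placeholder host and credentials:

import requests

AIRFLOW_HOST = "http://localhost:8080"  # placeholder
auth = ("airflow", "airflow")           # placeholder basic-auth credentials

# "Create DAG Run": POST /api/v1/dags/{dag_id}/dagRuns
resp = requests.post(
    f"{AIRFLOW_HOST}/api/v1/dags/example_dag/dagRuns",
    json={"conf": {}},  # optional run configuration
    auth=auth,
)
resp.raise_for_status()
print(resp.json()["dag_run_id"])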

Setup

Dependencies

This project depends on the official Apache Airflow client library (apache-airflow-client). It will be automatically installed when you install this package.

Environment Variables

Set the following environment variables:

AIRFLOW_HOST=<your-airflow-host>        # Optional, defaults to http://localhost:8080
AIRFLOW_API_VERSION=v1                  # Optional, defaults to v1
READ_ONLY=true                          # Optional, enables read-only mode (true/false, defaults to false)
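
For orientation, a minimal sketch of how these variables resolve with the documented defaults (illustrative, not the project's actual code):

import os

airflow_host = os.getenv("AIRFLOW_HOST", "http://localhost:8080")
api_version = os.getenv("AIRFLOW_API_VERSION", "v1")
read_only = os.getenv("READ_ONLY", "false").lower() == "true"

base_url = f"{airflow_host}/api/{api_version}"  # e.g. http://localhost:8080/api/v1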

Authentication

Choose one of the following authentication methods:

Basic Authentication (default):

AIRFLOW_USERNAME=<your-airflow-username>
AIRFLOW_PASSWORD=<your-airflow-password>

JWT Token Authentication:

AIRFLOW_JWT_TOKEN=<your-jwt-token>

To obtain a JWT token, you can use Airflow's authentication endpoint:

ENDPOINT_URL="http://localhost:8080"  # Replace with your Airflow endpoint
curl -X 'POST' \
  "${ENDPOINT_URL}/auth/token" \
  -H 'Content-Type: application/json' \
  -d '{ "username": "<your-username>", "password": "<your-password>" }'

Note: If both a JWT token and basic authentication credentials are provided, the JWT token takes precedence.
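
A Python equivalent of the curl call above, which then uses the returned token as a Bearer credential; note that the access_token response key is an assumption about the endpoint's payload:

import requests

ENDPOINT_URL = "http://localhost:8080"  # replace with your Airflow endpoint

resp = requests.post(
    f"{ENDPOINT_URL}/auth/token",
    json={"username": "<your-username>", "password": "<your-password>"},
)
resp.raise_for_status()
token = resp.json()["access_token"]  # assumed response key

# Use the token as a Bearer credential on subsequent API calls.
headers = {"Authorization": f"Bearer {token}"}
print(requests.get(f"{ENDPOINT_URL}/api/v1/dags", headers=headers).status_code)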

Usage with Claude Desktop

Add to your claude_desktop_config.json:

Basic Authentication:

{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}

JWT Token Authentication:

{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_JWT_TOKEN": "your-jwt-token"
      }
    }
  }
}

For read-only mode (recommended for safety):

Basic Authentication:

{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password",
        "READ_ONLY": "true"
      }
    }
  }
}

JWT Token Authentication:

{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow", "--read-only"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_JWT_TOKEN": "your-jwt-token"
      }
    }
  }
}

Alternative configuration using uv:

Basic Authentication:

{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-server-apache-airflow",
        "run",
        "mcp-server-apache-airflow"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}

JWT Token Authentication:

{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-server-apache-airflow",
        "run",
        "mcp-server-apache-airflow"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_JWT_TOKEN": "your-jwt-token"
      }
    }
  }
}

Replace /path/to/mcp-server-apache-airflow with the actual path where you've cloned the repository.

Selecting the API groups

You can select which API groups to expose by passing the --apis flag, which can be repeated.

uv run mcp-server-apache-airflow --apis dag --apis dagrun

The default is to use all API groups.

Allowed values are (an illustrative sketch of handling this flag follows the list):

  • config
  • connections
  • dag
  • dagrun
  • dagstats
  • dataset
  • eventlog
  • importerror
  • monitoring
  • plugin
  • pool
  • provider
  • taskinstance
  • variable
  • xcom
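
A repeated flag like this is commonly declared with click's multiple=True. The following is a hypothetical sketch of that pattern, not this project's actual CLI code:

import click

# All API groups listed above.
APIS = [
    "config", "connections", "dag", "dagrun", "dagstats", "dataset",
    "eventlog", "importerror", "monitoring", "plugin", "pool",
    "provider", "taskinstance", "variable", "xcom",
]

@click.command()
@click.option("--apis", multiple=True, type=click.Choice(APIS),
              help="API group to expose; pass the flag once per group.")
@click.option("--read-only", is_flag=True, default=False)
def main(apis: tuple[str, ...], read_only: bool) -> None:
    # An empty tuple means no --apis flags were passed: expose everything.
    selected = set(apis) or set(APIS)
    click.echo(f"Exposing API groups: {sorted(selected)} (read_only={read_only})")

if __name__ == "__main__":
    main()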

Read-Only Mode

You can run the server in read-only mode by using the --read-only flag or by setting the READ_ONLY=true environment variable. This will only expose tools that perform read operations (GET requests) and exclude any tools that create, update, or delete resources.

Using the command-line flag:

uv run mcp-server-apache-airflow --read-only

Using the environment variable:

READ_ONLY=true uv run mcp-server-apache-airflow

In read-only mode, the server will only expose tools like:

  • Listing DAGs, DAG runs, tasks, variables, connections, etc.
  • Getting details of specific resources
  • Reading configurations and monitoring information
  • Testing connections (non-destructive)

Write operations, such as creating, updating, or deleting DAGs, variables, and connections, or triggering DAG runs, are not available in read-only mode.

You can combine read-only mode with API group selection:

uv run mcp-server-apache-airflow --read-only --apis dag --apis variable
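
One plausible way such filtering can work is to tag each tool with the HTTP method of the Airflow endpoint it wraps and keep only GET-backed tools in read-only mode. A hypothetical sketch follows; the tool names are illustrative, not this server's actual registrations:

from dataclasses import dataclass

@dataclass
class Tool:
    name: str
    method: str  # HTTP method of the wrapped Airflow REST call

# Hypothetical tool registry; real tool names may differ.
TOOLS = [
    Tool("list_dags", "GET"),
    Tool("trigger_dag_run", "POST"),
    Tool("delete_variable", "DELETE"),
]

def exposed_tools(tools: list[Tool], read_only: bool) -> list[Tool]:
    # In read-only mode, expose only tools that perform GET requests.
    return [t for t in tools if t.method == "GET"] if read_only else tools

print([t.name for t in exposed_tools(TOOLS, read_only=True)])  # ['list_dags']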

Manual Execution

You can also run the server manually:

make run

make run accepts the following options:

  • --port: Port to listen on for SSE (default: 8000)
  • --transport: Transport type (stdio/sse/http, default: stdio)

Alternatively, you can run the SSE server directly, which accepts the same options:

make run-sse

You can also start the service directly with uv:

uv run src --transport http --port 8080
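
When the server runs with a network transport, a client connects over HTTP instead of stdio. A minimal sketch using the mcp Python SDK against an SSE server on the default port 8000; the /sse path is the SDK's conventional endpoint and is an assumption here:

import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Assumes the server was started with the SSE transport on port 8000.
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

asyncio.run(main())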

Installing via Smithery

To install Apache Airflow MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @yangkyeongmo/mcp-server-apache-airflow --client claude

Development

Setting up Development Environment

  1. Clone the repository:
git clone https://github.com/yangkyeongmo/mcp-server-apache-airflow.git
cd mcp-server-apache-airflow
  2. Install development dependencies:
uv sync --dev
  3. Create a .env file for environment variables (optional for development):
touch .env

Note: No environment variables are required for running tests. The AIRFLOW_HOST defaults to http://localhost:8080 for development and testing purposes.

Running Tests

The project uses pytest. Run the test suite with:

# Run all tests
make test

Code Quality

# Run linting
make lint

# Run code formatting
make format

Continuous Integration

The project includes a GitHub Actions workflow (.github/workflows/test.yml) that automatically:

  • Runs tests on Python 3.10, 3.11, and 3.12
  • Executes linting checks using ruff
  • Runs on every push and pull request to the main branch

The CI pipeline ensures code quality and compatibility across supported Python versions before any changes are merged.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

The package is deployed automatically to PyPI when project.version is updated in pyproject.toml. Follow semver for versioning.

Please include a version bump in your PR so that changes to the core logic are published.

License

MIT License
