A FastMCP server for Airflow integration that can run standalone or as an Airflow 3 plugin
Airflow MCP Server
A Model Context Protocol (MCP) server for Apache Airflow that provides AI assistants with access to Airflow's REST API. Built with FastMCP.
Quickstart
IDEs
Manual configuration
Add to your MCP settings (Cursor: ~/.cursor/mcp.json, VS Code: .vscode/mcp.json):
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
CLI Tools
Claude Code

```shell
claude mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```

Gemini CLI

```shell
gemini mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```

Codex CLI

```shell
codex mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```
Desktop Apps
Claude Desktop
Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
Other MCP Clients
Manual JSON Configuration
Add to your MCP configuration file:
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
Or connect to a running HTTP server: "url": "http://localhost:8000/mcp"
Note: No installation required; `uvx` runs the package directly from PyPI. The `--transport stdio` flag is required because the server defaults to HTTP mode.
Configuration
By default, the server connects to http://localhost:8080 (Airflow default; also used by Astro CLI). Set environment variables for custom Airflow instances:
| Variable | Description |
|---|---|
| `AIRFLOW_API_URL` | Airflow webserver URL |
| `AIRFLOW_USERNAME` | Username (Airflow 3.x uses OAuth2 token exchange) |
| `AIRFLOW_PASSWORD` | Password |
| `AIRFLOW_AUTH_TOKEN` | Bearer token (alternative to username/password) |
| `AIRFLOW_VERIFY_SSL` | Set to `false` to disable SSL certificate verification |
| `AIRFLOW_CA_CERT` | Path to custom CA certificate bundle |
| `AF_READ_ONLY` | Set to `true` to block all write operations |
Example with auth (Claude Code):
```shell
claude mcp add airflow -e AIRFLOW_API_URL=https://your-airflow.example.com -e AIRFLOW_USERNAME=admin -e AIRFLOW_PASSWORD=admin -- uvx astro-airflow-mcp --transport stdio
```
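The `AF_READ_ONLY` toggle from the table above can be pictured as a guard wrapped around every write tool. The sketch below is illustrative, not the server's actual code; the `read_only_guard` decorator name is hypothetical:

```python
import os
from functools import wraps

def read_only_guard(func):
    """Reject the call when AF_READ_ONLY=true is set (hypothetical helper)."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        if os.environ.get("AF_READ_ONLY", "").lower() == "true":
            raise PermissionError(f"{func.__name__} blocked: server is read-only")
        return func(*args, **kwargs)
    return wrapper

@read_only_guard
def trigger_dag(dag_id: str) -> str:
    # A write operation: the real server would POST /dags/{dag_id}/dagRuns here
    return f"triggered {dag_id}"
```

Read tools would stay unwrapped, so listing and inspecting DAGs keeps working while triggers, clears, and deletes are refused.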
Features
- Airflow 2.x and 3.x Support: Automatic version detection with adapter pattern
- MCP Tools for accessing Airflow data:
- DAG management (list, get details, get source code, stats, warnings, import errors, trigger, pause/unpause)
- DAG run management (list, get, trigger, trigger and wait, delete, clear)
- Task management (list, get details, get task instances, get logs, clear task instances)
- Pool management (list, get details)
- Variable management (list, get specific variables)
- Connection management (list connections with credentials excluded)
- Asset/Dataset management (unified naming across versions, data lineage)
- Plugin and provider information
- Configuration and version details
- Consolidated Tools for agent workflows:
- `explore_dag`: Get comprehensive DAG information in one call
- `diagnose_dag_run`: Debug failed DAG runs with task instance details
- `get_system_health`: System overview with health, errors, and warnings
- MCP Resources: Static Airflow info exposed as resources (version, providers, plugins, config)
- MCP Prompts: Guided workflows for common tasks (troubleshooting, health checks, onboarding)
- Dual deployment modes:
- Standalone server: Run as an independent MCP server
- Airflow plugin: Integrate directly into Airflow 3.x webserver
- Flexible Authentication:
- Bearer token (Airflow 2.x and 3.x)
- Username/password with automatic OAuth2 token exchange (Airflow 3.x)
- Basic auth (Airflow 2.x)
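The precedence among these options can be sketched as a small resolver: an explicit bearer token wins when present, otherwise username/password are used (exchanged for an OAuth2 token on 3.x, sent as basic auth on 2.x). The `resolve_auth` helper below is an illustrative sketch, not the server's actual code:

```python
def resolve_auth(env: dict) -> tuple[str, dict]:
    """Pick an auth strategy from AIRFLOW_* settings (illustrative)."""
    if env.get("AIRFLOW_AUTH_TOKEN"):
        # A bearer token works on both Airflow 2.x and 3.x
        return "bearer", {"Authorization": f"Bearer {env['AIRFLOW_AUTH_TOKEN']}"}
    if env.get("AIRFLOW_USERNAME") and env.get("AIRFLOW_PASSWORD"):
        # 3.x exchanges these for an OAuth2 token; 2.x sends them as basic auth
        return "password", {"username": env["AIRFLOW_USERNAME"],
                            "password": env["AIRFLOW_PASSWORD"]}
    return "none", {}
```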
Available Tools
Consolidated Tools (Agent-Optimized)
| Tool | Description |
|---|---|
| `explore_dag` | Get comprehensive DAG info: metadata, tasks, recent runs, source code |
| `diagnose_dag_run` | Debug a DAG run: run details, failed task instances, logs |
| `get_system_health` | System overview: health status, import errors, warnings, DAG stats |
Core Tools
| Tool | Description |
|---|---|
| `list_dags` | Get all DAGs and their metadata |
| `get_dag_details` | Get detailed info about a specific DAG |
| `get_dag_source` | Get the source code of a DAG |
| `get_dag_stats` | Get DAG run statistics (Airflow 3.x only) |
| `list_dag_warnings` | Get DAG import warnings |
| `list_import_errors` | Get import errors from DAG files that failed to parse |
| `list_dag_runs` | Get DAG run history |
| `get_dag_run` | Get specific DAG run details |
| `trigger_dag` | Trigger a new DAG run (start a workflow execution) |
| `trigger_dag_and_wait` | Trigger a DAG run and wait for completion |
| `delete_dag_run` | Permanently delete a specific DAG run |
| `clear_dag_run` | Clear a DAG run to allow re-execution of all its tasks |
| `pause_dag` | Pause a DAG to prevent new scheduled runs |
| `unpause_dag` | Unpause a DAG to resume scheduled runs |
| `list_tasks` | Get all tasks in a DAG |
| `get_task` | Get details about a specific task |
| `get_task_instance` | Get task instance execution details |
| `get_task_logs` | Get logs for a specific task instance execution |
| `clear_task_instances` | Clear task instances to allow re-execution |
| `list_pools` | Get all resource pools |
| `get_pool` | Get details about a specific pool |
| `list_variables` | Get all Airflow variables |
| `get_variable` | Get a specific variable by key |
| `list_connections` | Get all connections (credentials excluded for security) |
| `list_assets` | Get assets/datasets (unified naming across versions) |
| `list_asset_events` | Get asset/dataset events |
| `get_upstream_asset_events` | Get asset events that triggered a specific DAG run |
| `list_plugins` | Get installed Airflow plugins |
| `list_providers` | Get installed provider packages |
| `get_airflow_config` | Get Airflow configuration |
| `get_airflow_version` | Get Airflow version information |
MCP Resources
| Resource URI | Description |
|---|---|
| `airflow://version` | Airflow version information |
| `airflow://providers` | Installed provider packages |
| `airflow://plugins` | Installed Airflow plugins |
| `airflow://config` | Airflow configuration |
MCP Prompts
| Prompt | Description |
|---|---|
| `troubleshoot_failed_dag` | Guided workflow for diagnosing DAG failures |
| `daily_health_check` | Morning health check routine |
| `onboard_new_dag` | Guide for understanding a new DAG |
Airflow CLI Tool
This package also includes af, a command-line tool for interacting with Airflow instances directly from your terminal.
Installation
```shell
# Install with uv
uv tool install astro-airflow-mcp

# Or use uvx to run without installing
uvx --from astro-airflow-mcp@latest af --help
```
Quick Reference
```shell
# System health check
af health

# DAG operations
af dags list
af dags get <dag_id>
af dags explore <dag_id>         # Full investigation (metadata + tasks + source)
af dags source <dag_id>
af dags stats                    # DAG run statistics by state
af dags pause <dag_id>
af dags unpause <dag_id>
af dags errors                   # Import errors
af dags warnings

# Run operations
af runs list --dag-id <dag_id>
af runs get <dag_id> <run_id>
af runs trigger <dag_id>
af runs trigger-wait <dag_id>    # Trigger and wait for completion
af runs delete <dag_id> <run_id> # Permanently delete a run
af runs clear <dag_id> <run_id>  # Clear a run for re-execution
af runs diagnose <dag_id> <run_id>

# Task operations
af tasks list <dag_id>
af tasks get <dag_id> <task_id>
af tasks instance <dag_id> <run_id> <task_id>  # Task execution details
af tasks logs <dag_id> <run_id> <task_id>
af tasks clear <dag_id> <run_id> <task_ids>    # Clear task instances

# Asset operations
af assets list                   # List assets/datasets
af assets events                 # List asset events

# Config operations
af config show                   # Full Airflow configuration
af config version
af config connections
af config variables
af config variable <key>         # Get specific variable
af config pools
af config pool <name>            # Get specific pool
af config plugins                # List installed plugins
af config providers              # List installed providers

# Direct API access (any endpoint)
af api ls                        # List all available endpoints
af api ls --filter variable      # Filter endpoints by pattern
af api dags                      # GET /api/v{1,2}/dags
af api dags -F limit=10          # With query parameters
af api variables -X POST -F key=x -f value=y   # Create variable
af api variables/x -X DELETE     # Delete variable
```
Instance Management
Manage multiple Airflow instances with persistent configuration:
```shell
# Add instances (auth is optional for open instances)
af instance add local --url http://localhost:8080
af instance add staging --url https://staging.example.com --username admin --password secret
af instance add prod --url https://prod.example.com --token '${AIRFLOW_PROD_TOKEN}'

# SSL options for self-signed or corporate CA certificates
af instance add corp --url https://airflow.corp.example.com --no-verify-ssl --username admin --password secret
af instance add corp --url https://airflow.corp.example.com --ca-cert /path/to/ca-bundle.pem --token '${TOKEN}'

# List and switch instances
af instance list                 # Shows all instances in a table
af instance use prod             # Switch to prod instance
af instance current              # Show current instance
af instance delete old-instance
af instance reset                # Reset to default configuration
```
Instance Discovery
Auto-discover Airflow instances from Astro Cloud or local Docker environments:
```shell
# Preview discoverable instances (safe, read-only)
af instance discover --dry-run

# Discover from all backends (Astro Cloud + local)
af instance discover

# Discover Astro deployments only
af instance discover astro

# Include all accessible workspaces
af instance discover astro --all-workspaces

# Discover local Airflow instances (scans common ports)
af instance discover local

# Deep scan all ports for local instances
af instance discover local --scan
```
Note: Always run with `--dry-run` first. The Astro discovery backend creates API tokens in Astro Cloud, so review the list before confirming.
Config file location: ~/.af/config.yaml (override with --config or AF_CONFIG env var)
Direct API Access
The af api command provides direct access to any Airflow REST API endpoint, similar to gh api for GitHub:
```shell
# Discover available endpoints
af api ls
af api ls --filter variable

# GET requests (default)
af api dags
af api dags -F limit=10 -F only_active=true
af api dags/my_dag

# POST/PATCH/DELETE requests
af api variables -X POST -F key=my_var -f value="my value"
af api dags/my_dag -X PATCH -F is_paused=false
af api variables/old_var -X DELETE

# With JSON body
af api connections -X POST --body '{"connection_id": "x", "conn_type": "postgres"}'

# Include response headers
af api dags -i

# Access non-versioned endpoints
af api health --raw

# Get full OpenAPI spec
af api spec
```
Field syntax:
- `-F key=value`: Auto-converts types (numbers, booleans, null)
- `-f key=value`: Keeps value as raw string
- `--body '{}'`: Raw JSON body for complex objects
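The `-F` auto-conversion can be approximated in a few lines of Python. The `coerce` helper below is a hypothetical reimplementation for illustration, not the CLI's actual code:

```python
def coerce(value: str):
    """Mimic `-F` type auto-conversion: numbers, booleans, and null (illustrative)."""
    lowered = value.lower()
    if lowered == "null":
        return None
    if lowered in ("true", "false"):
        return lowered == "true"
    try:
        return int(value)
    except ValueError:
        pass
    try:
        return float(value)
    except ValueError:
        return value  # `-f` would skip all of the above and keep the raw string

print(coerce("10"), coerce("true"), coerce("null"), coerce("my value"))
# → 10 True None my value
```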
Instance definitions are stored in the config file; an example `~/.af/config.yaml`:

```yaml
instances:
  - name: local
    url: http://localhost:8080
    auth: null
  - name: staging
    url: https://staging.example.com
    auth:
      username: admin
      password: secret
  - name: prod
    url: https://prod.example.com
    auth:
      token: ${AIRFLOW_PROD_TOKEN}  # Environment variable interpolation
  - name: corporate
    url: https://airflow.corp.example.com
    auth:
      username: admin
      password: secret
    verify-ssl: false               # Disable SSL verification (self-signed certs)
    # ca-cert: /path/to/ca.pem      # Or provide a custom CA bundle
current-instance: local
```
Configuration
Configure connections via environment variables:
```shell
# Environment variables
export AIRFLOW_API_URL=http://localhost:8080
export AIRFLOW_USERNAME=admin
export AIRFLOW_PASSWORD=admin

# Or inline for one-off commands
AIRFLOW_API_URL=http://localhost:5500 af dags list
```
All commands output JSON (except instance commands which use human-readable tables), making them easy to use with tools like jq:
```shell
# Find failed runs
af runs list | jq '.dag_runs[] | select(.state == "failed")'

# Get DAG IDs only
af dags list | jq '.dags[].dag_id'
```
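The same filtering works from Python, since the output is plain JSON. The sketch below parses a captured payload instead of invoking `af`; the payload shape is assumed to mirror the Airflow REST API's `dag_runs` response:

```python
import json

# Stand-in for `af runs list` output (shape assumed, values hypothetical)
payload = json.loads("""
{"dag_runs": [
  {"dag_run_id": "run_1", "state": "success"},
  {"dag_run_id": "run_2", "state": "failed"}
]}
""")

# Equivalent of: jq '.dag_runs[] | select(.state == "failed")'
failed = [r["dag_run_id"] for r in payload["dag_runs"] if r["state"] == "failed"]
print(failed)  # → ['run_2']
```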
Advanced Usage
Running as Standalone Server
For HTTP-based integrations or connecting multiple clients to one server:
```shell
# Run server (HTTP mode is default)
# Configure via environment variables
AIRFLOW_API_URL=https://my-airflow.example.com AIRFLOW_USERNAME=admin AIRFLOW_PASSWORD=admin uvx astro-airflow-mcp
```
Connect MCP clients to: http://localhost:8000/mcp
Airflow Plugin Mode
Install into your Airflow 3.x environment to expose MCP at http://your-airflow:8080/mcp/v1:
```shell
# Add to your Airflow project (Astro Runtime or open-source Airflow 3.x)
echo astro-airflow-mcp >> requirements.txt
```
CLI Options
MCP Server Options:
| Flag | Environment Variable | Default | Description |
|---|---|---|---|
| `--transport` | `MCP_TRANSPORT` | `stdio` | Transport mode (`stdio` or `http`) |
| `--host` | `MCP_HOST` | `localhost` | Host to bind to (HTTP mode only) |
| `--port` | `MCP_PORT` | `8000` | Port to bind to (HTTP mode only) |
| `--airflow-project-dir` | `AIRFLOW_PROJECT_DIR` | `$PWD` | Astro project directory for auto-discovering Airflow URL |
| `--no-verify-ssl` | `AIRFLOW_VERIFY_SSL=false` | off | Disable SSL certificate verification |
| `--ca-cert` | `AIRFLOW_CA_CERT` | None | Path to custom CA certificate bundle |
Airflow Connection (Environment Variables):
| Variable | Default | Description |
|---|---|---|
| `AIRFLOW_API_URL` | `http://localhost:8080` | Airflow webserver URL |
| `AIRFLOW_AUTH_TOKEN` | None | Bearer token for authentication |
| `AIRFLOW_USERNAME` | None | Username for authentication |
| `AIRFLOW_PASSWORD` | None | Password for authentication |
| `AIRFLOW_VERIFY_SSL` | `true` | Set to `false` to disable SSL verification |
| `AIRFLOW_CA_CERT` | None | Path to custom CA certificate bundle |
af CLI Options:
| Flag | Environment Variable | Description |
|---|---|---|
| `--config, -c` | `AF_CONFIG` | Path to config file (default: `~/.af/config.yaml`) |
| `--version, -v` | | Show version and exit |
Telemetry
The af CLI collects anonymous usage telemetry to help improve the tool. Only the command name is collected (e.g., dags list), never the arguments or their values. No personally identifiable information is collected.
To opt out:
```shell
af telemetry disable
```
You can also disable telemetry by setting the AF_TELEMETRY_DISABLED=1 environment variable.
Architecture
The server is built using FastMCP with an adapter pattern for Airflow version compatibility:
Core Components
- Adapters (`adapters/`): Version-specific API implementations
  - `AirflowAdapter` (base): Abstract interface for all Airflow API operations
  - `AirflowV2Adapter`: Airflow 2.x API (`/api/v1`) with basic auth
  - `AirflowV3Adapter`: Airflow 3.x API (`/api/v2`) with OAuth2 token exchange
- Version Detection: Automatic detection at startup by probing API endpoints
- Models (`models.py`): Pydantic models for type-safe API responses
Version Handling Strategy
- Major versions (2.x vs 3.x): Adapter pattern with runtime version detection
- Minor versions (3.1 vs 3.2): Runtime feature detection with graceful fallbacks
- New API parameters: Pass-through `**kwargs` for forward compatibility
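Putting these strategies together, a stripped-down sketch of the adapter pattern (class names follow the components above, but the method bodies and `detect_adapter` helper are simplified stand-ins for the real implementation):

```python
class AirflowAdapter:
    """Abstract base: one method per API operation (simplified)."""
    api_prefix = ""

    def list_dags(self, **kwargs) -> dict:
        # **kwargs pass straight through to the HTTP layer, so new query
        # parameters work without changing the adapter's signature.
        return {"endpoint": f"{self.api_prefix}/dags", "params": kwargs}

class AirflowV2Adapter(AirflowAdapter):
    api_prefix = "/api/v1"   # Airflow 2.x, basic auth

class AirflowV3Adapter(AirflowAdapter):
    api_prefix = "/api/v2"   # Airflow 3.x, OAuth2 token exchange

def detect_adapter(major_version: int) -> AirflowAdapter:
    """Pick an adapter at startup; the real server probes API endpoints instead."""
    return AirflowV3Adapter() if major_version >= 3 else AirflowV2Adapter()

print(detect_adapter(3).list_dags(limit=10)["endpoint"])  # → /api/v2/dags
```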
Deployment Modes
- Standalone: Independent ASGI application with HTTP/SSE transport
- Plugin: Mounted into Airflow 3.x FastAPI webserver
Development
```shell
# Setup development environment
make install-dev

# Run tests
make test

# Run all checks
make check

# Local testing with Astro CLI
astro dev start                  # Start Airflow
make run                         # Run MCP server (connects to localhost:8080)
```
Contributing
Contributions welcome! Please ensure:
- All tests pass (`make test`)
- Code passes linting (`make check`)
- prek hooks pass (`make prek`)