
Model Context Protocol (MCP) server for Apache Airflow API integration. Provides comprehensive tools for managing Airflow clusters including service operations, configuration management, status monitoring, and request tracking.

Project description

🚀 MCP-Airflow-API: Revolutionary Open Source Tool for Managing Apache Airflow with Natural Language


Have you ever wished you could manage your Apache Airflow workflows using natural language instead of complex REST API calls or web-interface clicks? MCP-Airflow-API is an open-source project that makes exactly that possible.



🎯 What is MCP-Airflow-API?

MCP-Airflow-API is a Model Context Protocol (MCP) server that exposes Apache Airflow REST API operations as natural-language tools. It hides the complexity of the API surface and lets you manage Airflow clusters through plain-language commands.

Traditional approach (example):

curl -X GET "http://localhost:8080/api/v1/dags?limit=100&offset=0" \
  -H "Authorization: Basic YWlyZmxvdzphaXJmbG93"

MCP-Airflow-API approach (natural language):

"Show me the currently running DAGs"


QuickStart: Get started in 5 minutes

git clone https://github.com/call518/MCP-Airflow-API.git
cd MCP-Airflow-API

# Copy environment template
cp .env.example .env

# Edit .env with your Airflow settings
vim .env

# Start services
docker-compose up -d

# Access services
# Open WebUI: http://localhost:3002
# MCP Proxy API: http://localhost:8002/docs
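For reference, a minimal .env might look like the following (the values are placeholders; the variable names match the Docker Compose and MCP configuration examples further down this page):

```shell
# Airflow REST API endpoint and credentials (placeholders; match your cluster)
AIRFLOW_API_URL=http://your-airflow:8080/api/v1
AIRFLOW_API_USERNAME=airflow
AIRFLOW_API_PASSWORD=your-password
```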

🌟 Key Features

  1. Natural Language Queries
    No need to learn complex API syntax. Just ask as you would naturally speak:

    • "What DAGs are currently running?"
    • "Show me the failed tasks"
    • "Find DAGs containing ETL"
  2. Comprehensive Monitoring Capabilities
    Real-time cluster status monitoring:

    • Cluster health monitoring
    • DAG status and performance analysis
    • Task execution log tracking
    • XCom data management
  3. 43 Powerful MCP Tools
    Covers almost all Airflow API functionality:

    • DAG management (trigger, pause, resume)
    • Task instance monitoring
    • Pool and variable management
    • Connection configuration
    • Configuration queries
    • Event log analysis
  4. Large Environment Optimization
    Efficiently handles large environments with 1000+ DAGs:

    • Smart pagination support
    • Advanced filtering options
    • Batch processing capabilities
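The pagination pattern behind these features can be sketched as a small generator. Here fetch_page is a hypothetical stand-in for any paginated tool call such as list_dags(limit=..., offset=...):

```python
def iter_all_dags(fetch_page, page_size=100):
    """Walk every DAG by paging with limit/offset until a short page appears.

    fetch_page(limit, offset) stands in for a paginated API call that
    returns a list of at most `limit` items starting at `offset`.
    """
    offset = 0
    while True:
        page = fetch_page(limit=page_size, offset=offset)
        yield from page
        if len(page) < page_size:
            return
        offset += page_size

# Demo with a fake backend holding 250 "DAGs":
dags = [f"dag_{i}" for i in range(250)]
fake_fetch = lambda limit, offset: dags[offset:offset + limit]
print(len(list(iter_all_dags(fake_fetch))))  # 250
```

Batching requests this way keeps each response small while still covering a deployment with thousands of DAGs.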

🛠️ Technical Advantages

  • Leveraging Model Context Protocol (MCP)
    MCP is an open standard for secure connections between AI applications and data sources, providing:

    • Standardized interface
    • Secure data access
    • Scalable architecture
  • Support for Two Connection Modes

    • stdio mode: Traditional approach for local environments
    • streamable-http mode: Docker-based remote deployment
  • Complete Docker Support
    Full Docker Compose setup with 3 separate services:

    • Open WebUI: Web interface (port 3002)
    • MCP Server: Airflow API tools (port 8080)
    • MCPO Proxy: REST API endpoint provider (port 8002)

🚀 Real Usage Examples

DAG Management

# List all currently running DAGs
list_dags(limit=50, is_active=True)

# Search for DAGs containing specific keywords
list_dags(id_contains="etl", name_contains="daily")

# Trigger DAG immediately
trigger_dag("my_etl_pipeline")

Task Monitoring

# Query failed task instances
list_task_instances_all(state="failed", limit=20)

# Check logs for specific task
get_task_instance_logs(
    dag_id="my_dag", 
    dag_run_id="run_123", 
    task_id="extract_data"
)

Performance Analysis

# DAG execution time statistics
dag_run_duration("my_etl_pipeline", limit=50)

# Task-level performance analysis
dag_task_duration("my_etl_pipeline", "latest_run")

📊 Real-World Use Cases

Capacity Management for Operations Teams



🔧 Easy Installation and Setup

Simple Installation via PyPI

uvx --python 3.11 mcp-airflow-api

One-Click Deployment with Docker Compose (example)

version: '3.8'
services:
  mcp-server:
    build: 
      context: .
      dockerfile: Dockerfile.MCP-Server
    environment:
      - FASTMCP_PORT=8080
      - AIRFLOW_API_URL=http://your-airflow:8080/api/v1
      - AIRFLOW_API_USERNAME=airflow
      - AIRFLOW_API_PASSWORD=your-password

MCP Configuration File (example)

{
  "mcpServers": {
    "airflow-api": {
      "command": "uvx",
      "args": ["--python", "3.11", "mcp-airflow-api"],
      "env": {
        "AIRFLOW_API_URL": "http://localhost:8080/api/v1",
        "AIRFLOW_API_USERNAME": "airflow",
        "AIRFLOW_API_PASSWORD": "airflow"
      }
    }
  }
}

🌈 Future-Ready Architecture

  • Scalable design and modular structure for easy addition of new features
  • Standards-compliant protocol for integration with other tools
  • Cloud-native operations and LLM-ready interface
  • Context-aware query processing and automated workflow management capabilities

🎯 Who Is This Tool For?

  • Data Engineers — Reduce debugging time, improve productivity, minimize learning curve
  • DevOps Engineers — Automate infrastructure monitoring, reduce incident response time
  • System Administrators — User-friendly management without complex APIs, real-time cluster status monitoring

🚀 Open Source Contribution and Community

Repository: https://github.com/call518/MCP-Airflow-API

How to Contribute

  • Bug reports and feature suggestions
  • Documentation improvements
  • Code contributions

Please consider starring the project if you find it useful.


🔮 Conclusion

MCP-Airflow-API shifts workflow management from API calls to conversation:
no need to memorize REST endpoints; just ask in natural language:

"Show me the status of currently running ETL jobs."


🏷️ Tags

#Apache-Airflow #MCP #ModelContextProtocol #DataEngineering #DevOps #WorkflowAutomation #NaturalLanguage #OpenSource #Python #Docker #AI-Integration


Contributing

🤝 Got ideas? Found bugs? Want to add cool features?

We're always excited to welcome new contributors! Whether you're fixing a typo, adding a new monitoring tool, or improving documentation - every contribution makes this project better.

Ways to contribute:

  • 🐛 Report issues or bugs
  • 💡 Suggest new Airflow monitoring features
  • 📝 Improve documentation
  • 🚀 Submit pull requests
  • ⭐ Star the repo if you find it useful!

Pro tip: The codebase is designed to be super friendly for adding new tools. Check out the existing @mcp.tool() functions in airflow_api.py.


License

Freely use, modify, and distribute under the MIT License.

Project details



Download files

Download the file for your platform.

Source Distribution

mcp_airflow_api-2.0.0.tar.gz (29.4 kB)

Built Distribution

mcp_airflow_api-2.0.0-py3-none-any.whl (32.2 kB)

File details

Details for the file mcp_airflow_api-2.0.0.tar.gz.

File metadata

  • Size: 29.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for mcp_airflow_api-2.0.0.tar.gz

  • SHA256: 643130fc58a079242dddf59253263166d2aeb6e9e33bb6de96796ab588fae39d
  • MD5: 21091623e13044b2ad34e4057c677eca
  • BLAKE2b-256: 2b440bfbb5a676016dc4bdb68600f2c0496ca0d5f08fde83575775ab8715cb10


Provenance

The following attestation bundles were made for mcp_airflow_api-2.0.0.tar.gz:

Publisher: pypi-publish.yml on call518/MCP-Airflow-API

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mcp_airflow_api-2.0.0-py3-none-any.whl.

File hashes

Hashes for mcp_airflow_api-2.0.0-py3-none-any.whl

  • SHA256: 3f4cb5c0ddc35edb16fe9c44da7bbfad3b10e6a6e80f4bf3721ee2a747adcfc9
  • MD5: 00b325264b4f5010b95fb25b12faaa6e
  • BLAKE2b-256: 7a7b36a4d7366e5a6192943a666da383aef21f53edb147371c8271c564f3d583


Provenance

The following attestation bundles were made for mcp_airflow_api-2.0.0-py3-none-any.whl:

Publisher: pypi-publish.yml on call518/MCP-Airflow-API

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
