Model Context Protocol (MCP) server for Apache Airflow API integration. Provides comprehensive tools for managing Airflow clusters including service operations, configuration management, status monitoring, and request tracking.
Project description
🚀 MCP-Airflow-API: Revolutionary Open Source Tool for Managing Apache Airflow with Natural Language
Have you ever wondered how much easier it would be if you could manage your Apache Airflow workflows using natural language instead of complex REST API calls or web-interface clicks? MCP-Airflow-API is the open-source project that makes this a reality.
🎯 What is MCP-Airflow-API?
MCP-Airflow-API is an MCP server that leverages the Model Context Protocol (MCP) to transform Apache Airflow REST API operations into natural language tools. This project hides the complexity of API structures and enables intuitive management of Airflow clusters through natural language commands.
Traditional approach (example):
curl -X GET "http://localhost:8080/api/v1/dags?limit=100&offset=0" \
-H "Authorization: Basic YWlyZmxvdzphaXJmbG93"
MCP-Airflow-API approach (natural language):
"Show me the currently running DAGs"
⭐ QuickStart: Get started in 5 minutes
1. Environment Setup
git clone https://github.com/call518/MCP-Airflow-API.git
cd MCP-Airflow-API
# Check and modify the .env file
cp .env.example .env

# Airflow API configuration (in .env)
AIRFLOW_API_URL=http://host.docker.internal:38080/api/v1
AIRFLOW_API_USERNAME=airflow
AIRFLOW_API_PASSWORD=changeme!@34
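As a side note, the Basic auth token shown in the earlier curl example is simply the base64 encoding of `username:password` built from these variables. A minimal sketch of that mapping (the function name is illustrative, not part of the project):

```python
import base64
import os

def airflow_auth_header(username: str, password: str) -> dict:
    """Build the HTTP Basic auth header that the Airflow REST API expects."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

# Read the same variables the .env file defines (defaults for illustration only).
username = os.environ.get("AIRFLOW_API_USERNAME", "airflow")
password = os.environ.get("AIRFLOW_API_PASSWORD", "airflow")
print(airflow_auth_header(username, password))
```

With the default `airflow:airflow` credentials this produces exactly the `Basic YWlyZmxvdzphaXJmbG93` header from the curl example above.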
2. Start Demo Containers
# Start all containers
docker-compose up -d
3. Access OpenWebUI
- Open WebUI runs at http://localhost:3002.
- The list of MCP tools exposed via Swagger can be found at the MCPO API Docs URL, e.g. http://localhost:8002/docs
4. Registering the Tool in OpenWebUI
- Log in to OpenWebUI with an admin account.
- Go to "Settings" → "Tools" from the top menu.
- Enter the airflow-api tool address (e.g., http://localhost:8002/airflow-api) to connect the MCP tools.
- Set up Ollama or OpenAI.
🌟 Key Features
- Natural Language Queries: No need to learn complex API syntax. Just ask as you would naturally speak:
  - "What DAGs are currently running?"
  - "Show me the failed tasks"
  - "Find DAGs containing ETL"
- Comprehensive Monitoring Capabilities: Real-time cluster status monitoring:
  - Cluster health monitoring
  - DAG status and performance analysis
  - Task execution log tracking
  - XCom data management
- 43 Powerful MCP Tools: Covers almost all Airflow API functionality:
  - DAG management (trigger, pause, resume)
  - Task instance monitoring
  - Pool and variable management
  - Connection configuration
  - Configuration queries
  - Event log analysis
- Large Environment Optimization: Efficiently handles large environments with 1000+ DAGs:
  - Smart pagination support
  - Advanced filtering options
  - Batch processing capabilities
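The pagination these tools rely on mirrors the REST API's `limit`/`offset` query parameters. A rough sketch of how paging through a large DAG list can work; the `fetch_page` callable stands in for an HTTP call and is purely illustrative, not the project's actual internals:

```python
from typing import Callable, Iterator

def iter_dags(fetch_page: Callable[[int, int], list], page_size: int = 100) -> Iterator[dict]:
    """Yield DAG records one by one, fetching limit/offset pages lazily."""
    offset = 0
    while True:
        page = fetch_page(page_size, offset)
        yield from page
        if len(page) < page_size:  # a short page means we reached the end
            return
        offset += page_size

# Fake backend with 250 DAGs to demonstrate the paging behavior.
data = [{"dag_id": f"dag_{i}"} for i in range(250)]
fake_fetch = lambda limit, offset: data[offset:offset + limit]
print(sum(1 for _ in iter_dags(fake_fetch)))  # → 250
```

Fetching lazily like this keeps memory bounded even when an environment has thousands of DAGs.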
🛠️ Technical Advantages
- Leveraging the Model Context Protocol (MCP): MCP is an open standard for secure connections between AI applications and data sources, providing:
  - Standardized interface
  - Secure data access
  - Scalable architecture
- Support for Two Connection Modes:
  - stdio mode: traditional approach for local environments
  - streamable-http mode: Docker-based remote deployment
- Complete Docker Support: full Docker Compose setup with 3 separate services:
  - Open WebUI: web interface (port 3002)
  - MCP Server: Airflow API tools (port 8080)
  - MCPO Proxy: REST API endpoint provider (port 8002)
🚀 Real Usage Examples
DAG Management
# List all currently running DAGs
list_dags(limit=50, is_active=True)
# Search for DAGs containing specific keywords
list_dags(id_contains="etl", name_contains="daily")
# Trigger DAG immediately
trigger_dag("my_etl_pipeline")
Task Monitoring
# Query failed task instances
list_task_instances_all(state="failed", limit=20)
# Check logs for specific task
get_task_instance_logs(
dag_id="my_dag",
dag_run_id="run_123",
task_id="extract_data"
)
Performance Analysis
# DAG execution time statistics
dag_run_duration("my_etl_pipeline", limit=50)
# Task-level performance analysis
dag_task_duration("my_etl_pipeline", "latest_run")
📊 Real-World Use Cases
🔧 Easy Installation and Setup
Simple Installation via PyPI
uvx --python 3.11 mcp-airflow-api
One-Click Deployment with Docker Compose (example)
version: '3.8'
services:
  mcp-server:
    build:
      context: .
      dockerfile: Dockerfile.MCP-Server
    environment:
      - FASTMCP_PORT=8080
      - AIRFLOW_API_URL=http://your-airflow:8080/api/v1
      - AIRFLOW_API_USERNAME=airflow
      - AIRFLOW_API_PASSWORD=your-password
MCP Configuration File (example)
{
  "mcpServers": {
    "airflow-api": {
      "command": "uvx",
      "args": ["--python", "3.11", "mcp-airflow-api"],
      "env": {
        "AIRFLOW_API_URL": "http://localhost:8080/api/v1",
        "AIRFLOW_API_USERNAME": "airflow",
        "AIRFLOW_API_PASSWORD": "airflow"
      }
    }
  }
}
🌈 Future-Ready Architecture
- Scalable design and modular structure for easy addition of new features
- Standards-compliant protocol for integration with other tools
- Cloud-native operations and LLM-ready interface
- Context-aware query processing and automated workflow management capabilities
🎯 Who Is This Tool For?
- Data Engineers — Reduce debugging time, improve productivity, minimize learning curve
- DevOps Engineers — Automate infrastructure monitoring, reduce incident response time
- System Administrators — User-friendly management without complex APIs, real-time cluster status monitoring
🚀 Open Source Contribution and Community
Repository: https://github.com/call518/MCP-Airflow-API
How to Contribute
- Bug reports and feature suggestions
- Documentation improvements
- Code contributions
Please consider starring the project if you find it useful.
🔮 Conclusion
MCP-Airflow-API changes the paradigm of data engineering and workflow management:
No need to memorize REST API calls — just ask in natural language:
"Show me the status of currently running ETL jobs."
🏷️ Tags
#Apache-Airflow #MCP #ModelContextProtocol #DataEngineering #DevOps #WorkflowAutomation #NaturalLanguage #OpenSource #Python #Docker #AI-Integration
Contributing
🤝 Got ideas? Found bugs? Want to add cool features?
We're always excited to welcome new contributors! Whether you're fixing a typo, adding a new monitoring tool, or improving documentation - every contribution makes this project better.
Ways to contribute:
- 🐛 Report issues or bugs
- 💡 Suggest new Airflow monitoring features
- 📝 Improve documentation
- 🚀 Submit pull requests
- ⭐ Star the repo if you find it useful!
Pro tip: The codebase is designed to be super friendly for adding new tools. Check out the existing @mcp.tool() functions in airflow_api.py.
License
Freely use, modify, and distribute under the MIT License.
File details
Details for the file mcp_airflow_api-2.0.2.tar.gz.
File metadata
- Download URL: mcp_airflow_api-2.0.2.tar.gz
- Upload date:
- Size: 29.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c1358bb3fd6492b144bb3a2028af675e8bd40ddb835b1f8db39721b538c6560c |
| MD5 | 21c47e3161f50ce2a7b19a0795ae8217 |
| BLAKE2b-256 | 45c9ab1c25cbad80745c265455696185fa45aaea59bd75a8628b013749eb86f5 |
Provenance
The following attestation bundles were made for mcp_airflow_api-2.0.2.tar.gz:
Publisher: pypi-publish.yml on call518/MCP-Airflow-API

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mcp_airflow_api-2.0.2.tar.gz
- Subject digest: c1358bb3fd6492b144bb3a2028af675e8bd40ddb835b1f8db39721b538c6560c
- Sigstore transparency entry: 425020525
- Sigstore integration time:

Source repository:
- Permalink: call518/MCP-Airflow-API@d3177e3b9e3dbf59c9d09ce9824aa5f3e2f75696
- Branch / Tag: refs/tags/2.0.2
- Owner: https://github.com/call518
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi-publish.yml@d3177e3b9e3dbf59c9d09ce9824aa5f3e2f75696
- Trigger Event: push
File details
Details for the file mcp_airflow_api-2.0.2-py3-none-any.whl.
File metadata
- Download URL: mcp_airflow_api-2.0.2-py3-none-any.whl
- Upload date:
- Size: 32.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2d9076988e11a705ca73da2db8e07ea3ae9bee99fde77511d3e76d6b63a1fda9 |
| MD5 | 3c00062813e05d7259b479dbb869371f |
| BLAKE2b-256 | 2d839489150dedea802febde82f018ca3360c28768412b6af101ebd1a39e73d7 |
Provenance
The following attestation bundles were made for mcp_airflow_api-2.0.2-py3-none-any.whl:
Publisher: pypi-publish.yml on call518/MCP-Airflow-API

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mcp_airflow_api-2.0.2-py3-none-any.whl
- Subject digest: 2d9076988e11a705ca73da2db8e07ea3ae9bee99fde77511d3e76d6b63a1fda9
- Sigstore transparency entry: 425020550
- Sigstore integration time:

Source repository:
- Permalink: call518/MCP-Airflow-API@d3177e3b9e3dbf59c9d09ce9824aa5f3e2f75696
- Branch / Tag: refs/tags/2.0.2
- Owner: https://github.com/call518
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi-publish.yml@d3177e3b9e3dbf59c9d09ce9824aa5f3e2f75696
- Trigger Event: push