Check for compatibility between Airflow versions
Apache Airflow Upgrade Check
This package aims to ease the upgrade journey from Apache Airflow 1.10 to 2.0.
While we have put a lot of effort into making this upgrade as painless as possible, with many changes providing an upgrade path (the old code continues to work and prints out a deprecation warning), there were unfortunately some breaking changes for which we could not provide a compatibility shim.
The recommended upgrade path to Airflow 2.0.0 is to first upgrade to the latest release in the 1.10 series (at the time of writing: 1.10.15) and then run this check:
```bash
pip install apache-airflow-upgrade-check
airflow upgrade_check
```
This will then print out a number of action items that you should follow before upgrading to 2.0.0 or above.
The exit code of the command will be 0 (success) if no problems are reported, or 1 otherwise.
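Because the command signals success or failure through its exit code, it can gate an upgrade step in CI. A minimal sketch, assuming you want to wrap the call in Python (the helper name and the ability to substitute the command are ours; in practice you would usually just run `airflow upgrade_check` directly in your pipeline):

```python
import subprocess
import sys


def upgrade_check_passed(cmd=("airflow", "upgrade_check")):
    """Run the upgrade check and report success via its exit code.

    Exit code 0 means no problems were reported; anything else means
    action items remain before upgrading to Airflow 2.0.
    """
    result = subprocess.run(list(cmd))
    return result.returncode == 0


if __name__ == "__main__":
    if not upgrade_check_passed():
        sys.exit("Resolve the reported problems before upgrading to Airflow 2.0.")
```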
For example:
```
============================================= STATUS =============================================
Check for latest versions of apache-airflow and checker.................................SUCCESS
Legacy UI is deprecated by default......................................................SUCCESS
Users must set a kubernetes.pod_template_file value.....................................FAIL
Changes in import paths of hooks, operators, sensors and others.........................FAIL
Remove airflow.AirflowMacroPlugin class.................................................SUCCESS
Check versions of PostgreSQL, MySQL, and SQLite to ease upgrade to Airflow 2.0..........SUCCESS
Fernet is enabled by default............................................................FAIL
Logging configuration has been moved to new section.....................................SUCCESS
Connection.conn_id is not unique........................................................SUCCESS
GCP service account key deprecation.....................................................SUCCESS
Users must delete deprecated configs for KubernetesExecutor.............................FAIL
Changes in import path of remote task handlers..........................................SUCCESS
Chain between DAG and operator not allowed..............................................SUCCESS
SendGrid email uses old airflow.contrib module..........................................SUCCESS
Connection.conn_type is not nullable....................................................SUCCESS

Found 16 problems.
```
======================================== RECOMMENDATIONS =========================================
Users must set a kubernetes.pod_template_file value
---------------------------------------------------
In Airflow 2.0, KubernetesExecutor users need to set a pod_template_file as a base
value for all pods launched by the KubernetesExecutor.
Problems:
1. Please create a pod_template_file by running `airflow generate_pod_template`.
This will generate a pod using your airflow.cfg settings.
...
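For orientation, a pod_template_file is a standard Kubernetes Pod manifest. A minimal hand-written sketch might look like the following; the metadata name and image tag are placeholders, the container name `base` reflects what the KubernetesExecutor conventionally expects, and `airflow generate_pod_template` will produce a template tailored to your airflow.cfg instead:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: airflow-worker-template
spec:
  containers:
    - name: base
      image: apache/airflow:2.0.0
  restartPolicy: Never
```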
Additionally, you can use an "upgrade config" to:
- specify rules you would like to ignore
- extend the check using custom rules
For example:
```bash
airflow upgrade_check --config=/files/upgrade.yaml
```
The configuration file should be a proper YAML file similar to this one:
```yaml
ignored_rules:
  - LegacyUIDeprecated
  - ConnTypeIsNotNullableRule
  - PodTemplateFileRule

custom_rules:
  - path.to.upgrade_module.VeryCustomCheckClass
  - path.to.upgrade_module.VeryCustomCheckClass2
```
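Each entry under `custom_rules` is an import path to a Python class that the checker loads and runs. A minimal sketch of what such a class might look like, assuming a rule exposes a title, a description, and a `check()` method returning a list of problem messages; the base class here is a stand-in (so the example runs without an Airflow installation) for the rule interface shipped with the package, and the rule's logic and names are hypothetical:

```python
class BaseRule:
    """Stand-in for the upgrade-check base rule class, assumed interface."""
    title = ""
    description = ""

    def check(self):
        """Return a list of problem messages, or an empty list if all is well."""
        return []


class VeryCustomCheckClass(BaseRule):
    """Hypothetical rule: flag a deprecated setting before upgrading."""

    title = "Deprecated my_plugin.legacy_mode setting"
    description = "my_plugin.legacy_mode is assumed removed in Airflow 2.0."

    def __init__(self, config=None):
        # `config` stands in for the parsed airflow.cfg; a real rule
        # would read Airflow's configuration instead.
        self.config = config or {}

    def check(self):
        if self.config.get("my_plugin", {}).get("legacy_mode"):
            return ["Remove my_plugin.legacy_mode from airflow.cfg."]
        return []
```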
Changelog
1.4.0
- Add `conf` not importable from airflow rule (#14400)
- Upgrade rule to suggest rename `[scheduler] max_threads` to `[scheduler] parsing_processes` (#14913)
- Fix running "upgrade_check" command in a PTY (#14977)
- Skip `DatabaseVersionCheckRule` check if invalid version is detected (#15122)
- Fix too specific parsing of `False` in `LegacyUIDeprecated` (#14967)
- Fix false positives when inheriting classes that inherit `DbApiHook` (#16543)
1.3.0
- Fix wrong warning about class that was not used in a dag file (#14700)
- Fill DagBag from `dag_folder` setting for upgrade rules (#14588)
- Bugfix: False positives for Custom Executors via Plugins check (#14680)
- Bugfix: Fix False alarm in import changes rule (#14493)
- Use `CustomSQLAInterface` instead of `SQLAInterface` (#14475)
- Fix comparing airflow version to work with older versions of packaging library (#14435)
- Fix Incorrect warning in upgrade check and error in reading file (#14344)
- Handle possible suffix in MySQL version + avoid hard-coding (#14274)
1.2.0
- Add upgrade check option to list checks (#13392)
- Add clearer exception for read failures in macro plugin upgrade (#13371)
- Treat default value in `HostnameCallable` rule as good one (#13670)
- Created `CustomExecutorsRequireFullPathRule` class (#13678)
- Remove `UndefinedJinjaVariableRule`
- Created rule for `SparkJDBCOperator` class `conn_id` (#13798)
- Created `DatabaseVersionCheckRule` class (#13955)
- Add Version command for Upgrade Check (#12929)
- Use Tabular Format for the List of Upgrade Check Rules (#14139)
- Fix broken `airflow upgrade_check` command (#14137)
Hashes for apache-airflow-upgrade-check-1.4.0.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | 9a26c4e62ae42a42c4d8e537fdc977ff47741791da6a0ebcd54360bbba95b5ba |
| MD5 | e39ebab93182fb0aabbd879fd7200130 |
| BLAKE2b-256 | ae5a482651c9007eb33b0a008f14960717c7066568a68012ab81b277c9d36f70 |
Hashes for apache_airflow_upgrade_check-1.4.0-py2.py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 26ce6d55fc792a116729a008d2dcceac5afceaa9f11ba0a5b77d9bc22e130f77 |
| MD5 | ee6237ae4c53ca26553c7df263910136 |
| BLAKE2b-256 | 5975f1920610f176c227887f34a7e0bbb01b49e1ecc04a82452af8d52a3216b9 |