
Check for compatibility between Airflow versions

Project description

Apache Airflow Upgrade Check


This package aims to ease the upgrade journey from Apache Airflow 1.10 to 2.0.

While we have put a lot of effort into making this upgrade as painless as possible, with many changes providing an upgrade path (where the old code continues to work and prints out a deprecation warning), there were unfortunately some breaking changes where we couldn't provide a compatibility shim.

The recommended upgrade path to get to Airflow 2.0.0 is to first upgrade to the latest release in the 1.10 series (at the time of writing: 1.10.15) and then install this package and run the check:

pip install apache-airflow-upgrade-check
airflow upgrade_check

This will then print out a number of action items that you should follow before upgrading to 2.0.0 or above.

The exit code of the command will be 0 (success) if no problems are reported, or 1 otherwise.

For example:

============================================= STATUS =============================================

Check for latest versions of apache-airflow and checker.................................SUCCESS
Legacy UI is deprecated by default......................................................SUCCESS
Users must set a kubernetes.pod_template_file value.....................................FAIL
Changes in import paths of hooks, operators, sensors and others.........................FAIL
Remove airflow.AirflowMacroPlugin class.................................................SUCCESS
Check versions of PostgreSQL, MySQL, and SQLite to ease upgrade to Airflow 2.0..........SUCCESS
Fernet is enabled by default............................................................FAIL
Logging configuration has been moved to new section.....................................SUCCESS
Connection.conn_id is not unique........................................................SUCCESS
GCP service account key deprecation.....................................................SUCCESS
Users must delete deprecated configs for KubernetesExecutor.............................FAIL
Changes in import path of remote task handlers..........................................SUCCESS
Chain between DAG and operator not allowed..............................................SUCCESS
SendGrid email uses old airflow.contrib module..........................................SUCCESS
Connection.conn_type is not nullable....................................................SUCCESS
Found 16 problems.

======================================== RECOMMENDATIONS =========================================

Users must set a kubernetes.pod_template_file value
---------------------------------------------------
In Airflow 2.0, KubernetesExecutor users need to set a pod_template_file as a base
value for all pods launched by the KubernetesExecutor.


Problems:

  1.  Please create a pod_template_file by running `airflow generate_pod_template`.
This will generate a pod using your airflow.cfg settings.

...
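
Because the command exits non-zero when any problems remain, it can also gate an automated pre-upgrade pipeline. As a minimal, hypothetical sketch (not part of this package), a Python wrapper could run the check and stop when problems are reported:

# Hypothetical pre-upgrade gate: run `airflow upgrade_check` and inspect its
# exit code (0 = no problems reported, 1 = problems found).
import subprocess
import sys

returncode = subprocess.call(["airflow", "upgrade_check"])
if returncode != 0:
    print("upgrade_check reported problems; review the recommendations before upgrading.")
    sys.exit(returncode)
print("No problems reported; continue with the Airflow 2.0 upgrade.")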

Additionally, you can use an upgrade configuration file to:

  • specify rules you would like to ignore
  • extend the check using custom rules (a sketch of a custom rule follows the example below)

For example:

airflow upgrade_check --config=/files/upgrade.yaml

The configuration file should be a valid YAML file similar to this one:

ignored_rules:
  - LegacyUIDeprecated
  - ConnTypeIsNotNullableRule
  - PodTemplateFileRule

custom_rules:
  - path.to.upgrade_module.VeryCustomCheckClass
  - path.to.upgrade_module.VeryCustomCheckClass2
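
Each entry under custom_rules is the dotted import path of a rule class that is run alongside the built-in checks. As a rough, hypothetical sketch (assuming rules subclass airflow.upgrade.rules.base_rule.BaseRule and that check() returns a list of problem strings; verify against the upgrade-check version you have installed), such a module might look like:

# path/to/upgrade_module.py -- hypothetical module referenced from custom_rules above.
from airflow.upgrade.rules.base_rule import BaseRule  # assumed base class location


class VeryCustomCheckClass(BaseRule):
    title = "DAG owners are set to real users"
    description = "Example rule: flags DAGs whose owner is still the default 'airflow'."

    def check(self):
        from airflow.models import DagBag

        # Return a list of human-readable problem strings; an empty list means the rule passed.
        problems = []
        for dag_id, dag in DagBag().dags.items():
            if dag.owner == "airflow":
                problems.append("DAG {} still uses the default owner 'airflow'".format(dag_id))
        return problems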

Changelog

1.4.0

  • Add conf not importable from airflow rule (#14400)
  • Upgrade rule to suggest rename [scheduler] max_threads to [scheduler] parsing_processes (#14913)
  • Fix running "upgrade_check" command in a PTY. (#14977)
  • Skip DatabaseVersionCheckRule check if invalid version is detected (#15122)
  • Fix too specific parsing of False in LegacyUIDeprecated (#14967)
  • Fix false positives when inheriting classes that inherit DbApiHook (#16543)

1.3.0

  • Fix wrong warning about class that was not used in a dag file (#14700)
  • Fill DagBag from dag_folder setting for upgrade rules (#14588)
  • Bugfix: False positives for Custom Executors via Plugins check (#14680)
  • Bugfix: Fix False alarm in import changes rule (#14493)
  • Use CustomSQLAInterface instead of SQLAInterface (#14475)
  • Fix comparing airflow version to work with older versions of packaging library (#14435)
  • Fix Incorrect warning in upgrade check and error in reading file (#14344)
  • Handle possible suffix in MySQL version + avoid hard-coding (#14274)

1.2.0

  • Add upgrade check option to list checks (#13392)
  • Add clearer exception for read failures in macro plugin upgrade (#13371)
  • Treat default value in HostnameCallable rule as good one (#13670)
  • Created CustomExecutorsRequireFullPathRule class (#13678)
  • Remove UndefinedJinjaVariableRule
  • Created rule for SparkJDBCOperator class conn_id (#13798)
  • Created DatabaseVersionCheckRule class (#13955)
  • Add Version command for Upgrade Check (#12929)
  • Use Tabular Format for the List of Upgrade Check Rules (#14139)
  • Fix broken airflow upgrade_check command (#14137)

Download files


Source Distribution

apache-airflow-upgrade-check-1.4.0rc2.tar.gz (40.9 kB)

Built Distribution

apache_airflow_upgrade_check-1.4.0rc2-py2.py3-none-any.whl (58.1 kB)
