Back-ported airflow.providers.databricks.* package for Airflow 1.10.*
Package apache-airflow-backport-providers-databricks
Release: 2020.5.20
Backport package
This is a backport package for the databricks provider. All classes for this provider are in the airflow.providers.databricks Python package.
Only Python 3.6+ is supported for this backport package. While Airflow 1.10.* continues to support Python 2.7, you need to upgrade to Python 3.6+ if you want to use this backport package.
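A quick way to confirm the interpreter meets this requirement before installing (a minimal sketch; the version check is plain standard-library Python, nothing package-specific):

```python
import sys

# The backport packages require Python 3.6 or newer; Airflow 1.10.*
# itself still runs on Python 2.7, but this package does not.
if sys.version_info < (3, 6):
    raise RuntimeError(
        "apache-airflow-backport-providers-databricks needs Python 3.6+, "
        "found %d.%d" % sys.version_info[:2]
    )
print("Python version OK: %d.%d" % sys.version_info[:2])
```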
Installation
You can install this package on top of an existing Airflow 1.10.* installation via
pip install apache-airflow-backport-providers-databricks
Compatibility
For the full compatibility and test status of the backport packages, check the Airflow Backport Package Compatibility page.
PIP requirements
| PIP package | Version required |
|---|---|
| requests | >=2.20.0, <3 |
Provider class summary
In Airflow 2.0, all classes for this provider are in the airflow.providers.databricks package.
Operators
Moved operators
| Airflow 2.0 operators: airflow.providers.databricks package | Airflow 1.10.* previous location (usually airflow.contrib) |
|---|---|
| operators.databricks.DatabricksRunNowOperator | contrib.operators.databricks_operator.DatabricksRunNowOperator |
| operators.databricks.DatabricksSubmitRunOperator | contrib.operators.databricks_operator.DatabricksSubmitRunOperator |
Hooks
Moved hooks
| Airflow 2.0 hooks: airflow.providers.databricks package | Airflow 1.10.* previous location (usually airflow.contrib) |
|---|---|
| hooks.databricks.DatabricksHook | contrib.hooks.databricks_hook.DatabricksHook |
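The moves in the two tables above can be captured in a small lookup. The sketch below is a hypothetical helper, not shipped with the package, that a DAG author could use while updating import statements from the old contrib paths to the new providers paths:

```python
# Hypothetical mapping of Airflow 1.10 contrib import paths to the new
# airflow.providers.databricks paths, taken from the tables above.
CONTRIB_TO_PROVIDERS = {
    "airflow.contrib.operators.databricks_operator.DatabricksRunNowOperator":
        "airflow.providers.databricks.operators.databricks.DatabricksRunNowOperator",
    "airflow.contrib.operators.databricks_operator.DatabricksSubmitRunOperator":
        "airflow.providers.databricks.operators.databricks.DatabricksSubmitRunOperator",
    "airflow.contrib.hooks.databricks_hook.DatabricksHook":
        "airflow.providers.databricks.hooks.databricks.DatabricksHook",
}

def new_import_path(old_path: str) -> str:
    """Return the providers-package path for a moved contrib class.

    Paths that were not moved are returned unchanged.
    """
    return CONTRIB_TO_PROVIDERS.get(old_path, old_path)

print(new_import_path(
    "airflow.contrib.hooks.databricks_hook.DatabricksHook"))
# -> airflow.providers.databricks.hooks.databricks.DatabricksHook
```

Note that the old contrib import locations still work on Airflow 1.10.* after installing the backport package; the mapping is only needed when migrating DAG code to the Airflow 2.0 paths.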
Releases
Release 2020.5.20
| Commit | Committed | Subject |
|---|---|---|
| 12c5e5d8a | 2020-05-17 | Prepare release candidate for backport packages (#8891) |
| f3521fb0e | 2020-05-16 | Regenerate readme files for backport package release (#8886) |
| 92585ca4c | 2020-05-15 | Added automated release notes generation for backport operators (#8807) |
| 649935e8c | 2020-04-27 | [AIRFLOW-8472]: PATCH for Databricks hook _do_api_call (#8473) |
| 16903ba3a | 2020-04-24 | [AIRFLOW-8474]: Adding possibility to get job_id from Databricks run (#8475) |
| 5648dfbc3 | 2020-03-23 | Add missing call to Super class in 'amazon', 'cloudant' & 'databricks' providers (#7827) |
| 3320e432a | 2020-02-24 | [AIRFLOW-6817] Lazy-load airflow.DAG to keep user-facing API untouched (#7517) |
| 4d03e33c1 | 2020-02-22 | [AIRFLOW-6817] remove imports from airflow/__init__.py, replaced implicit imports with explicit imports, added entry to UPDATING.MD - squashed/rebased (#7456) |
| 97a429f9d | 2020-02-02 | [AIRFLOW-6714] Remove magic comments about UTF-8 (#7338) |
| 83c037873 | 2020-01-30 | [AIRFLOW-6674] Move example_dags in accordance with AIP-21 (#7287) |
| c42a375e7 | 2020-01-27 | [AIRFLOW-6644][AIP-21] Move service classes to providers package (#7265) |
Hashes for apache-airflow-backport-providers-databricks-2020.5.20rc2.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | 18308ccefda285db27f80757b0cd284bf0f733334a7273c04933969fd97c19b0 |
| MD5 | 47b39c1517d8640835de50c7a83a2a93 |
| BLAKE2b-256 | d55a4dd76ce4c0c6d5e30cbd6ca27d212fb9ee71ce4c189ac6c5779cff8227f5 |

Hashes for apache_airflow_backport_providers_databricks-2020.5.20rc2-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | d3122003fb83041a824b06b0a4387ed84db948d47f22632efcbbd9876ac48fcf |
| MD5 | 9c4b3a22cdd08496931eaed08208eb34 |
| BLAKE2b-256 | 4db07877e3b6c91cfef57b58dfa749f55ac33982c0b8619336e4e9cf535ebafd |