Provider for Apache Airflow. Implements the apache-airflow-providers-databricks package.

Project description

Package apache-airflow-providers-databricks

Release: 3.3.0rc1

Databricks

Provider package

This is a provider package for the Databricks provider. All classes for this provider package are in the airflow.providers.databricks Python package.

You can find package information and changelog for the provider in the documentation.

Installation

You can install this package on top of an existing Airflow 2 installation (see Requirements below for the minimum supported Airflow version) via pip install apache-airflow-providers-databricks.

The package supports the following Python versions: 3.7, 3.8, 3.9, 3.10.

Requirements

PIP package                           Version required
apache-airflow                        >=2.2.0
apache-airflow-providers-common-sql   >=1.2.0
requests                              >=2.27,<3
databricks-sql-connector              >=2.0.0,<3.0.0
aiohttp                               >=3.6.3,<4
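As an illustration of how the range constraints above behave, here is a small stdlib-only sketch. The parse and satisfies helpers are hypothetical illustrations, not how pip actually resolves dependencies:

```python
def parse(version: str) -> tuple:
    """Turn a dotted version string like '2.27' into a comparable tuple (2, 27)."""
    return tuple(int(part) for part in version.split("."))

def satisfies(version: str, lower: str, upper: str = None) -> bool:
    """Check lower <= version (< upper), mirroring a '>=lower,<upper' pip specifier."""
    ok = parse(version) >= parse(lower)
    if upper is not None:
        ok = ok and parse(version) < parse(upper)
    return ok

# requests must satisfy >=2.27,<3
print(satisfies("2.28.1", "2.27", "3"))   # True
print(satisfies("3.0.0", "2.27", "3"))    # False
```

Note this toy helper only handles plain numeric versions; real specifier matching (pre-releases such as 3.3.0rc1, epochs) follows PEP 440.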

Cross provider package dependencies

These are optional dependencies that may be needed to use all the features of the package. You need to install the listed provider packages in order to use them.

You can install such cross-provider dependencies when installing from PyPI. For example:

pip install apache-airflow-providers-databricks[common.sql]

Dependent package                     Extra
apache-airflow-providers-common-sql   common.sql

Changelog

3.3.0

Features

  • DatabricksSubmitRunOperator dbt task support (#25623)

Misc

  • Add common-sql lower bound for common-sql (#25789)

  • Remove duplicated connection-type within the provider (#26628)

Bug Fixes

  • Databricks: fix provider name in the User-Agent string (#25873)

3.2.0

Features

  • Databricks: update user-agent string (#25578)

  • More improvements in the Databricks operators (#25260)

  • Improved telemetry for Databricks provider (#25115)

  • Unify DbApiHook.run() method with the methods which override it (#23971)

Bug Fixes

  • Databricks: fix test_connection implementation (#25114)

  • Do not convert boolean values to string in deep_string_coerce function (#25394)

  • Correctly handle output of the failed tasks (#25427)

  • Databricks: Fix provider for Airflow 2.2.x (#25674)

3.1.0

Features

  • Added databricks_conn_id as templated field (#24945)

  • Add 'test_connection' method to Databricks hook (#24617)

  • Move all SQL classes to common-sql provider (#24836)

Bug Fixes

  • Update providers to use functools compat for 'cached_property' (#24582)

3.0.0

Breaking changes

Features

  • Add Deferrable Databricks operators (#19736)

  • Add git_source to DatabricksSubmitRunOperator (#23620)

Bug Fixes

  • fix: DatabricksSubmitRunOperator and DatabricksRunNowOperator cannot define .json as template_ext (#23622) (#23641)

  • Fix UnboundLocalError when sql is empty list in DatabricksSqlHook (#23815)

2.7.0

Features

  • Update to the released version of DBSQL connector

  • DatabricksSqlOperator - switch to databricks-sql-connector 2.x

  • Further improvement of Databricks Jobs operators (#23199)

2.6.0

Features

  • More operators for Databricks Repos (#22422)

  • Add a link to Databricks Job Run (#22541)

  • Databricks SQL operators are now Python 3.10 compatible (#22886)

Bug Fixes

  • Databricks: Correctly handle HTTP exception (#22885)

Misc

  • Refactor 'DatabricksJobRunLink' to not create ad hoc TaskInstances (#22571)

2.5.0

Features

  • Operator for updating Databricks Repos (#22278)

Bug Fixes

  • Fix mistakenly added install_requires for all providers (#22382)

2.4.0

Features

  • Add new options to DatabricksCopyIntoOperator (#22076)

  • Databricks hook - retry on HTTP Status 429 as well (#21852)

Misc

  • Skip some tests for Databricks from running on Python 3.10 (#22221)

2.3.0

Features

  • Add showing-runtime-error feature to DatabricksSubmitRunOperator (#21709)

  • Databricks: add support for triggering jobs by name (#21663)

  • Added template_ext = ('.json') to databricks operators #18925 (#21530)

  • Databricks SQL operators (#21363)

Bug Fixes

  • Fixed changelog for January 2022 (delayed) provider's release (#21439)

Misc

  • Support for Python 3.10

  • Updated Databricks docs for correct jobs 2.1 API and links (#21494)

2.2.0

Features

  • Add 'wait_for_termination' argument for Databricks Operators (#20536)

  • Update connection object to 'cached_property' in 'DatabricksHook' (#20526)

  • Remove 'host' as an instance attr in 'DatabricksHook' (#20540)

  • Databricks: fix verification of Managed Identity (#20550)

2.1.0

Features

  • Databricks: add more methods to represent run state information (#19723)

  • Databricks - allow Azure SP authentication on other Azure clouds (#19722)

  • Databricks: allow to specify PAT in Password field (#19585)

  • Databricks jobs 2.1 (#19544)

  • Update Databricks API from 2.0 to 2.1 (#19412)

  • Authentication with AAD tokens in Databricks provider (#19335)

  • Update Databricks operators to match latest version of API 2.0 (#19443)

  • Remove db call from DatabricksHook.__init__() (#20180)

Bug Fixes

  • Fixup string concatenations (#19099)

  • Databricks hook: fix expiration time check (#20036)

2.0.2

Bug Fixes

  • Move DB call out of DatabricksHook.__init__ (#18339)

2.0.1

Misc

  • Optimise connection importing for Airflow 2.2.0

2.0.0

Breaking changes

  • Auto-apply apply_default decorator (#15667)

1.0.1

Updated documentation and readme files.

1.0.0

Initial version of the provider.

File details

Details for the file apache-airflow-providers-databricks-3.3.0rc1.tar.gz.

File metadata

File hashes

Hashes for apache-airflow-providers-databricks-3.3.0rc1.tar.gz
Algorithm Hash digest
SHA256 dbdf3331b024bd08794356881b0251f009c0c993bffb11f0cb6ce1da0f43c919
MD5 357ec5bf62d4e4b83d729b750ae81416
BLAKE2b-256 cff8934d462baebb6963bb099798d2da6ab212682c0f64387490286a4d364711

File details

Details for the file apache_airflow_providers_databricks-3.3.0rc1-py3-none-any.whl.

File metadata

File hashes

Hashes for apache_airflow_providers_databricks-3.3.0rc1-py3-none-any.whl
Algorithm Hash digest
SHA256 2708e8a6ee7a253f54adcaae39670006e765f47ff89e08b5c308096d4ae6b7dd
MD5 2f7ba6e65993a34dede67389371a484e
BLAKE2b-256 fdd9c338569ea1d2d6c17dba8f1d0b204aecbafa1680497ea939b574ab5096f5