Provider for Apache Airflow. Implements the apache-airflow-providers-jdbc package

Project description

Package apache-airflow-providers-jdbc

Release: 3.3.0rc1

Java Database Connectivity (JDBC)

Provider package

This is a provider package for the jdbc provider. All classes for this provider package are in the airflow.providers.jdbc Python package.

You can find package information and the changelog for the provider in the documentation.
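
Once installed, the classes can be imported from airflow.providers.jdbc. The sketch below is illustrative only: the connection id, JDBC URL, credentials, driver jar path and driver class name are placeholders, and it assumes a connection of type jdbc whose host field carries the JDBC URL and whose extras carry the driver jar path (drv_path) and driver class name (drv_clsname); per the changelog below, the non-prefixed extra names are preferred since 3.3.0 (#27044).

    import json

    from airflow.models.connection import Connection
    from airflow.providers.jdbc.hooks.jdbc import JdbcHook

    # Hypothetical JDBC connection: the host field carries the JDBC URL, the
    # extras carry the driver jar path and driver class name. The non-prefixed
    # extra names (drv_path, drv_clsname) are preferred over the older
    # extra__jdbc__ prefixed names (#27044).
    conn = Connection(
        conn_id="my_jdbc",                                   # placeholder connection id
        conn_type="jdbc",
        host="jdbc:postgresql://example-host:5432/example",  # placeholder JDBC URL
        login="example_user",
        password="example_password",
        extra=json.dumps(
            {
                "drv_path": "/opt/jdbc/postgresql.jar",      # placeholder driver jar path
                "drv_clsname": "org.postgresql.Driver",      # placeholder driver class name
            }
        ),
    )

    # With such a connection registered as "my_jdbc" (for example via the UI,
    # an environment variable or a secrets backend), the hook can run queries:
    hook = JdbcHook(jdbc_conn_id="my_jdbc")
    # rows = hook.get_records("SELECT 1")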

Installation

You can install this package on top of an existing Airflow 2 installation (see Requirements below for the minimum Airflow version supported) via pip install apache-airflow-providers-jdbc

The package supports the following Python versions: 3.7, 3.8, 3.9, 3.10

Requirements

PIP package                          Version required
apache-airflow                       >=2.3.0
apache-airflow-providers-common-sql  >=1.3.0
jaydebeapi                           >=1.1.1

Cross provider package dependencies

These are dependencies that might be needed in order to use all the features of the package. You need to install the specified provider packages in order to use them.

You can install such cross-provider dependencies when installing from PyPI. For example:

pip install apache-airflow-providers-jdbc[common.sql]

Dependent package                    Extra
apache-airflow-providers-common-sql  common.sql
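
The common.sql provider supplies, for example, the SQLExecuteQueryOperator added in 3.3.0 (#25717, see the changelog below), which can run SQL against a JDBC connection. The sketch below is a minimal illustration; the DAG id and connection id are placeholders:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

    # Minimal DAG sketch: "my_jdbc" is a placeholder id of a JDBC connection
    # configured as shown earlier; the operator resolves the matching hook
    # from the connection type.
    with DAG(
        dag_id="example_jdbc_query",
        start_date=datetime(2022, 1, 1),
        schedule_interval=None,
        catchup=False,
    ) as dag:
        SQLExecuteQueryOperator(
            task_id="run_query",
            conn_id="my_jdbc",
            sql="SELECT 1",
        )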

Changelog

4.0.0

Breaking changes

Misc

  • In JdbcHook, non-prefixed extra fields are supported and are preferred. E.g. drv_path will be preferred if extra__jdbc__drv_path is also present.

3.3.0

This release of the provider is only available for Airflow 2.3+ as explained in the Apache Airflow providers support policy.

Misc

  • Move min airflow version to 2.3.0 for all providers (#27196)

  • Allow and prefer non-prefixed extra fields for JdbcHook (#27044)

Features

  • Add SQLExecuteQueryOperator (#25717)

  • Look for 'extra__' instead of 'extra_' in 'get_field' (#27489)

3.2.1

Misc

  • Add common-sql lower bound for common-sql (#25789)

3.2.0

Features

  • Adding configurable fetch_all_handler for JdbcOperator (#25412)

  • Unify DbApiHook.run() method with the methods which override it (#23971)

3.1.0

Features

  • Move all SQL classes to common-sql provider (#24836)

3.0.0

Breaking changes

Features

  • Move handler parameter from 'JdbcOperator' to 'JdbcHook.run' (#23817)

2.1.3

Bug Fixes

  • Fix mistakenly added install_requires for all providers (#22382)

2.1.2

Misc

  • Add Trove classifiers in PyPI (Framework :: Apache Airflow :: Provider)

2.1.1

Misc

  • Support for Python 3.10

2.1.0

Features

  • Add more SQL template fields renderers (#21237)

  • Add optional features in providers. (#21074)

2.0.1

Bug Fixes

  • Fix type annotations in OracleOperator, JdbcOperator, SqliteOperator (#17406)

Misc

  • Optimise connection importing for Airflow 2.2.0

2.0.0

Breaking changes

  • Auto-apply apply_default decorator (#15667)

1.0.1

Updated documentation and readme files.

1.0.0

Initial version of the provider.


Download files

Download the file for your platform.

Source Distribution

apache-airflow-providers-jdbc-3.3.0rc1.tar.gz (12.7 kB)

Built Distribution

apache_airflow_providers_jdbc-3.3.0rc1-py3-none-any.whl

File details

Details for the file apache-airflow-providers-jdbc-3.3.0rc1.tar.gz.

File metadata

File hashes

Hashes for apache-airflow-providers-jdbc-3.3.0rc1.tar.gz
Algorithm    Hash digest
SHA256       5c568101a2276e0447d0b52c1ed4eecba6596f887c2505d9cfeeaa6ea0a8c5ee
MD5          cf25e4ae97c9cc222eaf77b6aba7b57e
BLAKE2b-256  2723d88748820532b28d320ce59813f7b7a3911762fc78725476a9553af9b5e0

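The published digests can be used to check a downloaded file before installing it. The sketch below is a minimal illustration; it assumes the sdist has been downloaded to the current working directory:

    import hashlib
    from pathlib import Path

    # Compare the SHA256 of a locally downloaded sdist with the digest
    # published above for apache-airflow-providers-jdbc-3.3.0rc1.tar.gz.
    expected = "5c568101a2276e0447d0b52c1ed4eecba6596f887c2505d9cfeeaa6ea0a8c5ee"
    path = Path("apache-airflow-providers-jdbc-3.3.0rc1.tar.gz")  # assumed local path

    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    print("OK" if digest == expected else f"MISMATCH: {digest}")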

File details

Details for the file apache_airflow_providers_jdbc-3.3.0rc1-py3-none-any.whl.

File metadata

File hashes

Hashes for apache_airflow_providers_jdbc-3.3.0rc1-py3-none-any.whl
Algorithm    Hash digest
SHA256       b418748c3574ff379df47ec8b2bd4e0017e0bdf9fb71d6bea2a8a8271c20b158
MD5          5df61958d7526708a2b6b53397eb29e1
BLAKE2b-256  6d0b3597db09fc9ede265908261f8312d1ab839c22b6b354a06683a2329e1e65

