
Provider for Apache Airflow. Implements the apache-airflow-providers-odbc package.

Project description

Package apache-airflow-providers-odbc

Release: 3.2.0rc1

ODBC

Provider package

This is a provider package for the odbc provider. All classes for this provider package are in the airflow.providers.odbc Python package.

You can find package information and changelog for the provider in the documentation.

Installation

You can install this package on top of an existing Airflow 2 installation (see Requirements below for the minimum supported Airflow version):

pip install apache-airflow-providers-odbc

The package supports the following Python versions: 3.7, 3.8, 3.9, 3.10.
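After installing, you can confirm from Python that the distribution is visible to your environment. This is a hedged sketch using only the standard library (Python 3.8+); the helper name is illustrative, not part of the provider:

```python
from importlib import metadata
from typing import Optional

def installed_version(dist_name: str) -> Optional[str]:
    """Return the installed version of a distribution, or None if absent."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None

# After `pip install apache-airflow-providers-odbc`, this reports its version:
print(installed_version("apache-airflow-providers-odbc"))
```

If the call returns None, the package is not installed in the interpreter you are querying, which is a common pitfall with multiple virtual environments.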

Requirements

PIP package                           Version required
apache-airflow                        >=2.3.0
apache-airflow-providers-common-sql   >=1.3.0
pyodbc

Cross provider package dependencies

These are dependencies that may be needed in order to use all the features of the package. You need to install the specified provider packages in order to use them.

You can install such cross-provider dependencies when installing from PyPI. For example:

pip install apache-airflow-providers-odbc[common.sql]

Dependent package                     Extra
apache-airflow-providers-common-sql   common.sql

Changelog

3.2.0

This release of the provider is only available for Airflow 2.3+, as explained in the Apache Airflow providers support policy.

Misc

  • Move min airflow version to 2.3.0 for all providers (#27196)

3.1.2

Misc

  • Add common-sql lower bound for common-sql (#25789)

3.1.1

Bug Fixes

  • Fix odbc hook sqlalchemy_scheme docstring (#25421)

3.1.0

Features

  • Move all SQL classes to common-sql provider (#24836)

3.0.0

Breaking changes

2.0.4

Bug Fixes

  • Fix mistakenly added install_requires for all providers (#22382)

2.0.3

Misc

  • Add Trove classifiers in PyPI (Framework :: Apache Airflow :: Provider)

2.0.2

Misc

  • Support for Python 3.10

2.0.1

Misc

  • Optimise connection importing for Airflow 2.2.0

2.0.0

Breaking changes

  • Auto-apply apply_default decorator (#15667)

  • OdbcHook returns None. Related to #15016 issue. (#15510)

    When you pass kwargs to the connection (for example autocommit and ansi) in the connect_kwargs extra, you should pass those as booleans. Previously, strings were also supported.

"connect_kwargs": {
   "autocommit": "false",
   "ansi": "true"
}

should become

"connect_kwargs": {
   "autocommit": false,
   "ansi": true
}
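One way to avoid the string-vs-boolean pitfall above is to build the connection extra with json.dumps, which serializes Python booleans to the lowercase JSON literals shown in the corrected example. A minimal sketch; the extra layout mirrors the example above, and nothing here is OdbcHook's own API:

```python
import json

# Use Python booleans, not the strings "false"/"true".
extra = {
    "connect_kwargs": {
        "autocommit": False,
        "ansi": True,
    }
}

extra_json = json.dumps(extra)
print(extra_json)  # → {"connect_kwargs": {"autocommit": false, "ansi": true}}

# Round-tripping preserves the types, so the hook receives real booleans:
decoded = json.loads(extra_json)
assert decoded["connect_kwargs"]["autocommit"] is False
assert decoded["connect_kwargs"]["ansi"] is True
```

Hand-writing the JSON and quoting the values (e.g. "autocommit": "false") produces strings that this release no longer accepts.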

Bug Fixes

  • Fix OdbcHook handling of port (#15772)

1.0.1

Updated documentation and readme files.

1.0.0

Initial version of the provider.


Download files


Source Distribution

apache-airflow-providers-odbc-3.2.0rc1.tar.gz (12.2 kB)


Built Distribution

apache_airflow_providers_odbc-3.2.0rc1-py3-none-any.whl

File details

Details for the file apache-airflow-providers-odbc-3.2.0rc1.tar.gz.


File hashes

Hashes for apache-airflow-providers-odbc-3.2.0rc1.tar.gz
Algorithm Hash digest
SHA256 214cd49abb9a039848f60cb7dc1f95ebe23dbf2fe0a86a2dcde9487c79a5eeca
MD5 d8e587443484c3bbe5ca01c738787486
BLAKE2b-256 3a1a7b6393cb6b6df0525f30146cfa51261b66db81f605f3401512a9eca4cb15
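The published digests can be checked locally before installing from a downloaded file. A small sketch using the standard library hashlib; the archive path is illustrative, and the expected digest is the SHA256 value listed above:

```python
import hashlib
from pathlib import Path

EXPECTED_SHA256 = "214cd49abb9a039848f60cb7dc1f95ebe23dbf2fe0a86a2dcde9487c79a5eeca"

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file through SHA-256 so large archives never load fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Illustrative path; point it at wherever you downloaded the sdist.
# archive = Path("apache-airflow-providers-odbc-3.2.0rc1.tar.gz")
# assert sha256_of(archive) == EXPECTED_SHA256
```

pip performs this check automatically when a hash is pinned in a requirements file (the --require-hashes workflow); the manual check is mainly useful for files obtained outside pip.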


File details

Details for the file apache_airflow_providers_odbc-3.2.0rc1-py3-none-any.whl.


File hashes

Hashes for apache_airflow_providers_odbc-3.2.0rc1-py3-none-any.whl
Algorithm Hash digest
SHA256 4f58b1c701c2875a7123a357e04d8b8370990c158b5c077e412e22083bdec082
MD5 4f18dc200402b2a3c6cbde4a14e619d1
BLAKE2b-256 05a9e49c429a8fb87a6fc913cef0ea54c1b1d64c53ef5e21d8d1d1ff23c058d5

