
Provider for Apache Airflow. Implements the apache-airflow-providers-snowflake package.

Project description

Package apache-airflow-providers-snowflake

Release: 4.1.0

Snowflake

Provider package

This is a provider package for the snowflake provider. All classes for this provider package are in the airflow.providers.snowflake Python package.

You can find package information and changelog for the provider in the documentation.

Installation

You can install this package on top of an existing Airflow 2 installation (see Requirements below for the minimum Airflow version supported) via:

pip install apache-airflow-providers-snowflake

The package supports the following Python versions: 3.7, 3.8, 3.9, 3.10

Requirements

PIP package                          Version required
apache-airflow                       >=2.4.0
apache-airflow-providers-common-sql  >=1.3.1
snowflake-connector-python           >=2.4.1
snowflake-sqlalchemy                 >=1.1.0

Cross provider package dependencies

These are dependencies that may be needed in order to use all the features of the package. You need to install the specified provider packages in order to use them.

You can install such cross-provider dependencies when installing from PyPI. For example:

pip install apache-airflow-providers-snowflake[common.sql]

Dependent package                    Extra
apache-airflow-providers-common-sql  common.sql
apache-airflow-providers-slack       slack

Changelog

4.1.0

Misc

  • Bump minimum Airflow version in providers (#30917)

4.0.5

Misc

  • Update documentation for snowflake provider 4.0 breaking change (#30020)

4.0.4

Bug Fixes

  • Fix missing parens for files parameter (#29437)

4.0.3

Bug Fixes

  • provide missing connection to the parent class operator (#29211)

  • Snowflake Provider - hide host from UI (#29208)

4.0.2

Breaking changes

The SnowflakeHook now conforms to the same semantics as all the other DBApiHook implementations and returns the same kind of response from its run method. Previously (in pre-4.* versions of the provider), the hook returned a dictionary of { "column": "value", ... }, which was not compatible with other DBApiHooks that return just a sequence of sequences. After this change (and with the dependency on common.sql >= 1.3.1), the SnowflakeHook returns Python DbApi-compatible results by default.

The description (among other things, the names and types of the returned columns) can be retrieved via the descriptions and last_description fields of the hook after the run method completes.

That makes the SnowflakeHook suitable for the generic SQL operator and detailed lineage analysis.

If you had custom hooks, or used the hook in your TaskFlow code or custom operators that relied on this behaviour, you need to adapt your DAGs, or you can switch the SnowflakeHook back to returning dictionaries by passing return_dictionaries=True to the run method of the hook.
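As a rough sketch of one migration path (using stand-in data, not a live Snowflake connection), code that relied on the old dictionary rows can rebuild them from the new sequence-of-sequences results plus the column description the hook exposes after run() completes:

```python
# Hypothetical migration helper, not part of the provider. "results" stands in
# for the DbApi-style output of hook.run(), and "description" for the
# hook.last_description field, where each entry's first element is the column
# name (per the DB-API cursor.description convention).
def rows_as_dicts(results, description):
    """Rebuild pre-4.0-style dictionary rows from sequence results."""
    columns = [col[0] for col in description]
    return [dict(zip(columns, row)) for row in results]


# Stand-in data shaped like the hook's post-4.0 output:
results = [("ORDERS", 42), ("CUSTOMERS", 7)]
description = [("TABLE_NAME", None), ("ROW_COUNT", None)]

rows = rows_as_dicts(results, description)
print(rows)
# [{'TABLE_NAME': 'ORDERS', 'ROW_COUNT': 42},
#  {'TABLE_NAME': 'CUSTOMERS', 'ROW_COUNT': 7}]
```

Passing return_dictionaries=True to run(), as described above, avoids the need for such a helper entirely.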

The SnowflakeOperator is also more standard: it derives from the common SQLExecuteQueryOperator and uses a more consistent approach to processing the output of SQL queries. However, in this case the result returned by the execute method is unchanged (it still returns dictionaries rather than sequences, and those dictionaries are pushed to XCom), so your DAGs relying on this behaviour should continue working without any change.

UPDATE: One of the unmentioned breaking changes in the 4.0 line of the operator was switching autocommit to False by default. While not very friendly to users, this was a side effect of unifying the interface with other SQL operators, and it has already been released, so switching it back would cause even more confusion. You should manually add autocommit=True to your SnowflakeOperator if you want to continue using it and expect autocommit to work; better yet, switch to SQLExecuteQueryOperator.
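To see why the autocommit default matters, here is a sketch using the stdlib sqlite3 module rather than Snowflake (the session names and file path are illustrative): with autocommit off, DML runs inside an open transaction and other sessions do not see it until commit() is called.

```python
import os
import sqlite3
import tempfile

# Two connections to the same database file stand in for two sessions.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path)          # default mode: implicit transactions
writer.execute("CREATE TABLE t (x)")    # DDL is committed immediately
writer.execute("INSERT INTO t VALUES (1)")  # opens a transaction, no commit yet

reader = sqlite3.connect(path)
before = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]

writer.commit()                         # now the insert becomes visible
after = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]

print(before, after)  # 0 1
```

A query run with autocommit=False that is never committed behaves like the "before" read here, which is why DAGs written against the pre-4.0 autocommit default may appear to silently lose writes.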

In SnowflakeHook, if both extra__snowflake__foo and foo existed in the connection extra dict, the prefixed version used to be preferred; now the non-prefixed version wins.
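The lookup-order change can be sketched as a small resolver over a connection's extra dict (the function name and the example field are hypothetical; only the documented preference order is taken from the text above):

```python
# Hypothetical sketch of the 4.x lookup order: when both the prefixed and the
# non-prefixed key are present, the non-prefixed one is preferred (pre-4.0
# behaviour preferred the prefixed one).
def get_extra_field(extra, field, prefix="extra__snowflake__"):
    if field in extra:              # non-prefixed key wins in 4.x
        return extra[field]
    return extra.get(prefix + field)  # fall back to the legacy prefixed key


extra = {"extra__snowflake__warehouse": "OLD_WH", "warehouse": "NEW_WH"}
print(get_extra_field(extra, "warehouse"))  # NEW_WH
```

Connections that only ever set one form of each key are unaffected by the change.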

Versions 4.0.0 and 4.0.1 were broken and have been yanked, so 4.0.2 is the first version of the 4.* line that should be used.

  • Fix wrapping of run() method result of exasol and snowflake DB hooks (#27997)

  • Make Snowflake Hook conform to semantics of DBApi (#28006)

4.0.1

Bug Fixes

  • Fix errors in Databricks SQL operator introduced when refactoring (#27854)

  • Bump common.sql provider to 1.3.1 (#27888)

  • Fixing the behaviours of SQL Hooks and Operators finally (#27912)

4.0.0

  • Update snowflake hook to not use extra prefix (#26764)

Misc

  • Move min airflow version to 2.3.0 for all providers (#27196)

Features

  • Add SQLExecuteQueryOperator (#25717)

Bug fixes

  • Use unused SQLCheckOperator.parameters in SQLCheckOperator.execute. (#27599)

3.3.0

Features

  • Add custom handler param in SnowflakeOperator (#25983)

Bug Fixes

  • Fix wrong deprecation warning for 'S3ToSnowflakeOperator' (#26047)

3.2.0

Features

  • Move all "old" SQL operators to common.sql providers (#25350)

  • Unify DbApiHook.run() method with the methods which override it (#23971)

3.1.0

Features

  • Adding generic 'SqlToSlackOperator' (#24663)

  • Move all SQL classes to common-sql provider (#24836)

  • Pattern parameter in S3ToSnowflakeOperator (#24571)

Bug Fixes

  • S3ToSnowflakeOperator: escape single quote in s3_keys (#24607)

3.0.0

Breaking changes

Bug Fixes

  • Fix error when SnowflakeHook take empty list in 'sql' param (#23767)

2.7.0

Features

  • Allow multiline text in private key field for Snowflake (#23066)

2.6.0

Features

  • Add support for private key in connection for Snowflake (#22266)

Bug Fixes

  • Fix mistakenly added install_requires for all providers (#22382)

2.5.2

Misc

  • Remove Snowflake limits (#22181)

2.5.1

Misc

  • Support for Python 3.10

2.5.0

Features

  • Add more SQL template fields renderers (#21237)

Bug Fixes

  • Fix #21096: Support boolean in extra__snowflake__insecure_mode (#21155)

2.4.0

Features

  • Support insecure mode in SnowflakeHook (#20106)

  • Remove unused code in SnowflakeHook (#20107)

  • Improvements for 'SnowflakeHook.get_sqlalchemy_engine' (#20509)

  • Exclude snowflake-sqlalchemy v1.2.5 (#20245)

  • Limit Snowflake connector to <2.7.2 (#20395)

2.3.1

Bug Fixes

  • Remove duplicate get_connection in SnowflakeHook (#19543)

2.3.0

Features

  • Add test_connection method for Snowflake Hook (#19041)

  • Add region to Snowflake URI. (#18650)

2.2.0

Features

  • Add Snowflake operators based on SQL Checks (#17741)

2.1.1

Misc

  • Optimise connection importing for Airflow 2.2.0

2.1.0

Features

  • Adding: Snowflake Role in snowflake provider hook (#16735)

2.0.0

Breaking changes

  • Auto-apply apply_default decorator (#15667)

Features

  • Add 'template_fields' to 'S3ToSnowflake' operator (#15926)

  • Allow S3ToSnowflakeOperator to omit schema (#15817)

  • Added ability for Snowflake to attribute usage to Airflow by adding an application parameter (#16420)

Bug Fixes

  • fix: restore parameters support when sql passed to SnowflakeHook as str (#16102)

1.3.0

Features

  • Expose snowflake query_id in snowflake hook and operator (#15533)

1.2.0

Features

  • Add dynamic fields to snowflake connection (#14724)

1.1.1

Bug fixes

  • Corrections in docs and tools after releasing provider RCs (#14082)

  • Prepare to release the next wave of providers: (#14487)

1.1.0

Updated documentation and readme files.

Features

  • Fix S3ToSnowflakeOperator to support uploading all files in the specified stage (#12505)

  • Add connection arguments in S3ToSnowflakeOperator (#12564)

1.0.0

Initial version of the provider.
