
This package contains utility functions for Prefect and Snowflake

Project description

orchestration-utilities

This repository holds the utility modules that are essential for ETL operations. It is distributed as a package and serves the ETL flows: it is used in Prefect flows and with Snowflake as part of ETL operations.

Installation

Install the package from PyPI:

pip install orchestration-utils

Inside this package

1. aws.py

This module contains functions used to interact with AWS services such as S3.


2. copy_into_s3

This module contains functions that copy data from a Snowflake stage (S3 bucket) into a Snowflake table. It leverages the etl_operations module to perform schema-drift handling and query execution.
The module works best with stages that are well partitioned, for example S3 data partitioned by year, month, and date.
It performs poorly when the data is not well partitioned: if the files are all dropped into a single folder without any partitioning, and that folder is heavy with files, the copy operation will take a long time to complete.

Class/Groups:

  • CopyIntoTable: This class contains the functions used to copy data from the Snowflake stage (S3 bucket) into the Snowflake table.
  • copy_into_snowflake_table: The main entry point for the copy. It accepts a force parameter that forces the copy to run even if the data is already present in the table; force defaults to False.
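The partitioning advice above can be made concrete with a sketch: pointing COPY INTO at a narrow partition path means Snowflake only lists the files under that prefix instead of the whole stage. The `build_copy_statement` function and its parameters are illustrative assumptions, not the module's real API:

```python
from datetime import date

def build_copy_statement(table, stage, partition_date, force=False):
    """Sketch: build a COPY INTO targeting one date partition of a stage.

    Restricting the stage path to year/month/day keeps the file listing
    small, which is why well-partitioned buckets copy much faster.
    """
    prefix = partition_date.strftime("%Y/%m/%d")
    force_clause = " FORCE = TRUE" if force else ""
    return (
        f"COPY INTO {table} FROM @{stage}/{prefix}/"
        f" FILE_FORMAT = (TYPE = PARQUET){force_clause}"
    )

sql = build_copy_statement("analytics.orders", "etl_stage", date(2024, 1, 31))
print(sql)
```

With `force=True`, Snowflake's FORCE copy option reloads files even if they were already loaded, matching the force parameter described above.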

3. etl_control.py

This module contains functions that interact with Snowflake and store the states of the flows in the database.

  • The module accepts a connection parameter (connection_creds, defaulting to snowflake-prefect-user), a pipeline name, and an environment name.
  • The pipeline name and environment name are used to store the flow states in the database, for example when a flow is started, completed, or failed.
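As a hedged sketch of what such a state record might look like (the `flow_state_row` function and column names are assumptions; the real module writes to Snowflake via its connection):

```python
from datetime import datetime, timezone

def flow_state_row(pipeline, environment, state):
    """Sketch: the kind of record etl_control might persist per transition."""
    return {
        "pipeline_name": pipeline,
        "environment": environment,
        "state": state,  # e.g. STARTED / COMPLETED / FAILED
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

row = flow_state_row("daily_orders", "dev", "STARTED")
```

Keying the row on pipeline and environment is what lets the same flow's dev and prod runs be tracked independently.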

4. etl_operations.py

This module contains the functions used to perform ETL operations on either the destination table or the source table.

Class/Groups:

  • CreateConnections: Creates the connections to the databases using the connection credentials and a warehouse name.
  • SnowflakeDestination: Contains all the load types and the functions used to load data into Snowflake tables.
    This class accepts the connection credentials (default snowflake-prefect-user), a warehouse name (default loading), a database name, and an environment name (default dev).
  • DataFrameHadler: Contains functions that convert dataframe columns to the relevant data types.
  • SchemaDriftHandler: Contains functions that handle schema drift in the destination table.
  • SnowflakeSource: Contains functions that extract data from Snowflake tables.
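A common way to handle schema drift is to add columns that appear in the source but are missing from the destination. The sketch below shows that idea only; `drift_statements` and its signature are illustrative assumptions, not SchemaDriftHandler's actual interface:

```python
def drift_statements(table, source_cols, dest_cols):
    """Sketch of schema-drift handling: emit ALTER TABLE statements for
    columns present in the source but absent from the destination."""
    missing = {c: t for c, t in source_cols.items() if c not in dest_cols}
    return [
        f"ALTER TABLE {table} ADD COLUMN {col} {col_type}"
        for col, col_type in sorted(missing.items())
    ]

stmts = drift_statements(
    "analytics.orders",
    source_cols={"id": "NUMBER", "discount": "FLOAT"},
    dest_cols={"id": "NUMBER"},
)
print(stmts)
```

A real handler would also need a policy for removed or re-typed columns; additive drift is simply the safe, common case.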

5. notifications.py

This module contains functions that send notifications to Slack. The Webhook blocks must be created in Prefect before notifications can be sent.

Class/Groups:

  • SlackWebhooksNotification: Sends notifications to Slack. The class accepts the webhook name and the message to be sent.
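Slack incoming webhooks accept a JSON body with a `text` field. As a hedged sketch (in the real module the webhook URL would come from the Prefect Webhook block named by `webhook_name`; here we only build the payload, and the function name is an assumption):

```python
import json

def slack_payload(webhook_name, message):
    """Sketch: build the JSON body a Slack incoming webhook expects."""
    # Prefix the message with the webhook name so alerts are attributable.
    return json.dumps({"text": f"[{webhook_name}] {message}"})

body = slack_payload("etl-alerts", "daily_orders flow failed")
print(body)
```

The payload would then be POSTed to the webhook URL resolved from the Prefect block.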

6. queries.py

This module contains the queries used to perform ETL operations on Snowflake tables. It is referenced by the etl_control and etl_operations modules.
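Centralizing queries as templates lets etl_control and etl_operations format in the environment-specific pieces. A minimal sketch of the pattern (the template name, table, and columns are illustrative assumptions, not the module's real queries):

```python
# Sketch: a shared query template, formatted by the calling module.
FLOW_STATE_INSERT = (
    "INSERT INTO {database}.etl_control.flow_states "
    "(pipeline_name, environment, state) VALUES (%s, %s, %s)"
)

# The caller fills in the database (e.g. per environment); the %s
# placeholders are left for the Snowflake connector to bind safely.
query = FLOW_STATE_INSERT.format(database="analytics_dev")
print(query)
```

Keeping `%s` bind placeholders out of `str.format` and letting the connector substitute values avoids SQL injection in the dynamic parts.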

How to locally build package

Install the dependencies in your virtual environment.

pip install -r requirements-dev.txt

Build the dist folder, where the .whl and .tar.gz files are created:

make build

This will create the dist folder where two files are created.

  • orchestration_utils-0.0.0.tar.gz
  • orchestration_utils-0.0.0-py3-none-any.whl

The .whl file can be installed with the pip install dist/orchestration_utils-0.0.0-py3-none-any.whl command.

How to deploy

Deploy the package to PyPI using GitHub Actions. There are two workflows: one deploys to dev (TestPyPI) and the other to production (PyPI).

1. Dev/Manual Release to TestPyPI

  • Click on Run workflow
  • Select the branch where you made your changes
  • The changes will be reflected in TestPyPI

2. Prod Release to PyPI

  • Click on Run workflow
  • Select the main branch only
  • The changes will be reflected in PyPI

Download files

Download the file for your platform.

Source Distribution

orchestration_utils-0.0.9.tar.gz (21.7 kB)

Built Distribution

orchestration_utils-0.0.9-py3-none-any.whl (17.7 kB)

File details

Details for the file orchestration_utils-0.0.9.tar.gz.

File metadata

  • Download URL: orchestration_utils-0.0.9.tar.gz
  • Upload date:
  • Size: 21.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for orchestration_utils-0.0.9.tar.gz:

  • SHA256: e7eeb85731cf7483a5f336a3fba805cbdea665bc329cf94aeb92bcd5158aea1c
  • MD5: a7577ab8c03809e42d15348f42d9c2b6
  • BLAKE2b-256: 2320790065845e27921b8589a2f1176fb81698d7df732bb6cbeb3bedce3c4f7d


Provenance

The following attestation bundles were made for orchestration_utils-0.0.9.tar.gz:

Publisher: prod-release.yml on cloudfactory/orchestration-utilities

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file orchestration_utils-0.0.9-py3-none-any.whl.

File metadata

File hashes

Hashes for orchestration_utils-0.0.9-py3-none-any.whl:

  • SHA256: 70555e117ca6ea772a508a016804697bf118de5dde4dd0dcdfca462ae43a6ec2
  • MD5: a6a5296f1bdf561d3680eadf15f3fe6c
  • BLAKE2b-256: 6fbb6124a20dbdedde7ed064e7bb4e0d07ee917cc51b82720da345f34fe6d5a9


Provenance

The following attestation bundles were made for orchestration_utils-0.0.9-py3-none-any.whl:

Publisher: prod-release.yml on cloudfactory/orchestration-utilities

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
