
A helpful package for making Snowpark code easier to write and read

Project description

Snowpark-Utilities


Description

Snowpark Utilities is a set of Python tools aimed at easing much of the repetitive work around using Snowflake's Snowpark API. These tools make it easier to stand up new Snowpark sessions or execute SQL commands, especially in environments where multiple sessions are needed. The module supports users who want to feed credentials directly for authentication as well as those working with a tool like AWS Secrets Manager where credentials might be stored. The aim of this project is to make it faster and cleaner to stand up new Snowpark projects without copying and pasting code from similar endeavors or combing through documentation.

fetch_credentials_from_secrets(secret_name, aws_access_key_id, aws_secret_access_key, region_name)

This function takes an AWS secret name as input and returns the credentials as a dictionary, which can be queried either for use in create_snowpark_session() or for other purposes.

from snowpark_utilities import snowpark_utilities as spu

credentials = spu.fetch_credentials_from_secrets("secret_name", "aws_access_key_id", "aws_secret_access_key", "region_name")
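
The returned dictionary can then be indexed by key. A minimal sketch, assuming the secret stores its values under "username", "password", and "account" (the key names depend entirely on how the secret was defined):

username = credentials["username"]  # key names here assume the secret follows this schema
password = credentials["password"]
account = credentials["account"]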

create_snowpark_session(username, password, account, role, warehouse)

This function takes in the five required inputs above and returns a session object that can be used for Snowpark operations. If you were to, for example, have a "parent" and "child" Snowflake account and needed sessions for both, you could run the following:

from snowpark_utilities import snowpark_utilities as spu
parent_session = spu.create_snowpark_session('username', 'password', 'parent_account', 'role', 'warehouse')
child_session = spu.create_snowpark_session('username', 'password', 'child_account', 'role', 'warehouse')

Now it's simple to differentiate execution between the two accounts.
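
For example, continuing from the snippet above, each session executes against its own account (the query here is just illustrative):

parent_session.sql("SELECT CURRENT_ACCOUNT()").collect()  # runs against the parent account
child_session.sql("SELECT CURRENT_ACCOUNT()").collect()   # runs against the child account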

aws_create_snowpark_session(secret_name, aws_access_key_id, aws_secret_access_key, region_name, role = "ACCOUNTADMIN", warehouse = "COMPUTE_WH")

This function is a version of create_snowpark_session() made explicitly for use with AWS Secrets. Simply feed it the appropriate secret name, and as long as the username is filed under the key name "username", the password under "password", and the account under "account", it will return a session. If your secret doesn't follow this naming scheme and you still want to use secrets, it's simple to modify this function, or to fetch the credentials using fetch_credentials_from_secrets() and parse the dictionary yourself. Here's an easy example of how you might use this:

from snowpark_utilities import snowpark_utilities as spu

session = spu.aws_create_snowpark_session("secret_name", "aws_access_key_id", "aws_secret_access_key", "region_name", role = "ACCOUNTADMIN", warehouse = "COMPUTE_WH")
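
If your secret uses different key names, a minimal sketch of the manual route (the key names below are hypothetical; match them to your own secret):

credentials = spu.fetch_credentials_from_secrets("secret_name", "aws_access_key_id", "aws_secret_access_key", "region_name")
session = spu.create_snowpark_session(
    credentials["my_username_key"],  # hypothetical key name
    credentials["my_password_key"],  # hypothetical key name
    credentials["my_account_key"],   # hypothetical key name
    "ACCOUNTADMIN",
    "COMPUTE_WH",
)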

execute_sql(session, command)

An annoyance I've had with Snowpark in terms of ease of use and code readability is that defining code and executing code are two distinctly different operations. You can always define a piece of SQL for execution using session.sql("sql code"), but executing it requires a .collect() at the end of the line. This function takes the given session and desired command and executes both steps at once. If you wish to do anything with .to_pandas(), you will still need to write that manually, but this works great for everything else. The function returns the result of .collect(), so you could also pass the output to pd.DataFrame().

spu.execute_sql(session, "USE ROLE DATA_ANALYST")
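
Since the return value is the collected result, wrapping it in a DataFrame is straightforward. A minimal sketch (the SHOW TABLES command is just illustrative):

import pandas as pd

# execute_sql() returns the list of Row objects from .collect(),
# which pandas can consume directly
rows = spu.execute_sql(session, "SHOW TABLES")
df = pd.DataFrame(rows)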

execute_sql_pandas(session, command)

Given a SELECT command, this function returns the results as a pandas dataframe.

dataframe = spu.execute_sql_pandas(session, "SELECT * FROM DWLOAD.DWSTAGE.TABLE1")

snowflake2snowflakevalidation(session_source, session_target, database)

In the case of a database migration, it can be time consuming to ensure all tables were successfully migrated. This function takes a source and a target Snowpark session along with the database in question and returns a dataframe listing every table from the source with its row count in both the source and the target.

from snowpark_utilities import snowpark_utilities as spu
parent_session = spu.create_snowpark_session('username', 'password', 'parent_account', 'role', 'warehouse')
child_session = spu.create_snowpark_session('username', 'password', 'child_account', 'role', 'warehouse')
validation_df = spu.snowflake2snowflakevalidation(parent_session, child_session, "DWLOAD")
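
From there you can filter for discrepancies. A minimal sketch (the column names below are hypothetical; check the returned dataframe for the actual ones):

# keep only tables whose row counts disagree between source and target
mismatches = validation_df[validation_df["SOURCE_ROW_COUNT"] != validation_df["TARGET_ROW_COUNT"]]
print(mismatches)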

create_table_statement(database, schema, table, df, uppercase = False, varchar = False)

In the case of a database migration, you usually need to stand up all target tables prior to staging data. On large engagements, where there can be upwards of 1,000 tables to migrate, having an automated way to generate these CREATE TABLE statements is crucial. This function takes inputs describing the table alongside a dataframe of that table and generates a CREATE TABLE statement. There are optional parameters for forcing uppercase column names or setting all data types to VARCHAR for raw layers. Just loop through a list of one-row table samples from the legacy source and concatenate the outputs into one .sql file for execution, as sketched after the example below.

import pandas as pd
from snowpark_utilities import snowpark_utilities as spu

df = pd.read_csv('SampleFile2019.csv')
foo = spu.create_table_statement(database = 'foo', schema = 'bar', table = 'man', df = df)
print(foo)
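
And a minimal sketch of the loop-and-concatenate approach (the extracts/ directory of one-row sample CSVs is an assumption about how your legacy exports are laid out):

import glob
import os
import pandas as pd
from snowpark_utilities import snowpark_utilities as spu

statements = []
for path in glob.glob('extracts/*.csv'):  # hypothetical directory of one-row sample extracts
    table_name = os.path.splitext(os.path.basename(path))[0]  # use the file name as the table name
    sample_df = pd.read_csv(path)
    statements.append(spu.create_table_statement(database = 'foo', schema = 'bar', table = table_name, df = sample_df))

# concatenate all generated DDL into a single .sql file for execution
with open('create_tables.sql', 'w') as f:
    f.write('\n'.join(statements))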



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

snowpark_utilities-0.1.41.tar.gz (6.9 kB view details)

Uploaded Source

File details

Details for the file snowpark_utilities-0.1.41.tar.gz.

File metadata

  • Download URL: snowpark_utilities-0.1.41.tar.gz
  • Upload date:
  • Size: 6.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for snowpark_utilities-0.1.41.tar.gz:

  • SHA256: c351e132c9036e2abc28e32c5d0cf7bfdbf560a36c126e05fdc2dd77e9dddf82
  • MD5: 66801ed269f3399ba5d57ee961f4d1ae
  • BLAKE2b-256: 668f0aed4a459818d01211a9fe915e79159ecd1eeb86146511317beecf133392

