Mobilize.Net Database Deploy tool for Snowflake
Project description
sc-deploy-db is a multi-platform command-line tool for deploying SQL scripts to Snowflake. For large data warehouse migrations, it is handy when you have folders containing many source files and need a quick way to deploy them all to your Snowflake warehouse.
Installation
$ pip install snowconvert-deploy-tool --upgrade
Usage
For information about the different parameters and options, run the tool with the -h option:
$ sc-deploy-db -h
usage: sc-deploy-db [-h] [-A ACCOUNT] [-D DATABASE] [-WH WAREHOUSE] [-R ROLE] [-U USER] [-P PASSWORD] [-W WORKSPACE] -I INPATH
[--activeConn ACTIVECONN] [--authenticator AUTHENTICATOR] [-L LOGPATH] [--SplitBefore SPLITBEFORE] [--SplitAfter SPLITAFTER]
[--ObjectType [OBJECTTYPE]]
SnowConvertStudio Deployment Script
===================================
This script helps you to deploy a collection of .sql files to a Snowflake Account.
The tool will look for settings like:
- Snowflake Account
- Snowflake Warehouse
- Snowflake Role
- Snowflake Database
If the tool can find a config_snowsql.ini file in the current directory or in the workspace\config_snowsql.ini location
it will read those parameters from there.
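For reference, a config_snowsql.ini might look like the following. This is a hypothetical sketch: the section and key names follow the snowsql connection-file convention and may differ for your setup.

```ini
; Hypothetical config_snowsql.ini sketch (snowsql-style connection file).
; Key names are illustrative; check your own configuration.
[connections.dev]
accountname = my_sf_account
username = my_user
warehousename = my_wh
rolename = my_role
dbname = my_db
```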
optional arguments:
-h, --help show this help message and exit
-A ACCOUNT, --Account ACCOUNT
Snowflake Account
-D DATABASE, --Database DATABASE
Snowflake Database
-WH WAREHOUSE, --Warehouse WAREHOUSE
Snowflake Warehouse
-R ROLE, --Role ROLE Snowflake Role
-U USER, --User USER Snowflake User
-P PASSWORD, --Password PASSWORD
Password
-W WORKSPACE, --Workspace WORKSPACE
Path for workspace root. Defaults to current dir
-I INPATH, --InPath INPATH
Path for SQL scripts
--activeConn ACTIVECONN
When given, it will be used to select connection parameters from config_snowsql.ini
--authenticator AUTHENTICATOR
Authenticator to use when you want a different authentication mechanism
-L LOGPATH, --LogPath LOGPATH
Path for process logs. Defaults to current dir
--SplitBefore SPLITBEFORE
Regular expression that can be used to split code in fragments starting **BEFORE** the matching expression
--SplitAfter SPLITAFTER
Regular expression that can be used to split code in fragments starting **AFTER** the matching expression
--ObjectType [OBJECTTYPE]
Object type to deploy: table, view, procedure, function, macro
This tool assumes:
- that you have a collection of .sql files under a directory. It will execute all of those .sql files against the specified database.
- that each file contains only one statement.
The tool can also read its values from environment variables. The following environment variables are recognized by this tool:
Variable Name | Description
---|---
SNOW_USER | The username that will be used for the connection
SNOW_PASSWORD | The password that will be used for the connection
SNOW_ROLE | The Snowflake role that will be used for the connection
SNOW_ACCOUNT | The Snowflake account name that will be used for the connection
SNOW_WAREHOUSE | The warehouse to use when running the SQL
SNOW_DATABASE | The database to use when running the SQL
Examples
If you have a folder structure like:
+ code
  + procs
      proc1.sql
  + tables
      table1.sql
  + folder1
      table2.sql
You can deploy them by running:
sc-deploy-db -A my_sf_account -WH my_wh -U user -P password -I code
If you want to use another authentication mechanism, such as Azure AD, you can do:
sc-deploy-db -A my_sf_account -WH my_wh -U user -I code --authenticator externalbrowser
A recommended approach is to set up a bash shell script, for example config.sh, with contents like:
export SNOW_ACCOUNT="migration.us-east-1"
export SNOW_WAREHOUSE="TIAA_WH"
export SNOW_ROLE="TIAA_FULL_ROLE"
export SNOW_DATABASE="TIAA"
echo "Reading User and Password. When you type, values won't be displayed"
read -s -p "User: " SNOW_USER
echo ""
read -s -p "Password: " SNOW_PASSWORD
echo ""
export SNOW_USER
export SNOW_PASSWORD
You can then run the script with: source config.sh. After that, you can just run sc-deploy-db -I folder-to-deploy
Files with multiple statements
If your files contain multiple statements, deployment will fail, because the Snowflake Python API does not allow multiple statements in a single call. To handle this, you can give the tool a regular expression that is used to split the file contents into fragments before sending them to the database. Use --SplitBefore to split before each match of the pattern, or --SplitAfter to split after each match.
Let's look at some examples.
If you have a file with contents like:
CREATE OR REPLACE SEQUENCE SEQ1 START WITH 1 INCREMENT BY 1;
/* <sc-table> TABLE1 </sc-table> */
CREATE TABLE TABLE1
(
   COL1 VARCHAR
);
You can use an argument like --SplitAfter ';', which will start a new fragment from the file each time a ; is found.
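The effect of --SplitAfter can be pictured with a small Python sketch. This is an illustration of the splitting behavior, not the tool's actual implementation:

```python
import re

def split_after(sql_text, pattern):
    """Illustrative only: cut sql_text into fragments, each ending
    right AFTER a match of the regular expression `pattern`."""
    fragments, last = [], 0
    for m in re.finditer(pattern, sql_text):
        fragments.append(sql_text[last:m.end()])
        last = m.end()
    if sql_text[last:].strip():          # keep any trailing fragment
        fragments.append(sql_text[last:])
    return fragments

sql = ("CREATE OR REPLACE SEQUENCE SEQ1 START WITH 1 INCREMENT BY 1;\n"
       "/* <sc-table> TABLE1 </sc-table> */\n"
       "CREATE TABLE TABLE1 (COL1 VARCHAR);")
fragments = split_after(sql, ";")
# each fragment ends right after a matched ";"
```

Each fragment would then be sent to the database as its own statement.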
If you have a file with statements like:
CREATE OR REPLACE TABLE TABLE1
(
   COL1 VARCHAR
);
/* <sc-table> TABLE2 </sc-table> */
CREATE TABLE TABLE2
(
   COL1 VARCHAR
);
You can use an argument like --SplitBefore 'CREATE (OR REPLACE)?', which will start a new fragment each time a CREATE or CREATE OR REPLACE is found.
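Likewise, --SplitBefore can be pictured with a Python sketch. Again, this is an illustration of the splitting behavior, not the tool's actual implementation:

```python
import re

def split_before(sql_text, pattern):
    """Illustrative only: cut sql_text into fragments, each starting
    right BEFORE a match of the regular expression `pattern`."""
    starts = [m.start() for m in re.finditer(pattern, sql_text)]
    if not starts or starts[0] != 0:
        starts.insert(0, 0)              # keep any leading fragment
    bounds = starts + [len(sql_text)]
    return [sql_text[a:b] for a, b in zip(bounds, bounds[1:])
            if sql_text[a:b].strip()]

sql = ("CREATE OR REPLACE TABLE TABLE1 (COL1 VARCHAR);\n"
       "/* <sc-table> TABLE2 </sc-table> */\n"
       "CREATE TABLE TABLE2 (COL1 VARCHAR);")
fragments = split_before(sql, r"CREATE (OR REPLACE )?")
# each fragment starts at a CREATE / CREATE OR REPLACE match
```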
Reporting issues and feedback
If you encounter any bugs with the tool, please file an issue in the Issues section of our GitHub repo.
License
sc-deploy-db is licensed under the MIT license.
Project details
Download files
Source Distribution
Built Distribution
Hashes for snowconvert-deploy-tool-0.0.7.tar.gz
Algorithm | Hash digest
---|---
SHA256 | 5dc9a240e1eebba44d1a9c47e822def22f6af8d66e6a90591f062d5c03fcd6fc
MD5 | bef15b3f9f59fa01de26f1e587d265d6
BLAKE2b-256 | ab2dcffe90c43bb875369d634f07a27d10d693ebb93be6db143621ecee3ac287
Hashes for snowconvert_deploy_tool-0.0.7-py2.py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 4a81fa820816914cc93a30a721ae1d4d2e05fbc083e2f70656ea53519ca70764
MD5 | abba8cb9d0a07c0a127c9d93aa57e62b
BLAKE2b-256 | 716c66f73909bc4db13dff5eb6f6c11d13b52f05064cabab77f72404c04f9a75