A package to execute Copy and Stored Procedure tasks on data
Project description
DataPipelineExecutor
This package contains a pipeline that executes two tasks: copying data from a source (Kusto cluster database) to a sink (SQL Server database), and executing a simple SQL stored procedure on the copied data. Both tasks are fully configurable, and the pipeline has the following key features:
- Restartability, with configurable restart behavior.
- Logging - in addition to console output, you can pass a dedicated file path as a parameter if you wish to store the logs. Otherwise a logger.log file is created in the same directory and all logs are stored there until deleted.
Installation
Use pip to install the package:
pip install DataPipelineExecutor
Run the following commands to install dependencies:
pip install numpy
pip install pandas
pip install azure-kusto-data
pip install azure-kusto-ingest
pip install pyodbc
pip install papermill
Usage
To execute the pipeline, run the following:
from DataPipelineExecutor import main
main('config.txt', 'logger.log')
# OR
main('config.txt')
Note that the logger file is optional; if no parameter is passed, it will be created automatically in the same directory.
Configuration
Three config files are required for this pipeline to run (all use standard INI syntax; see the sketch after this list):
- primary_config: This file contains the parameters (refer to the format below) needed for task execution; its path is the first argument passed to main.
- source_config: This file contains the parameters needed to establish a connection to the source server. Its path is passed via the SourceConfig parameter in the primary_config file.
- sink_config: This file contains the parameters needed to establish a connection to the sink server. Its path is passed via the SinkConfig parameter in the primary_config file.
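Since the files follow standard INI syntax, they can be inspected with Python's built-in configparser. A minimal sketch, assuming the file name from the Usage section (the package's own parsing may differ):

import configparser

# Read the primary config and locate the source/sink config paths
# declared in its [CopyData] section.
primary = configparser.ConfigParser()
primary.read('config.txt')
source_path = primary['CopyData']['SourceConfig']
sink_path = primary['CopyData']['SinkConfig']

# The source config can be read the same way.
source = configparser.ConfigParser()
source.read(source_path)
print(source['Kusto']['Cluster'])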
Format for the primary_config:
[Watermark]
Table_Name =
Column1 =
Column2 =
watermark_col_basis_name =
DateTime = 1900-01-01
[CopyData]
SourceType =
SinkType =
SourceConfig =
SinkConfig =
SQLTable =
KustoTable =
Query =
BatchSize =
[TaskSet]
Sequence =
[StoredProcedure]
ProcedureName =
ParameterName =
ParameterType =
TargetColumn =
TargetValue =
[Notebook]
Path =
OutputPath =
Param1 =
Param2 =
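The [Notebook] section lists papermill-style inputs, and papermill is one of the listed dependencies, so the task presumably boils down to a papermill call. A hedged sketch with illustrative values (not necessarily the package's exact internals):

import papermill as pm

# Run the notebook at Path, writing the executed copy to OutputPath;
# Param1/Param2 are injected into the notebook's parameters cell.
pm.execute_notebook(
    'pipeline_notebook.ipynb',   # Path (illustrative value)
    'pipeline_output.ipynb',     # OutputPath (illustrative value)
    parameters={'Param1': 'value1', 'Param2': 'value2'},
)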
Format for source_config:
[Kusto]
Cluster =
Database =
ClientID =
ClientSecret =
AuthorityID =
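These fields correspond to AAD application (service principal) authentication in the azure-kusto-data client listed under dependencies. A minimal sketch of how such a connection is typically built, with illustrative values (the package's internal connection logic may differ):

from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Build a connection string from the [Kusto] section values.
kcsb = KustoConnectionStringBuilder.with_aad_application_key_authentication(
    'https://mycluster.kusto.windows.net',  # Cluster
    'client-id',                            # ClientID
    'client-secret',                        # ClientSecret
    'tenant-id',                            # AuthorityID (AAD tenant)
)
client = KustoClient(kcsb)
response = client.execute('MyDatabase', 'MyTable | take 10')  # Database, Query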
Format for sink_config:
[SQL]
Server =
Database =
Username =
Password =
Driver =
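These fields map directly onto a standard pyodbc connection string. A minimal sketch with illustrative values (again, not necessarily the package's exact internals):

import pyodbc

# Assemble a SQL Server connection string from the [SQL] section values.
conn = pyodbc.connect(
    'DRIVER={ODBC Driver 17 for SQL Server};'   # Driver
    'SERVER=myserver.database.windows.net;'     # Server
    'DATABASE=MyDatabase;'                      # Database
    'UID=my-username;'                          # Username
    'PWD=my-password'                           # Password
)
cursor = conn.cursor()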
Source distribution: hashes for DataPipelineExecutor-0.0.3.tar.gz

Algorithm | Hash digest
---|---
SHA256 | ec53a463752228045842ad9623f79f39e7bda7cd3147f7fe9568ccde8f1f0dcd
MD5 | a35f4e61db61c521a6f26b1ef121bcda
BLAKE2b-256 | 1c6d32132373f27325ca4444540a4d280147b8c44b48700ddec62b661911657c

Built distribution: hashes for DataPipelineExecutor-0.0.3-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | b91c08678d39b6accb6613756607c6c19e1c8ce40ae24d6c6533cb267a21ed07
MD5 | 5629aea62a491e660c58e879f658eb38
BLAKE2b-256 | 35b813d59d1888ea61d2b9e63c543dc65ab1b8ad619d55509e23ab1e072a3019