Utility functions for BAAR developers

Project description

Baarutil

This custom library is created specifically for developers/users of BAAR, a product of Allied Media Inc.

Authors:

Souvik Roy sroy-2019

Zhaoyu (Thomas) Xu xuzhaoyu

Additional Info:

The following string structure is the streamlined format that developers/users work with throughout an automation workflow designed in BAAR:

"Column_1__=__abc__$$__Column_2__=__def__::__Column_1__=__hello__$$__Column_2__=__world"

The available functions and examples are listed below:

1. read_convert(string), Output Data Type: list of dictionaries

Attributes:

i. string: Input String, Data Type = String

Input:  "Column_1__=__abc__$$__Column_2__=__def__::__Column_1__=__hello__$$__Column_2__=__world"
Output: [{"Column_1":"abc", "Column_2":"def"}, {"Column_1":"hello", "Column_2":"world"}]

2. write_convert(input_list), Output Data Type: string

Attributes:

i. input_list: List that contains the dictionaries of data, Data Type = List

Input:  [{"Column_1":"abc", "Column_2":"def"}, {"Column_1":"hello", "Column_2":"world"}]
Output: "Column_1__=__abc__$$__Column_2__=__def__::__Column_1__=__hello__$$__Column_2__=__world"

3. string_to_df(string, rename_cols, drop_dupes), Output Data Type: pandas DataFrame

Attributes:

i. string: Input String, Data Type = String

ii. rename_cols: Dictionary mapping old column names to new column names, Data Type = Dictionary, Default Value = {}

iii. drop_dupes: Drop duplicate rows from the final dataframe, Data Type = Bool, Default Value = False

Input:  "Column_1__=__abc__$$__Column_2__=__def__::__Column_1__=__hello__$$__Column_2__=__world"

Output:

Column_1  Column_2
abc       def
hello     world
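
A minimal usage sketch (import alias illustrative; the rename_cols mapping shown here is a made-up example):

import baarutil as bu  # assumed import alias

baar_string = "Column_1__=__abc__$$__Column_2__=__def__::__Column_1__=__hello__$$__Column_2__=__world"
df = bu.string_to_df(baar_string, rename_cols={"Column_1": "First_Col"}, drop_dupes=True)
print(df)  # pandas DataFrame with columns First_Col and Column_2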

4. df_to_string(input_df, rename_cols, drop_dupes), Output Data Type: string

Attributes:

i. input_df: Input DataFrame, Data Type = pandas DataFrame

ii. rename_cols: Dictionary mapping old column names to new column names, Data Type = Dictionary, Default Value = {}

iii. drop_dupes: Drop duplicate rows from the final dataframe, Data Type = Bool, Default Value = False

Input:

Column_1  Column_2
abc       def
hello     world

Output: "Column_1__=__abc__$$__Column_2__=__def__::__Column_1__=__hello__$$__Column_2__=__world"

5. df_to_listdict(input_df, rename_cols, drop_dupes), Output Data Type: list

Attributes:

i. input_df: Input DataFrame, Data Type = pandas DataFrame

ii. rename_cols: Dictionary mapping old column names to new column names, Data Type = Dictionary, Default Value = {}

iii. drop_dupes: Drop duplicate rows from the final dataframe, Data Type = Bool, Default Value = False

Input:

Column_1  Column_2
abc       def
hello     world

Output: [{"Column_1":"abc", "Column_2":"def"}, {"Column_1":"hello", "Column_2":"world"}]

6. decrypt_vault(encrypted_message, config_file), Output Data Type: string

Attributes:

i. encrypted_message: Encrypted BAAR Vault data, Data Type = string

ii. config_file: Keys that need to be provided by Allied Media.

This function can also be called from a Robot Framework script by importing the baarutil library and using the Decrypt Vault keyword. When initiated, this function sets the Log Level of the Robot Framework script to NONE for security reasons. Developers have to use Set Log Level INFO in the robot script in order to restart logging.

Input:  <<Encrypted Text>>
Output: <<Decrypted Text>>
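
A minimal usage sketch (the encrypted text and the config file name are placeholders; the real keys/config must come from Allied Media):

import baarutil as bu  # assumed import alias

encrypted_message = "<<Encrypted Text>>"  # placeholder for the encrypted vault data
config_file = "vault_config.json"         # placeholder for the keys provided by Allied Media
decrypted = bu.decrypt_vault(encrypted_message, config_file)
# decrypted now holds the <<Decrypted Text>>; handle it securely rather than logging it.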

7. generate_password(password_size, upper, lower, digits, symbols, exclude_chars), Output Data Type: string

Attributes:

i. password_size: Password Length, Data Type = int, Default Value = 10, (Should be greater than 4)

ii. upper: Are Uppercase characters required?, Data Type = Bool (True/False), Default Value = True

iii. lower: Are Lowercase characters required?, Data Type = Bool (True/False), Default Value = True

iv. digits: Are Digits characters required?, Data Type = Bool (True/False), Default Value = True

v. symbols: Are Symbols/ Special characters required?, Data Type = Bool (True/False), Default Value = True

vi. exclude_chars: List of characters to be excluded from the final password, Data Type = List, Default Value = []

This function can also be called from a Robot Framework script by importing the baarutil library and using the Generate Password keyword. When initiated, this function sets the Log Level of the Robot Framework script to NONE for security reasons. Developers have to use Set Log Level INFO in the robot script in order to restart logging.

Input (Optional):  <<Password Length>>, <<Uppercase Required?>>, <<Lowercase Required?>>, <<Digits Required?>>, <<Symbols Required?>>
Output: <<Password String>>
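
A minimal usage sketch (all arguments are optional; the values chosen here are illustrative):

import baarutil as bu  # assumed import alias

# 12-character password with no symbols, excluding a few ambiguous characters.
password = bu.generate_password(password_size=12, symbols=False, exclude_chars=["l", "1", "O", "0"])
print(len(password))  # 12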

8. generate_report(data_df, file_name, path, file_type, detailed_report, replace_old_file, final_file_name_case, time_stamp, encoding, index, engine, max_new_files_count, sheet_name), Output Data Type: Bool or Dictionary (based on the input value of detailed_report)

Attributes:

i. data_df: Input Dataframe, Data Type = pandas.DataFrame()

ii. file_name: Final file name, Data Type = str

iii. path: Final file path, Data Type = str, Default Value = Current working directory

iv. file_type: Final file extension/file type, Data Type = str, Default Value = 'csv', Available Options = 'csv' or 'xlsx'

v. detailed_report: Is detailed status of the run required?, Data Type = Bool (True/False), Default Value = False

vi. replace_old_file: Should the program replace the old file after each run or keep creating new files (only works if the final file name is the same each time)?, Data Type = Bool (True/False)

vii. final_file_name_case: Font case of the final file name, Data Type = str, Default Value = 'unchanged', Available Options = 'upper', 'lower', or 'unchanged'

viii. time_stamp: Time stamp at the end of the filename to make each file unique, Data Type = Bool (True/False), Default Value = False

ix. encoding: Encoding of the file, Data Type = str, Default Value = 'utf-8'

x. index: Include the DataFrame index in the final file?, Data Type = Bool (True/False), Default Value = False

xi. engine: Engine of the ExcelWriter for the pandas to_excel function, Data Type = str, Default Value = 'openpyxl'

xii. max_new_files_count: Maximum number of new files if replace_old_file is False, Data Type = int, Default Value = 100

xiii. sheet_name: Sheet name in the final Excel file, Data Type = str, Default Value = 'Sheet1'

This function can also be called from a Robot Framework script by importing the baarutil library and using the Generate Report keyword.

Input:  Mandatory arguments -> data_df, file_name
Output (if detailed_report==False):  True/ False
Output (if detailed_report==True):  {'file_path': <<Absolute path of the newly generated file>>, 'file_generation_status': True/ False, 'message': <<Detailed message>>, 'start_time': <<Start time when the function was initiated>>, 'end_time': <<End time when the function was completed>>} 
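
A minimal usage sketch (file name and the chosen options are illustrative; with detailed_report=True the return value is the status dictionary described above):

import pandas as pd
import baarutil as bu  # assumed import alias

df = pd.DataFrame({"Column_1": ["abc", "hello"], "Column_2": ["def", "world"]})
result = bu.generate_report(df, file_name="my_report", file_type="xlsx",
                            detailed_report=True, time_stamp=True)
print(result["file_generation_status"], result["file_path"])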

9. string_to_report(data, file_name, path, file_type, detailed_report, replace_old_file, final_file_name_case, time_stamp, encoding, index, engine, max_new_files_count, sheet_name, rename_cols, drop_dupes), Output Data Type: Bool or Dictionary (based on the input value of detailed_report)

Attributes:

i. data: Input BAAR string, Data Type = str

ii. file_name: Final file name, Data Type = str

iii. path: Final file path, Data Type = str, Default Value = Current working directory

iv. file_type: Final file extension/file type, Data Type = str, Default Value = 'csv', Available Options = 'csv' or 'xlsx'

v. detailed_report: Is detailed status of the run required?, Data Type = Bool (True/False), Default Value = False

vi. replace_old_file: Should the program replace the old file after each run or keep creating new files (only works if the final file name is the same each time)?, Data Type = Bool (True/False)

vii. final_file_name_case: Font case of the final file name, Data Type = str, Default Value = 'unchanged', Available Options = 'upper', 'lower', or 'unchanged'

viii. time_stamp: Time stamp at the end of the filename to make each file unique, Data Type = Bool (True/False), Default Value = False

ix. encoding: Encoding of the file, Data Type = str, Default Value = 'utf-8'

x. index: Include the DataFrame index in the final file?, Data Type = Bool (True/False), Default Value = False

xi. engine: Engine of the ExcelWriter for the pandas to_excel function, Data Type = str, Default Value = 'openpyxl'

xii. max_new_files_count: Maximum number of new files if replace_old_file is False, Data Type = int, Default Value = 100

xiii. sheet_name: Sheet name in the final Excel file, Data Type = str, Default Value = 'Sheet1'

xiv. rename_cols: Dictionary mapping old column names to new column names, Data Type = Dictionary, Default Value = {}

xv. drop_dupes: Drop duplicate rows from the final dataframe, Data Type = Bool, Default Value = False

This function can also be called from a Robot Framework script by importing the baarutil library and using the String To Report keyword.

Input:  Mandatory arguments -> data (BAAR String: Column_1__=__abc__$$__Column_2__=__def__::__Column_1__=__hello__$$__Column_2__=__world), file_name
Output (if detailed_report==False):  True/ False
Output (if detailed_report==True):  {'file_path': <<Absolute path of the newly generated file>>, 'file_generation_status': True/ False, 'message': <<Detailed message>>, 'start_time': <<Start time when the function was initiated>>, 'end_time': <<End time when the function was completed>>} 
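
A minimal usage sketch (file name illustrative; with detailed_report left at its default of False the return value is a simple True/False flag):

import baarutil as bu  # assumed import alias

baar_string = "Column_1__=__abc__$$__Column_2__=__def__::__Column_1__=__hello__$$__Column_2__=__world"
status = bu.string_to_report(baar_string, file_name="my_report")
print(status)  # True or False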

10. clean_directory(path, remove_directory), Output Data Type: Bool

Attributes:

i. path: Absolute paths of the target directories separated by "|", Data Type = str

ii. remove_directory: Should the nested directories be deleted?, Data Type = Bool (True/False), Default Value = False

This function can also be called from a Robot Framework script by importing the baarutil library and using the Clean Directory keyword.

Input:  "C:/Path1|C:/Path2|C:/Path3"
Output: True/False
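
A minimal usage sketch (paths are illustrative):

import baarutil as bu  # assumed import alias

# Clean three directories; nested directories are kept because remove_directory=False.
status = bu.clean_directory(path="C:/Path1|C:/Path2|C:/Path3", remove_directory=False)
print(status)  # True or False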

11. archive(source, destination, operation_type, dynamic_folder, dynamic_filename, custom_folder_name_prefix, timestamp_format), Output Data Type: Bool and string (completion flag and final destination path)

Attributes:

i. source: Absolute source path, Data Type = str

ii. destination: Absolute destination path, Data Type = str

iii. operation_type: What type of operation?, Data Type = str, Default Value = 'cut', Available Options = 'cut' or 'copy'

iv. dynamic_folder: Should a folder be created within the destination folder in which the archived files will be placed?, Data Type = Bool (True/False), Default Value = True

v. dynamic_filename: Should the archived files be renamed with a timestamp as a postfix?, Data Type = Bool (True/False), Default Value = False

vi. custom_folder_name_prefix: Name of the dynamic custom folder if dynamic_folder = True, Data Type = str, Default Value = 'Archive'

vii. timestamp_format: Format of the timestamp for the folder name/file name postfixes, Data Type = str, Default Value = '%d-%m-%Y_%H.%M.%S', Available Options = any Python datetime format

This function can also be called from a Robot Framework script by importing the baarutil library and using the Archive keyword.

Input:  source="C:/Path1", destination="C:/Path2"
Output1 (completion_flag), Output2 (final_destination): True/False, "C:/Path2/Archive_24-02-2022_17.44.07"
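
A minimal usage sketch (paths are illustrative; the two return values are unpacked as shown in the Output example above):

import baarutil as bu  # assumed import alias

completion_flag, final_destination = bu.archive(source="C:/Path1", destination="C:/Path2",
                                                operation_type="copy", dynamic_folder=True)
print(completion_flag, final_destination)
# True C:/Path2/Archive_<timestamp>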

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

baarutil-1.7.0.tar.gz (12.2 kB)

Uploaded Source

Built Distribution

baarutil-1.7.0-py3-none-any.whl (10.2 kB)

Uploaded Python 3

File details

Details for the file baarutil-1.7.0.tar.gz.

File metadata

  • Download URL: baarutil-1.7.0.tar.gz
  • Upload date:
  • Size: 12.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.7.9

File hashes

Hashes for baarutil-1.7.0.tar.gz
Algorithm Hash digest
SHA256 e7e354d2df872d0800cac92a432c583d5a5fc43e1f0e6917a55749eac25d5c7d
MD5 858f95b8564b215b1d9fca73c7bef569
BLAKE2b-256 241e5c53faaf32a00361d4b4cc19e98bbaacf167a67861f7de2d491550566967

File details

Details for the file baarutil-1.7.0-py3-none-any.whl.

File metadata

  • Download URL: baarutil-1.7.0-py3-none-any.whl
  • Upload date:
  • Size: 10.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.7.9

File hashes

Hashes for baarutil-1.7.0-py3-none-any.whl
Algorithm Hash digest
SHA256 de21f78e75d77483ccb2184b92031a9eaade35c643e44b151120064efc5cf7b9
MD5 6d25cf69fed53fb67f2e5203a53ee095
BLAKE2b-256 d010bf527e16633cfc4aa74e2bd5fb08bf5f8de74eab540e1711561a99c963b9
