
This is a sub-modular package of developer utilities.

Project description

Deep Developer Utilities

This package consists of developer utilities specifically used for data operations and handling within the deepair environment.

Package structure

deepair_dev_utils
├── general
│   ├── __init__.py
│   └── tools.py
├── __init__.py
└── loader
    ├── __init__.py
    └── tools.py

2 directories, 5 files

Dependencies

Note: the following Python 3 packages are required for this package to run:

  • numpy
  • scipy
  • pandas
  • scikit-learn (imported as sklearn)
  • tqdm

Function Declarations

Below are the signatures of the functions in this package that can be used with deepair-dev.

general

Below are the functions that can be accessed by importing this module as from deepair_dev_utils.general.tools import <function_name>.

log:

def log(message):
    '''
        prints message on console
        input :
            message     : msg to print (string)
    '''

get_data:

def get_data(path):
    '''
        Single file loader function
        input :
            path     : abs path to load from (string)
    '''

daterange:

def daterange(s_date, e_date):
    '''
        To return a list of all the dates from
        start date to end date (excluding end date)
        input :
            s_date     : start date (datetime)
            e_date     : end date (datetime)
        returns :
            list of dates
    '''
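The docstring above implies an exclusive end date. A minimal sketch of that behavior, assuming both arguments are datetime objects (this is an illustration, not the package's actual implementation):

```python
from datetime import datetime, timedelta

def daterange(s_date, e_date):
    """Return all dates from s_date up to, but excluding, e_date."""
    n_days = (e_date - s_date).days
    return [s_date + timedelta(days=i) for i in range(n_days)]

# three dates: Jan 1, 2 and 3 (Jan 4 is excluded)
dates = daterange(datetime(2020, 1, 1), datetime(2020, 1, 4))
```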

jsonReader:

def jsonReader(path):
    '''
        JSON File Reader (from absolute path).
        Args:
            path   : absolute path of json file (string)
        Return:
            data   : loaded JSON
    '''

jsonWriter:

def jsonWriter(data, path):
    '''
        JSON File Writer (to absolute path).
        Args:
            data   : data to write (JSON/DICT/STRING)
            path   : absolute path of json file (string)
    '''
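A round-trip sketch of the two JSON helpers above, built on the standard-library json module (the package's internal implementation may differ, e.g. in encoding or indentation options):

```python
import json
import os
import tempfile

def jsonWriter(data, path):
    """Write data (dict/list/str) as JSON to the given absolute path."""
    with open(path, 'w') as f:
        json.dump(data, f)

def jsonReader(path):
    """Read and return JSON data from the given absolute path."""
    with open(path) as f:
        return json.load(f)

# round trip through a temporary file
path = os.path.join(tempfile.mkdtemp(), 'sample.json')
jsonWriter({'carrier': 'XX', 'fare': 42.0}, path)
data = jsonReader(path)
```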

ddmmyyyy2datetime:

def ddmmyyyy2datetime(start_date):
    '''
        Convert dd-mm-yyyy to std data time format.
        Args:
            start_date   : date with dd-mm-yyyy (string)
        Return:
            date   : converted format
    '''
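Assuming the dd-mm-yyyy string uses "-" as the separator (the docstring suggests this but does not state it), the conversion reduces to a single strptime call:

```python
from datetime import datetime

def ddmmyyyy2datetime(start_date):
    """Convert a 'dd-mm-yyyy' string to a standard datetime object."""
    return datetime.strptime(start_date, '%d-%m-%Y')

date = ddmmyyyy2datetime('25-12-2019')
```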

Below are the decorators that can be accessed by importing this module as from deepair_dev_utils.general.decorators import <decorator_name>.

function_logger:

def function_logger(orig_func):
    '''
        Create a file with function.log (if possible)
        otherwise with unknown_function.log and record
        the arguments passed for the function

        example:
        @function_logger
        def target_function(...):
            ...
    '''
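A sketch of what such a decorator might look like; the exact log format and fallback behavior of the real function_logger are not documented, so treat the details below as illustrative:

```python
import functools
import os
import tempfile

def function_logger(orig_func):
    """Append the arguments of each call to <function name>.log."""
    @functools.wraps(orig_func)
    def wrapper(*args, **kwargs):
        try:
            log_name = '{}.log'.format(orig_func.__name__)
        except AttributeError:
            log_name = 'unknown_function.log'
        with open(log_name, 'a') as f:
            f.write('args: {}, kwargs: {}\n'.format(args, kwargs))
        return orig_func(*args, **kwargs)
    return wrapper

os.chdir(tempfile.mkdtemp())  # keep the demo log file out of the working tree

@function_logger
def target_function(x, y=1):
    return x + y

result = target_function(2, y=3)
with open('target_function.log') as f:
    logged = f.read()
```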

function_timer:

def function_timer(orig_func):
    '''
        Displays runtime on console

        example:
        @function_timer
        def target_function(...):
            ...
    '''
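The timing decorator follows the standard wrap-and-measure pattern; a self-contained sketch (the real decorator's console message may be worded differently):

```python
import functools
import time

def function_timer(orig_func):
    """Print how long the wrapped function took to run."""
    @functools.wraps(orig_func)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = orig_func(*args, **kwargs)
        print('{} ran in {:.4f} sec'.format(orig_func.__name__,
                                            time.time() - start))
        return result
    return wrapper

@function_timer
def target_function(n):
    return sum(range(n))

total = target_function(1000)
```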

Loader

This subpackage contains tools for loading data via the Handler class.

Handler

Below are the functions that can be accessed by importing this module as from deepair_dev_utils.loader.tools import Handler.

Then create an object to access the functions, e.g. obj = Handler(), and then call obj.<function_name>.

__init__:

def __init__(self, verbose=True):
    '''
        Handler (class) constructor.
        inputs:
            verbose: Indicator for log and progress bar (bool)
    '''

loader:

def loader(self, dir_path, start_date, end_date,
           prefix='', postfix='', ext='.csv'):
    '''
        Primary loader function to load the data from start date to
        end date in concatenated (single dataframe) format.
        inputs:
            dir_path    : absolute path to the data directory (string)
            start_date  : load start date in dd-mm-yyyy format (string)
            end_date    : load end date in dd-mm-yyyy format (string)
            prefix      : file prefix [if necessary] (string)
            postfix     : file postfix [if necessary] (string)
            ext         : file extension [default is .csv] (string)
        return:
            df:  loaded concatenated dataframe (pandas df)
    '''
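The exact file-naming scheme the loader expects is not documented. The sketch below assumes one file per day named <prefix><dd-mm-yyyy><postfix><ext> inside dir_path, and shows the list of paths such a loader would read and concatenate; expected_paths is a hypothetical helper, not part of the package:

```python
import os
from datetime import datetime, timedelta

def expected_paths(dir_path, start_date, end_date,
                   prefix='', postfix='', ext='.csv'):
    """List one file path per day from start_date to end_date (inclusive),
    assuming files are named <prefix><dd-mm-yyyy><postfix><ext>."""
    start = datetime.strptime(start_date, '%d-%m-%Y')
    end = datetime.strptime(end_date, '%d-%m-%Y')
    paths = []
    for i in range((end - start).days + 1):
        day = (start + timedelta(days=i)).strftime('%d-%m-%Y')
        paths.append(os.path.join(dir_path, prefix + day + postfix + ext))
    return paths

paths = expected_paths('/data/fares', '01-03-2020', '03-03-2020',
                       prefix='fares_')
```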

loader_v2:

def loader_v2(self, dir_path, start_date, end_date,
              prefix='', postfix='', ext='.csv'):
    '''
        (VERSION 2)
        Primary loader function to load the data from start date to
        end date in concatenated (single dataframe) format.
        inputs:
            dir_path    : absolute path to the data directory (string)
            start_date  : load start date in yyyy-mm-dd format (string)
            end_date    : load end date in yyyy-mm-dd format (string)
            prefix      : file prefix [if necessary] (string)
            postfix     : file postfix [if necessary] (string)
            ext         : file extension [default is .csv] (string)
        return:
            df:  loaded concatenated dataframe (pandas df)
    '''

single_loader:

def single_loader(self, dir_path, start_date, end_date,
                  prefix='', postfix='', ext='.csv'):
    '''
        Single loader function to load the data from start date to
        end date in individual datewise (each dataframe is of one date)
        format.
        inputs:
            dir_path    : absolute path to the data directory (string)
            start_date  : load start date in dd-mm-yyyy format (string)
            end_date    : load end date in dd-mm-yyyy format (string)
            prefix      : file prefix [if necessary] (string)
            postfix     : file postfix [if necessary] (string)
            ext         : file extension [default is .csv] (string)
        return:
            data:  list of data frames datewise (list)
    '''

batch_loader:

def batch_loader(self, dir_path, start_date, end_date,
                 batch_size=1, prefix='', postfix='', ext='.csv'):
    '''
        Batch loader function to load the data from start date to
        end date in batches (each dataframe is in the form of batch datewise)
        format.
        inputs:
            dir_path    : absolute path to the data directory (string)
            start_date  : load start date in dd-mm-yyyy format (string)
            end_date    : load end date in dd-mm-yyyy format (string)
            batch_size  : batch size (int)
            prefix      : file prefix [if necessary] (string)
            postfix     : file postfix [if necessary] (string)
            ext         : file extension [default is .csv] (string)
        return:
            data:  list of data frames datewise (list)
    '''

_load_action:

def _load_action(self, df):
    '''
        @abstractmethod
        User-defined bottleneck pipeline within load.
        NOTE -> by default this function is a pass-through, i.e. it does nothing
        inputs:
            df:  Dataframe to apply this method on (pandas df)
        return:
            df:  Modified dataframe (pandas df)
    '''
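Subclassing Handler and overriding _load_action lets you inject a per-file transform into every load. The sketch below uses a minimal stand-in Handler and plain dicts instead of a pandas DataFrame to stay dependency-free; the real class lives in deepair_dev_utils.loader.tools:

```python
class Handler:
    """Minimal stand-in for deepair_dev_utils.loader.tools.Handler."""
    def _load_action(self, df):
        return df  # default: pass the data through unchanged

class DropNaHandler(Handler):
    def _load_action(self, df):
        # example per-file step: keep only rows without missing values
        return [row for row in df if None not in row.values()]

rows = [{'fare': 10.0}, {'fare': None}, {'fare': 12.5}]
clean = DropNaHandler()._load_action(rows)
```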

Project details


Download files


Source Distribution

deepair_dev_utils-0.0.7.tar.gz (5.6 kB view details)

Uploaded Source

Built Distribution

deepair_dev_utils-0.0.7-py3-none-any.whl (8.2 kB view details)

Uploaded Python 3

File details

Details for the file deepair_dev_utils-0.0.7.tar.gz.

File metadata

  • Download URL: deepair_dev_utils-0.0.7.tar.gz
  • Upload date:
  • Size: 5.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.5.0 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/2.7.12

File hashes

Hashes for deepair_dev_utils-0.0.7.tar.gz
Algorithm Hash digest
SHA256 4428c7e09bc9da531bea84f9a64d39c295cc92ec0179997c85684eeb4c6d6c72
MD5 f102e4ee4e3ebca59dc730d15fb75569
BLAKE2b-256 3b86b44c5f43d2feab93f675e0bd86bc11d33fea67461eaa445f659fe79687d8


File details

Details for the file deepair_dev_utils-0.0.7-py3-none-any.whl.

File metadata

  • Download URL: deepair_dev_utils-0.0.7-py3-none-any.whl
  • Upload date:
  • Size: 8.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.5.0 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/2.7.12

File hashes

Hashes for deepair_dev_utils-0.0.7-py3-none-any.whl
Algorithm Hash digest
SHA256 f02b76649ec92dbd6059dbd8b25eb62f249bb5016e4050492f8a8a2d1ee51261
MD5 4fdb78f309d934b50c19a3b57c40885e
BLAKE2b-256 7a199bf3fc2fe8c27af85f9a438697e697590fd2abee6eb090d86239cb7bfe33

