This is a sub-modular package of developer utilities.

Project description

Deep Developer Utilities

This package provides developer utilities for data operations and data handling within the deepair environment.

Package structure

deepair_dev_utils
├── general
│   ├── __init__.py
│   └── tools.py
├── __init__.py
└── loader
    ├── __init__.py
    └── tools.py

2 directories, 5 files

Dependencies

Note: The following Python 3 packages are required for this package to run:

  • numpy
  • scipy
  • pandas
  • scikit-learn (imported as sklearn)
  • tqdm

Function Declarations

These are the signatures of the functions in the package that can be used with deepair-dev.

general.tools

Below are the functions that can be accessed by importing this module as from deepair_dev_utils.general.tools import <function_name>.

log:

def log(message):
    '''
        prints message on console
        input :
            message     : msg to print (string)
    '''

get_data:

def get_data(path):
    '''
        Single file loader function
        input :
            path     : abs path to load from (string)
    '''

daterange:

def daterange(s_date, e_date):
    '''
        To return a list of all the dates from
        start date to end date (excluding end date)
        input :
            s_date     : start date (datetime)
            e_date     : end date (datetime)
        returns :
            list of dates
    '''
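The behaviour described in the docstring can be sketched as follows. This is a minimal reimplementation for illustration under the stated contract (end date excluded), not the package's actual source:

```python
from datetime import datetime, timedelta

def daterange(s_date, e_date):
    # All dates from s_date up to, but not including, e_date.
    return [s_date + timedelta(days=n) for n in range((e_date - s_date).days)]

dates = daterange(datetime(2020, 1, 1), datetime(2020, 1, 4))
# 3 dates: 1, 2 and 3 January 2020 (the end date is excluded)
```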

Loader

This subpackage contains tools for loading data via the Handler class.

Handler

Below are the functions that can be accessed by importing this module as from deepair_dev_utils.loader.tools import Handler.

Then create an object to access the functions, e.g. obj = Handler(), and call obj.<function_name>.

__init__:

def __init__(self, verbose=True):
    '''
        Handler (class) constructor.
        inputs:
            verbose: Indicator for log and progress bar (bool)
    '''

loader:

def loader(self, dir_path, start_date, end_date,
           prefix='', postfix='', ext='.csv'):
    '''
        Primary loader function to load the data from start date to
        end date in concatenated (single dataframe) format.
        inputs:
            dir_path    : absolute path to the directory (string)
            start_date  : load start date in dd-mm-yyyy format (string)
            end_date    : load end date in dd-mm-yyyy format (string)
            prefix      : file prefix [if necessary] (string)
            postfix     : file postfix [if necessary] (string)
            ext         : file extension [default is .csv] (string)
        return:
            df:  loaded concatenated dataframe (pandas df)
    '''
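The parameters suggest that loader resolves one file per day named `<prefix><dd-mm-yyyy><postfix><ext>` inside dir_path, then concatenates the results. That naming convention is an assumption inferred from the signature, not confirmed by the package source; the sketch below only shows which paths such a call would resolve:

```python
from datetime import datetime, timedelta

def expected_files(dir_path, start_date, end_date, prefix='', postfix='', ext='.csv'):
    # Hypothetical reconstruction of the file-name pattern implied by the
    # parameters: one file per day named <prefix><dd-mm-yyyy><postfix><ext>.
    # Assumes the end date is excluded, matching daterange above.
    start = datetime.strptime(start_date, '%d-%m-%Y')
    days = (datetime.strptime(end_date, '%d-%m-%Y') - start).days
    return ['%s/%s%s%s%s' % (dir_path, prefix,
                             (start + timedelta(days=n)).strftime('%d-%m-%Y'),
                             postfix, ext)
            for n in range(days)]

files = expected_files('/data/bookings', '01-01-2020', '03-01-2020', prefix='sales_')
# ['/data/bookings/sales_01-01-2020.csv', '/data/bookings/sales_02-01-2020.csv']
```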

single_loader:

def single_loader(self, dir_path, start_date, end_date,
                  prefix='', postfix='', ext='.csv'):
    '''
        Single loader function to load the data from start date to
        end date in individual datewise (each dataframe is of one date)
        format.
        inputs:
            dir_path    : absolute path to the directory (string)
            start_date  : load start date in dd-mm-yyyy format (string)
            end_date    : load end date in dd-mm-yyyy format (string)
            prefix      : file prefix [if necessary] (string)
            postfix     : file postfix [if necessary] (string)
            ext         : file extension [default is .csv] (string)
        return:
            data:  list of data frames datewise (list)
    '''
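The difference between loader and single_loader is the return shape: one concatenated dataframe versus a list with one dataframe per date. A sketch with plain lists standing in for pandas DataFrames (the data values here are made up for illustration):

```python
# Plain lists stand in for pandas DataFrames to show the two return shapes.
tables = {
    '01-01-2020': [[1, 'a'], [2, 'b']],
    '02-01-2020': [[3, 'c']],
}

# loader: one concatenated table covering the whole date span
concatenated = [row for date in sorted(tables) for row in tables[date]]

# single_loader: a list with one table per date
per_date = [tables[date] for date in sorted(tables)]
```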

batch_loader:

def batch_loader(self, dir_path, start_date, end_date,
                 batch_size=1, prefix='', postfix='', ext='.csv'):
    '''
        Batch loader function to load the data from start date to
        end date in batches (each dataframe is in the form of batch datewise)
        format.
        inputs:
            dir_path    : absolute path to the directory (string)
            start_date  : load start date in dd-mm-yyyy format (string)
            end_date    : load end date in dd-mm-yyyy format (string)
            batch_size  : batch size (int)
            prefix      : file prefix [if necessary] (string)
            postfix     : file postfix [if necessary] (string)
            ext         : file extension [default is .csv] (string)
        return:
            data:  list of data frames datewise (list)
    '''
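The grouping that batch_loader implies — consecutive dates collected into batches of at most batch_size — can be sketched as a plain helper (again an illustration, not the package's code):

```python
def batches(items, batch_size=1):
    # Group consecutive items into lists of at most batch_size elements.
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

dates = ['01-01-2020', '02-01-2020', '03-01-2020', '04-01-2020', '05-01-2020']
groups = batches(dates, batch_size=2)
# [['01-01-2020', '02-01-2020'], ['03-01-2020', '04-01-2020'], ['05-01-2020']]
```

With batch_size=1 this degenerates to single_loader's per-date list.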

_load_action:

def _load_action(self, df):
    '''
        @abstractmethod
        User-defined bottleneck pipeline within load.
        NOTE -> By default this function is a pass-through, i.e. it does nothing
        inputs:
            df:  Dataframe to apply this method on (pandas df)
        return:
            df:  Modified dataframe (pandas df)
    '''
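Since _load_action is an abstract hook, the intended use is to subclass Handler and override it. The stub base class and the row-filtering logic below are illustrative only; in the real package the argument would be a pandas DataFrame:

```python
class Handler:
    # Stub standing in for deepair_dev_utils.loader.tools.Handler,
    # so the sketch stays self-contained.
    def _load_action(self, df):
        # Default behaviour: pass the data through unchanged.
        return df

class CleaningHandler(Handler):
    def _load_action(self, df):
        # Illustrative override: drop rows containing missing values.
        # Plain lists of rows stand in for a pandas DataFrame here.
        return [row for row in df if None not in row]

rows = [[1, 'a'], [2, None], [3, 'c']]
cleaned = CleaningHandler()._load_action(rows)  # [[1, 'a'], [3, 'c']]
```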

Download files

Download the file for your platform.

Source Distribution

deepair_dev_utils-0.0.2.tar.gz (4.4 kB)

Uploaded Source

Built Distribution

deepair_dev_utils-0.0.2-py3-none-any.whl (6.1 kB)

Uploaded Python 3

File details

Details for the file deepair_dev_utils-0.0.2.tar.gz.

File metadata

  • Download URL: deepair_dev_utils-0.0.2.tar.gz
  • Upload date:
  • Size: 4.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.0.0 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/2.7.12

File hashes

Hashes for deepair_dev_utils-0.0.2.tar.gz
  • SHA256: 7d95a772162b55c13ed2df979c96c9e02cfa1a6eaefd6988e276292ba207f9d2
  • MD5: 22fd699dccc2aae88efc1d1c58e3ff6c
  • BLAKE2b-256: 4845ca64fb59ac90f1249993be8c98be8ff929f6587f18fb4d123aa0ece144f3

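A downloaded distribution can be checked against the published SHA256 digest with Python's standard hashlib module. A small sketch (the file name is taken from the listing above; run it against your own download):

```python
import hashlib

def sha256_of(path):
    # Stream the file in chunks so large archives are not loaded into memory.
    digest = hashlib.sha256()
    with open(path, 'rb') as fh:
        for chunk in iter(lambda: fh.read(8192), b''):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the SHA256 value published above:
# sha256_of('deepair_dev_utils-0.0.2.tar.gz')
```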

File details

Details for the file deepair_dev_utils-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: deepair_dev_utils-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 6.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.0.0 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/2.7.12

File hashes

Hashes for deepair_dev_utils-0.0.2-py3-none-any.whl
  • SHA256: 640efefcccfa8e3fa9e255662c31798ed43e225526086fdb5c7bc87f4cdf8ac0
  • MD5: 9167dbeeeb8deddc58499602db4f8c51
  • BLAKE2b-256: 5bd82ff64fd704bf8145f6a5859975ae8192fadc7a7413720b9e6f1379c88ad9
