A collection of simple but powerful utilities for Data Analytics and Data Science workflows.

Project description

nimble toolkit

A collection of simple but powerful utilities for ML and Data Analytics workflows.

Installation

pip install nimble-tk

Examples
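
The examples below refer to the toolkit as ntk. Assuming the importable module name matches the distribution name (an assumption; check the project's documentation), the import looks like:

import nimble_tk as ntk  # assumed module name, matching the nimble_tk distribution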

Logging

The cells below demonstrate some utility wrappers around Python's logging functionality. The following line creates a log file at /tmp/try_nimble.log:

  • As logs are written, the log file grows up to a maximum of 50 MB by default and is then rolled over.
  • At most 10 such files are kept before the earliest one is removed.
  • Logs are also written to the console (in this case, the notebook console).
ntk.init_file_logger(log_file_path='/tmp/try_nimble.log', console_log_on=True)

Output:

2023-12-09 11:10:43.270595 :: MainThread ::  Log file init at /tmp/try_nimble.log
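
For context, rotation behaviour like the one described above (roughly 50 MB per file, up to 10 rolled-over files, plus console output) is what Python's standard library provides via logging.handlers.RotatingFileHandler. A minimal stdlib-only sketch, not nimble-tk's actual implementation:

import logging
from logging.handlers import RotatingFileHandler

logger = logging.getLogger("try_nimble")
logger.setLevel(logging.INFO)

# rotate the file at ~50 MB, keeping up to 10 rolled-over files
file_handler = RotatingFileHandler("/tmp/try_nimble.log", maxBytes=50 * 1024 * 1024, backupCount=10)
# also echo logs to the console / notebook output
console_handler = logging.StreamHandler()

for handler in (file_handler, console_handler):
    handler.setFormatter(logging.Formatter("%(asctime)s %(threadName)s %(filename)s:%(lineno)d:  %(message)s"))
    logger.addHandler(handler)

logger.info("some log message")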

Example of an info log:
ntk.log_info("some log message")

Output:

2023-12-09 11:10:43.289434 :: MainThread ::  some log message

Example of an error log:
Catching an exception and logging the stack trace:
try:
    tmp = 1/0
except:
    ntk.log_traceback("Error while running operation")

Output:

2023-12-09 11:10:43.342127 :: MainThread ::  Error while running operation :: ZeroDivisionError: division by zero ::
 Traceback (most recent call last):
  File "/tmp/ipykernel_3709/4084614229.py", line 2, in <module>
    tmp = 1/0
ZeroDivisionError: division by zero
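
The stack-trace text above is something Python's standard logging machinery can capture on its own; a minimal stdlib sketch of logging a caught exception with its traceback (not nimble-tk's code):

import logging

try:
    tmp = 1 / 0
except ZeroDivisionError:
    # exc_info=True attaches the active exception's traceback to the log record
    logging.getLogger("try_nimble").error("Error while running operation", exc_info=True)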

All logs are also written to the log file:
! tail -10 /tmp/try_nimble.log

Output:

2023-12-09 11:10:43,268 MainThread logger.py:88:  Log file init at /tmp/try_nimble.log
2023-12-09 11:10:43,284 MainThread logger.py:88:  some log message
2023-12-09 11:10:43,311 MainThread logger.py:88:  some error message
2023-12-09 11:10:43,334 MainThread logger.py:88:  Error while running operation :: ZeroDivisionError: division by zero ::
 Traceback (most recent call last):
  File "/tmp/ipykernel_3709/4084614229.py", line 2, in <module>
    tmp = 1/0
ZeroDivisionError: division by zero

Logs can also be written directly to the log file without being echoed to the console. This helps when we do not want to flood the console with a large volume of logs.
ntk.log_info_file("some log message")
ntk.log_error_file("some error message")

try:
    tmp = 1/0
except:
    ntk.log_traceback_file("Error while running operation")

! tail -7 /tmp/try_nimble.log

Output:

2023-12-09 11:10:43,501 MainThread logger.py:88:  some log message
2023-12-09 11:10:43,505 MainThread logger.py:88:  some error message
2023-12-09 11:10:43,510 MainThread logger.py:88:   :: ZeroDivisionError: division by zero ::
 Traceback (most recent call last):
  File "/tmp/ipykernel_3709/3515687902.py", line 5, in <module>
    tmp = 1/0
ZeroDivisionError: division by zero

Concurrent Processing

The ntk.run_concurrently(...) method provides a wrapper around Python's multiprocessing and multithreading functionality.

Set up a list of functions to execute:

import numpy as np
import pandas as pd

def analytic_function(customer_id):
    ntk.log_info_file(f"Running analytic_function for customer_id: {customer_id}")

    if customer_id == 0:
        # raising an error here to demonstrate error handling
        raise ValueError(f"Invalid customer id {customer_id}")

    # - Query the DB
    # - Do feature engineering
    # - Run other analytics

    # - Write the result data to disk or return a DF
    return pd.DataFrame({
        'CUSTOMER_ID': [customer_id]*5,
        'OTHER_DATA': np.random.randint(0, 9, 5)
    })

functions = []
for customer_id in range(10):
    functions.append((analytic_function, {'customer_id': customer_id}))

Run the functions concurrently with the following lines of code:

results, errors = ntk.run_concurrently(functions, max_workers=ntk.get_num_cpus(), fork=True)

df_result = pd.concat([result[1] for result in results])
df_result.head()

fork=True launches multiple processes and is recommended for CPU-bound tasks.

fork=False launches multiple threads and should be used for IO-bound tasks (a short sketch follows the output below).


Output:
2023-12-09 11:10:46.767042 :: MainThread ::  Ran: 10 functions - Successful: 9, Failed: 1
CUSTOMER_ID OTHER_DATA
1 3
1 0
1 3
1 1
1 3
... ...
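
For IO-bound work, the same pattern can be run with threads instead of processes. A minimal sketch assuming the same run_concurrently signature shown above (io_bound_function below is purely illustrative):

import time

def io_bound_function(customer_id):
    # stand-in for a network or DB call that mostly waits on IO
    time.sleep(1)
    return customer_id

functions = [(io_bound_function, {'customer_id': customer_id}) for customer_id in range(10)]

# fork=False runs the functions in threads, which suits IO-bound tasks
results, errors = ntk.run_concurrently(functions, max_workers=10, fork=False)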

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nimble_tk-0.0.1.tar.gz (23.2 kB)


Built Distribution

nimble_tk-0.0.1-py3-none-any.whl (20.7 kB)


File details

Details for the file nimble_tk-0.0.1.tar.gz.

File metadata

  • Download URL: nimble_tk-0.0.1.tar.gz
  • Upload date:
  • Size: 23.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.12

File hashes

Hashes for nimble_tk-0.0.1.tar.gz:
  • SHA256: 6859e75bc184d97b28cc9c6a9684e1ea66d68a419bf5df953762551fa0720e29
  • MD5: fde7fb41bfa5544fdd793caa5d427fe7
  • BLAKE2b-256: eb93aa93f189784edd7c3f1811aa8531d5388a1aa12a6505c2d88430df5cf0e5

See more details on using hashes here.
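
To check a downloaded file against the published digest, a short Python sketch using only the standard library (the local file path is illustrative):

import hashlib

# path to the downloaded sdist (adjust to wherever it was saved)
path = "nimble_tk-0.0.1.tar.gz"
expected_sha256 = "6859e75bc184d97b28cc9c6a9684e1ea66d68a419bf5df953762551fa0720e29"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected_sha256 else "hash mismatch")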

File details

Details for the file nimble_tk-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: nimble_tk-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 20.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.12

File hashes

Hashes for nimble_tk-0.0.1-py3-none-any.whl:
  • SHA256: 4174963ee8fdcf8fa3e2777c28a1ed272f82f42f3010726d1294e706a6041601
  • MD5: 92320d0943cb6ecc3d71f37654eb7b4c
  • BLAKE2b-256: ecc7e71fd3f46c12b23bc3e66c17d91f3020df01c59f4d95e58195827f861244

See more details on using hashes here.
