

Common Python Helper Functions

This library contains reusable code shared across all the EASIER-AI projects written in Python.

Using the library

Install

This library is available through the pip package manager. To install it, run:

pip install easierai-common-functions

Importing

The library needs to be imported before it can be used:

import common_functions.helpers as helpers

from minio import Minio

from common_functions.logger import Logger

Then, some configuration is required:

helpers.config = helpers.read_config_file(config_file_path)

helpers.minioClient = Minio(minio_host + ':' + minio_port, minio_access, minio_secret, secure=False)

helpers._logger = Logger('helpers', 'helpers.py')

If you wish to check the validity of the configuration provided for an inferencer, you can use the following method (it will warn at startup if there are no valid models):

helpers.check_initial_config(eslib, False)

where eslib is an initialized instance of the Elasticsearch client library.

Required environment variables

This library reads the following environment variables (a setup sketch follows the list):

  • LOGSTASH_HOST: IP address or hostname of the Logstash service that receives the logs
  • LOGSTASH_PORT: port on which the Logstash service is listening
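
For example, assuming a Logstash instance is reachable at logstash.example.com on port 5000 (both values are hypothetical), the variables can be set from Python before the library is initialized:

import os

os.environ['LOGSTASH_HOST'] = 'logstash.example.com'  # hypothetical Logstash host
os.environ['LOGSTASH_PORT'] = '5000'                   # hypothetical Logstash TCP port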

Usage

The library provides the following functions:

get_data_shape(data_type, num_features, num_samples, algorithm)

Outputs the data_shape required according to the parameters passed.
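
For illustration only (the data_type and algorithm values below are hypothetical placeholders; the accepted values are defined by the library):

data_shape = helpers.get_data_shape('timeseries', 3, 100, 'lstm')  # hypothetical data_type and algorithm, 3 features, 100 samples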

importer(algorithm, inference_type=constants.ESTIMATOR, lr=0.001)

Returns a predictor instance according to the parameters passed.
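
A usage sketch, assuming 'lstm' is one of the algorithm names shipped with the library's model definitions (the name itself is a hypothetical placeholder):

import common_functions.constants as constants

predictor = helpers.importer('lstm', inference_type=constants.ESTIMATOR, lr=0.001)  # 'lstm' is hypothetical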

load_model_file(eslib, id, inference_type=constants.ESTIMATOR)

Returns the document stored on Elasticsearch, which includes, among other things, the h5 and pkl files where the model and the scalers were saved, respectively.

  • id: id of the entity
  • inference_type: used to download the right model according to its features and parameters
  • returns: dict with the format {"extension": object, "extension2": object2}
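
A sketch of loading a stored model, assuming eslib is an initialized Elasticsearch client and entity_id is a hypothetical identifier; the keys 'h5' and 'pkl' follow the extension-keyed dict format described above:

files = helpers.load_model_file(eslib, entity_id)

model_obj = files.get('h5')    # the stored model, if present
scaler_obj = files.get('pkl')  # the stored scalers, if present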

read_config_file(config_path)

Initializes the config variable

check_initial_config(eslib, is_classifier)

Checks if there is any model in the database that matches the configuration provided.

scale_dataset(scaler, data, i, ft_range=(-1, 1), training=True)

Scales data (in np.array or list format) using MinMaxScaler and the given ft_range (default is (-1, 1)).

  • data: array of data to be scaled
  • i: feature corresponding to the scaler
  • ft_range: tuple containing the minimum and maximum values of the scaled data
  • returns: tuple of the scaler and the scaled data
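
A minimal sketch, assuming scikit-learn's MinMaxScaler and a single feature at index 0; the data values are illustrative, and passing a fresh scaler instance when training is an assumption:

import numpy as np
from sklearn.preprocessing import MinMaxScaler

raw = np.array([1.0, 2.0, 3.0, 4.0])  # illustrative values for feature 0
scaler, scaled = helpers.scale_dataset(MinMaxScaler(), raw, 0, ft_range=(-1, 1), training=True)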

compose_model_params(is_classifier)

Composes a JSON object with the parameters of a trained model, to be stored in the database.

compose_model_params_filter(is_classifier)

Composes a JSON object with the parameters from the config file, used to look for matching models in the database.
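
For example, with helpers.config already initialized via read_config_file (is_classifier simply flags whether the model is a classifier):

model_params = helpers.compose_model_params(is_classifier=False)         # metadata of the trained model
model_filter = helpers.compose_model_params_filter(is_classifier=False)  # query filter built from the config file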

save_model(eslib, id, metadata, dict={}, inference_type=constants.ESTIMATOR, _id=None, save_tflite=False, calibration_data=None)

Saves the model-related files after training. It can also save the model in tflite format. The dict parameter should come in the format {"extension1": object1, "extension2": object2, ...}. calibration_data is only used when saving a model in tflite format and should be a representative sample of the dataset.
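
A hedged sketch of saving a trained model together with its fitted scaler; trained_model and fitted_scaler are hypothetical objects produced by a training step, and using compose_model_params to build the metadata argument is an assumption:

metadata = helpers.compose_model_params(is_classifier=False)  # assumed source of the metadata argument

helpers.save_model(eslib, entity_id, metadata, dict={'h5': trained_model, 'pkl': fitted_scaler})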

Additional features

Constants file

The constants used by the EASIER-AI services are stored in the common_functions/constants.py file. It can be imported as:

import common_functions.constants as constants

Advanced logger

This logger has the same syntax as Python's built-in logging library. It needs to be imported and initialized as:

from common_functions.logger import Logger

logger = Logger(service_name, filename)

It can then be used as logger.info(message), logger.debug(message, additional_info), etc.
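
For instance (the messages are illustrative, and passing a dict as additional_info is an assumption):

logger.info('service started')                        # plain message
logger.debug('model loaded', {'model_id': 'abc123'})  # message plus additional info (hypothetical payload)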

Besides printing to the console, this logger also forwards each log entry to Elasticsearch via Logstash over a TCP port. To use this functionality, define the previously mentioned LOGSTASH_HOST and LOGSTASH_PORT environment variables.

Edge toolkit

This class is in charge of converting a TensorFlow or Keras model into TensorFlow Lite. It can be used as:

from edge_tools import Edge_Toolkit

edge_toolkit = Edge_Toolkit(logger)

edge_toolkit.convert_model_lite(calibration_data=calibration_data, keras_model_path=filename + '.' + constants.MODEL_EXTENSION)

After executing these lines, the tflite file will be stored in ../storage/ and can be uploaded to a remote filesystem.

Model definitions

The model definitions used by EASIER are also stored in this library. They are imported by the helpers file using the importer( ... ) function.

