
Machine learning model contracts with machine learning infrastructure


Twinn-ml-interface


Twinn-ml-interface is a Python package for data contracts between machine learning code and infrastructure. This contract ensures smooth onboarding of machine learning models onto the Twinn-ml-platform by Royal HaskoningDHV.

Author: Royal HaskoningDHV

Installation

The easiest way to install this package is using pip:

pip install twinn-ml-interface

Model Interface

Purpose

The Model Interface defines the methods and attributes that any ML model must implement in order to run in the Royal HaskoningDHV Twinn-ml infrastructure.

Testing compliance of your model with the data contract

Instance of the Model Interface

Once all the attributes and methods of the Protocol ModelInterfaceV4 are implemented, including the correct type hints/annotations, you can check whether a model complies with the interface by doing an isinstance check against ModelInterfaceV4. A base test is provided in twinn_ml_interface/interface/model_test.py. The Darrow-Poc is an example of a model that follows ModelInterfaceV4.
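The mechanics of the isinstance check can be illustrated with a hypothetical, heavily simplified stand-in for the Protocol; the real ModelInterfaceV4 defines many more methods and attributes:

```python
from typing import Protocol, runtime_checkable


# Hypothetical, simplified stand-in for ModelInterfaceV4;
# the real Protocol defines many more methods and attributes.
@runtime_checkable
class SimplifiedModelInterface(Protocol):
    def initialize(self) -> None: ...
    def train(self) -> None: ...
    def predict(self) -> None: ...


class MyModel:
    """Implements every method the Protocol requires."""

    def initialize(self) -> None:
        pass

    def train(self) -> None:
        pass

    def predict(self) -> None:
        pass


class IncompleteModel:
    """Missing predict(), so the isinstance check fails."""

    def initialize(self) -> None:
        pass

    def train(self) -> None:
        pass


assert isinstance(MyModel(), SimplifiedModelInterface)
assert not isinstance(IncompleteModel(), SimplifiedModelInterface)
```

Note that a runtime_checkable Protocol only verifies that the methods exist, not that their signatures or annotations match; the base test in model_test.py remains the authoritative compliance check.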

Mock Executors

The executor class runs the model for training or prediction in the Twinn-ml infrastructure. This package includes a mock executor that emulates that behaviour to some extent, which should make clearer in what context the model class is used. Any model compliant with ModelInterfaceV4 should be able to train and predict using the ExecutorMock found in twinn_ml_interface/mocks/mocks.py. The Darrow-Poc is an example of a model that follows ModelInterfaceV4 and can run using the ExecutorMock.

The steps and methods that the infrastructure and the mock executor run during training are:

  1. Read config:
    • get_target_template()
    • get_train_window_finder_config_template()
  2. Initialize the model
    • initialize()
  3. Given the configuration for the train window finder in the previous steps, validate possible windows:
    • validate_input_data()
  4. Read the data configuration to download all the needed data in a window selected by the previous step:
    • get_data_config_template()
  5. Transform the input data as needed:
    • preprocess()
  6. Train:
    • train()
  7. Store the model:
    • dump()
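The training flow above can be sketched as a plain Python call sequence. The call order follows the seven steps, but the no-argument signatures and the stub model are simplifications for illustration; the real signatures and return values are defined by ModelInterfaceV4:

```python
# Stub model that records the order in which its methods are called.
# The real ModelInterfaceV4 methods take and return richer objects.
class StubModel:
    def __init__(self):
        self.calls = []

    def get_target_template(self):
        self.calls.append("get_target_template")

    def get_train_window_finder_config_template(self):
        self.calls.append("get_train_window_finder_config_template")

    def initialize(self):
        self.calls.append("initialize")

    def validate_input_data(self):
        self.calls.append("validate_input_data")

    def get_data_config_template(self):
        self.calls.append("get_data_config_template")

    def preprocess(self):
        self.calls.append("preprocess")

    def train(self):
        self.calls.append("train")

    def dump(self):
        self.calls.append("dump")


def run_training(model):
    """Mirror the seven training steps executed by the infrastructure."""
    model.get_target_template()                      # 1. read config
    model.get_train_window_finder_config_template()
    model.initialize()                               # 2. initialize the model
    model.validate_input_data()                      # 3. validate possible windows
    model.get_data_config_template()                 # 4. data config for download
    model.preprocess()                               # 5. transform input data
    model.train()                                    # 6. train
    model.dump()                                     # 7. store the model


model = StubModel()
run_training(model)
```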

When training is finished, the model can be used for prediction. The prediction steps are:

  1. Retrieve the model from storage and load it:
    • load()
  2. Fetch the data needed for prediction based on either:
    • base_features - if present
    • get_data_config_template() - otherwise
  3. Predict:
    • predict()
  4. Load configuration to post predictions:
    • get_result_template()
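The prediction flow can be sketched in the same way, including the choice between base_features and get_data_config_template(); again, the stub class, signatures, and placeholder return values are hypothetical simplifications:

```python
class PredictStub:
    """Stub model for the prediction flow; signatures are simplified."""

    def __init__(self, base_features=None):
        # base_features is optional: when present, the infrastructure
        # uses it to fetch data instead of get_data_config_template().
        self.base_features = base_features
        self.calls = []

    @classmethod
    def load(cls, base_features=None):
        # 1. in the real infrastructure, load() retrieves the model from storage
        return cls(base_features)

    def get_data_config_template(self):
        self.calls.append("get_data_config_template")
        return {"query": "all-inputs"}  # placeholder data config

    def predict(self):
        self.calls.append("predict")
        return [0.0]  # placeholder predictions

    def get_result_template(self):
        self.calls.append("get_result_template")
        return {"destination": "results"}  # placeholder result config


def run_prediction(model_cls, base_features=None):
    model = model_cls.load(base_features)               # 1. load the model
    data_source = model.base_features                   # 2. base_features if present,
    if data_source is None:
        data_source = model.get_data_config_template()  #    data config otherwise
    predictions = model.predict()                       # 3. predict
    result_config = model.get_result_template()         # 4. config to post predictions
    return predictions, result_config


# With base_features set, get_data_config_template() is skipped.
preds, result = run_prediction(PredictStub, base_features=["flow", "level"])
```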

Example of the Model Interface

Darrow-Poc

The Darrow-Poc is an example of a model that follows ModelInterfaceV4. It contains more detailed explanations of the data model, interface methods and the onboarding process.
