
ESR_DT_MODEL

This package serves as a hub for consolidating all individual model developments associated with the Digital Twin project. Its primary objective is to generate unified and ensemble-based model outputs, which can be seamlessly integrated into downstream applications.

Note that the API token must be set up in ~/.pypirc (this is only needed when publishing the package; see the appendix).
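
For reference, a ~/.pypirc configured with a PyPI API token typically looks like the sketch below (the token value is a placeholder):

    [pypi]
    username = __token__
    password = pypi-<your-api-token>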

Install the package

The package can be installed using pip:

pip install esr_dt_model

Usage:

This package serves as a repository for preserving modeling development processes and allows for the retrieval of information from previous developments.

Save model and related dataset:

The model, training dataset and test dataset can be saved as below:

    import esr_dt_model
    esr_dt_model.export_model(
        "DT",
        "Sijin", 
        trained_model, 
        training_dataset, 
        test_dataset)

Here, DT is the project name, Sijin is the user name, trained_model is a trained model, training_dataset is the dataset used for training the model, and test_dataset is the dataset used for testing the model. Note that the project name, user name, trained model and training dataset are mandatory arguments, while the test dataset is optional.
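
As an illustration, a minimal end-to-end sketch could look like the following; it assumes a scikit-learn estimator and pandas DataFrames are accepted, which is not confirmed by this README:

    # Hypothetical example: assumes scikit-learn models and pandas DataFrames
    # are accepted by export_model (not confirmed by this README).
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    import esr_dt_model

    training_dataset = pd.DataFrame({"x": [1.0, 2.0, 3.0, 4.0], "y": [2.0, 4.0, 6.0, 8.0]})
    test_dataset = pd.DataFrame({"x": [5.0, 6.0], "y": [10.0, 12.0]})

    trained_model = LinearRegression().fit(training_dataset[["x"]], training_dataset["y"])

    esr_dt_model.export_model(
        "DT",              # project name
        "Sijin",           # user name
        trained_model,     # trained model
        training_dataset,  # training dataset
        test_dataset)      # test dataset (optional)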

By default, the model and related datasets are saved in the development channel. Once a model is well tested, it can be saved in the production channel by setting prod to True. For example:

    import esr_dt_model
    esr_dt_model.export_model(
        ...
        prod=True)

List model and related dataset:

We can list all stored models and their related datasets as below:

    import esr_dt_model
    esr_dt_model.view_model(
        filters = {
            "project_name": ["DT"],
            "datetime_start": "20231112T0149",
            "datetime_end": "20231112T0250",
        }
    )

The filters here indicate the conditions to apply when listing models. The full set of filters includes project_name, datetime_start, datetime_end, user, fmt and output_type, for example:

    filters = {
        "project_name": ["DT"],
        "datetime_start": "20231112T0149",
        "datetime_end": "20231112T0250",
        "user": ["Sijin"],
        "fmt": ["pkl", "onnx"],
        "output_type": ["dev", "prod"]
    }

An optional argument key can also be used to specify the columns that you want to view. The full list of columns is ['project_name', 'version', 'datetime', 'user', 'type', 'fmt', 'output', 'output_type', 'training_data', 'test_data']. By default, all columns are shown.
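
For example, to show only a few of these columns (a sketch using the filters and key arguments described above):

    import esr_dt_model
    esr_dt_model.view_model(
        filters={
            "project_name": ["DT"],
            "output_type": ["prod"],
        },
        key=["project_name", "version", "user", "output_type"],
    )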

Load the model:

The saved model can be loaded as:

    import esr_dt_model
    esr_dt_model.load_model("D7QVDT")

where D7QVDT is the model version (a unique ID) that can be obtained from running esr_dt_model.view_model.
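
Putting the steps together, a typical workflow is to look up the version with view_model and then load that version. The sketch below assumes the listing exposes a version column and that the loaded object behaves like the original trained model; neither is confirmed by this README:

    import esr_dt_model

    # List models for the project to find the version (unique ID) of interest.
    esr_dt_model.view_model(filters={"project_name": ["DT"]})

    # Load the chosen version and use it like the original trained model,
    # e.g. calling predict if a scikit-learn model was saved (assumption).
    model = esr_dt_model.load_model("D7QVDT")
    # predictions = model.predict(test_dataset[["x"]])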

Appendix: Publish the package (for development only)

The package can be published as:

make publish
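
The Makefile itself is not shown in this README; a typical publish target for a Python package builds the distributions and uploads them with twine, roughly as below:

    # Rough equivalent of a typical "make publish" target (assumption; the
    # actual Makefile is not shown here). Requires the ~/.pypirc token above.
    python -m build
    twine upload dist/*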
