ML adapter for PyTorch.

Project description

waylay-ml-adapter-torch

Provides the ml_adapter.torch module as a Waylay ML Adapter for PyTorch.

Installation

pip install waylay-ml-adapter-torch

You might want to install additional libraries such as torchaudio or torchvision.

Usage

This ML Adapter uses the standard torch mechanisms to save and load models within a waylay plugin or webscript. The model_path argument defines the file name of the serialized model in the function archive:

  • A model_path ending in weights.pt or weights.pth saves/loads the model weights using its state_dict. This is the recommended, more robust method, but it requires you to also specify a model_class.
  • Any other model_path with a .pt or .pth suffix saves/loads the entire model. This implicitly saves (references to) the model class used. You'll have to make sure that all dependencies used are also included or declared in the archive.
  • You can also pass an instantiated model directly to the adapter.
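The suffix rule above can be sketched as a small helper. Note that `uses_weights_serialization` and `needs_model_class` are hypothetical names for illustration only, not part of the ml_adapter API:

```python
# Hypothetical helpers (not part of the ml_adapter API) illustrating the
# model_path rules above: a path ending in `weights.pt`/`weights.pth`
# selects state_dict (weights) serialization, which then also needs a
# model_class to reconstruct the module.
def uses_weights_serialization(model_path: str) -> bool:
    return model_path.endswith(("weights.pt", "weights.pth"))

def needs_model_class(model_path: str) -> bool:
    # a weights-only archive cannot reconstruct the module by itself
    return uses_weights_serialization(model_path)

print(uses_weights_serialization("autoencoder.weights.pt"))  # True
print(uses_weights_serialization("autoencoder.pt"))          # False
```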

Creating a model for a webscript

from ml_adapter.torch import V1TorchAdapter

# assuming we save an AutoEncoder torch.nn.Module class in an `autoencoder.py` file
from autoencoder import AutoEncoder
model = AutoEncoder()
# ... train the model ...

# a local directory to prepare the webscript archive
ARCHIVE_LOC='~/webscripts/autoencoder-pytorch'
# use a `weights` model path to select _weights_ serialization
MODEL_PATH='model-weights.pt'

adapter = V1TorchAdapter(
    model=model,
    model_path=MODEL_PATH,
    location=ARCHIVE_LOC,
)

# add our model script to the webscript archive
await adapter.add_script('autoencoder.py')
# write the archive
await adapter.save()
# inspect the archive:
list(adapter.assets)
#> [requirements.txt <ml_adapter.base.assets.python.PythonRequirementsAsset>,
#> main.py <ml_adapter.base.assets.python.PythonScriptAsset>,
#> model-weights.pt <ml_adapter.torch.adapter.TorchModelWeightsAsset>,
#> autoencoder.py <ml_adapter.base.assets.python.PythonScriptAsset>]

Upload the adapter archive as a webscript using the ml_tool SDK plugin:

from waylay.sdk import WaylayClient
client = WaylayClient.from_profile('staging')
ref = await client.ml_tool.create_webscript(adapter, name='MyAutoEncoder', version='0.0.1')
ref = await client.ml_tool.wait_until_ready(ref)
await client.ml_tool.test_webscript(ref, [2,3,4])

The generated code in main.py uses the following to load your model:

MODEL_PATH = os.environ.get('MODEL_PATH', 'model-weights.pt')
MODEL_CLASS = os.environ.get('MODEL_CLASS', 'autoencoder.AutoEncoder')
adapter = V1TorchAdapter(model_path=MODEL_PATH, model_class=MODEL_CLASS)
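Both defaults can thus be overridden through the webscript's environment. A minimal sketch of the same lookup, using the variable names from the generated main.py above:

```python
import os

# Sketch of the environment lookup in the generated main.py:
# when the variables are not set, the defaults baked into the archive apply.
model_path = os.environ.get('MODEL_PATH', 'model-weights.pt')
model_class = os.environ.get('MODEL_CLASS', 'autoencoder.AutoEncoder')
print(model_path, model_class)

# Setting MODEL_PATH in the webscript configuration overrides the default.
os.environ['MODEL_PATH'] = 'other.weights.pt'
model_path = os.environ.get('MODEL_PATH', 'model-weights.pt')
print(model_path)  # other.weights.pt
```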

You can modify that loading mechanism, e.g. by creating the model yourself and providing it as:

adapter = V1TorchAdapter(model=model)

Exported classes

This module exports the following classes:

ml_adapter.torch.V1TorchAdapter

Adapts a callable with torch arrays as input and output.

ml_adapter.torch.V1TorchMarshaller

Convert v1 payload from and to torch tensors.
