
koios-model-utils

Utilities for exporting and annotating ONNX models for the Koios IoT platform.

This package helps data scientists prepare trained ML models (especially Stable-Baselines3 reinforcement learning models) for deployment on Koios. It handles ONNX export, metadata embedding, and normalization parameter annotation so the Koios Predict Engine can run inference correctly.

Installation

# Metadata embedding only (no torch/SB3 dependency)
pip install koios-model-utils

# Full SB3 export pipeline
pip install "koios-model-utils[sb3]"

Quick Start

Embed metadata into an existing ONNX model

import onnx
from koios_model_utils import (
    InputBinding, OutputBinding, TrainingMeta,
    embed_koios_metadata,
)

model = onnx.load("my_model.onnx")

inputs = [
    InputBinding(name="temperature", description="Tank temperature (C)"),
    InputBinding(name="pressure", description="Vessel pressure (kPa)"),
]
outputs = [
    OutputBinding(
        name="valve_position",
        range_min=0.0, range_max=100.0,
        normalization_type="symmetric",
        normalization_source="custom",
        custom_minimum=0.0, custom_maximum=100.0,
        clamp_output=True,
    ),
]
training = TrainingMeta(
    scenario_name="tank_temperature",
    algorithm="PPO",
    obs_depth=5,
    sample_rate=1.0,
)

embed_koios_metadata(model, inputs=inputs, outputs=outputs, training=training)
onnx.save(model, "my_model_koios.onnx")
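The OutputBinding above pairs normalization_type="symmetric" with clamp_output=True, which suggests the Predict Engine maps a symmetric network output onto [range_min, range_max] and clamps the result. A minimal sketch of that mapping, under the assumption that "symmetric" means the raw action lives in [-1, 1] (denormalize_symmetric is a hypothetical helper, not part of this package):

```python
def denormalize_symmetric(action, range_min, range_max, clamp=True):
    """Map a symmetric network output in [-1, 1] onto [range_min, range_max].

    Illustrative only: an assumption about what normalization_type="symmetric"
    with clamp_output=True could mean, not the actual Koios implementation.
    """
    value = range_min + (action + 1.0) * 0.5 * (range_max - range_min)
    if clamp:
        # Out-of-range network outputs are pinned to the declared action range.
        value = max(range_min, min(range_max, value))
    return value

print(denormalize_symmetric(0.0, 0.0, 100.0))  # midpoint of the range: 50.0
print(denormalize_symmetric(1.5, 0.0, 100.0))  # clamped to range_max: 100.0
```

Clamping matters for actuators like valves: without it, a slightly out-of-distribution observation could produce a command outside the hardware's physical limits.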

Export an SB3 model to ONNX

from koios_model_utils import (
    Algorithm,
    TrainingMeta,
    build_input_bindings,
    build_output_bindings,
    export_onnx,
)

inputs = build_input_bindings(
    ["temperature", "level", "flow_rate"],
    descriptions={"temperature": "Tank temp (C)", "level": "Tank level (%)"},
)
outputs = build_output_bindings(
    ["valve_position"],
    action_ranges={"valve_position": (0.0, 100.0)},
)

export_onnx(
    "runs/best_model.zip",
    inputs=inputs,
    outputs=outputs,
    training=TrainingMeta(
        scenario_name="tank_temperature",
        algorithm=Algorithm.PPO,
        obs_depth=5,
    ),
)
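The obs_depth=5 field appears to declare how many consecutive observation timesteps the model consumes at once. A sketch of that sliding window, assuming obs_depth means "history length" (stack_observations is a hypothetical helper for illustration, not part of this package):

```python
from collections import deque

def stack_observations(stream, obs_depth):
    """Yield sliding windows of the obs_depth most recent observations.

    Illustrative assumption: obs_depth is a history length, i.e. the model
    sees the last obs_depth observation vectors stacked together.
    """
    window = deque(maxlen=obs_depth)
    for obs in stream:
        window.append(obs)
        if len(window) == obs_depth:
            # Emit a full window once enough history has accumulated.
            yield list(window)

obs = [[t, t * 2] for t in range(7)]  # 7 timesteps of 2 features each
windows = list(stack_observations(obs, obs_depth=5))
print(len(windows))   # 3 windows: t=0..4, t=1..5, t=2..6
print(windows[0][0])  # oldest observation in the first window: [0, 0]
```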

CLI

# Export with auto-detected settings
koios-export runs/best_model.zip

# Specify output path and opset version
koios-export runs/best_model.zip --output model.onnx --opset 18

SB3 Training Callbacks

from koios_model_utils import ObsRangeCallback, VecNormalizeSyncCallback

# Track per-feature observation ranges during training
# (run_dir: path to your training run directory)
obs_range_cb = ObsRangeCallback(run_dir, verbose=1)

# Sync VecNormalize stats with best-model checkpoints
# (eval_callback: your existing SB3 EvalCallback instance)
vec_sync_cb = VecNormalizeSyncCallback(eval_callback, verbose=1)
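ObsRangeCallback presumably keeps running per-feature minima and maxima so that normalization stats and failure bounds can be derived after training. The core bookkeeping can be sketched in plain Python (RangeTracker is a hypothetical stand-in, not the package's class):

```python
class RangeTracker:
    """Track per-feature min/max across a stream of observation vectors."""

    def __init__(self, n_features):
        self.mins = [float("inf")] * n_features
        self.maxs = [float("-inf")] * n_features

    def update(self, obs):
        # Widen each feature's recorded range to cover this observation.
        for i, value in enumerate(obs):
            if value < self.mins[i]:
                self.mins[i] = value
            if value > self.maxs[i]:
                self.maxs[i] = value

tracker = RangeTracker(n_features=2)
for obs in [[20.5, 101.3], [19.8, 99.7], [22.1, 100.2]]:
    tracker.update(obs)
print(tracker.mins)  # [19.8, 99.7]
print(tracker.maxs)  # [22.1, 101.3]
```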

API Reference

Metadata Types

Class Description
InputBinding Describes one input (observation) binding — name, normalization, failure bounds
OutputBinding Describes one output (action) binding — name, range, normalization, clamping
TrainingMeta Model-level metadata — algorithm, obs_depth, sample_rate, model_type

Functions

Function Description
embed_koios_metadata() Embed koios.training and koios.bindings metadata into an ONNX model
build_input_bindings() Build input bindings from node names + optional normalization stats
build_output_bindings() Build output bindings from action names + ranges
export_onnx() Full SB3-to-ONNX export pipeline (requires [sb3] extra)
detect_algorithm() Detect SB3 algorithm (PPO, SAC, TD3, etc.) from a saved .zip model
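SB3 checkpoints are zip archives whose 'data' member holds the model's JSON-serialized attributes, so one naive way to guess the algorithm is to scan that text for a known algorithm name. The sketch below illustrates that heuristic only; it is not the package's actual detect_algorithm() logic, and guess_algorithm is a hypothetical name:

```python
import io
import json
import zipfile

KNOWN_ALGORITHMS = ("PPO", "SAC", "TD3", "A2C", "DDPG", "DQN")

def guess_algorithm(zip_path):
    """Naively guess the SB3 algorithm by scanning the archive's 'data' JSON.

    Illustrative heuristic only, not the package's detect_algorithm().
    """
    with zipfile.ZipFile(zip_path) as zf:
        text = zf.read("data").decode("utf-8", errors="replace").lower()
    for name in KNOWN_ALGORITHMS:
        if name.lower() in text:
            return name
    return None

# Build a fake checkpoint in memory to exercise the heuristic.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("data", json.dumps({"policy_class": "stable_baselines3.ppo.policies"}))
buf.seek(0)
print(guess_algorithm(buf))  # PPO
```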

Callbacks (requires [sb3] extra)

Callback Description
ObsRangeCallback Track per-feature min/max of observations during training
VecNormalizeSyncCallback Save VecNormalize stats alongside best model checkpoints

Development

# Install with all dependencies
pip install -e ".[sb3,dev]"

# Run tests
make test

# Lint
make lint

License

MIT License - see LICENSE for details.

Download files

Download the file for your platform.

Source Distribution

koios_model_utils-1.0.0.tar.gz (25.0 kB)

Uploaded Source

Built Distribution


koios_model_utils-1.0.0-py3-none-any.whl (21.4 kB)

Uploaded Python 3

File details

Details for the file koios_model_utils-1.0.0.tar.gz.

File metadata

  • Download URL: koios_model_utils-1.0.0.tar.gz
  • Size: 25.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for koios_model_utils-1.0.0.tar.gz
Algorithm Hash digest
SHA256 5ae53ef1efe2f538c38043139d4c426f7f804a57ecdfba4580f918f5bb58497f
MD5 a7f756e4af7073d5112db125f435e365
BLAKE2b-256 842d4e0b0bcec252b7a8a71afbc6e92dbabb481592dd283805a53455961d049c


Provenance

The following attestation bundles were made for koios_model_utils-1.0.0.tar.gz:

Publisher: release.yml on Ai-Ops-Inc/koios-model-utils

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file koios_model_utils-1.0.0-py3-none-any.whl.

File hashes

Hashes for koios_model_utils-1.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 c280667c1722132867da74e2963f9b41673145bce9c09f294451030290cd8f3b
MD5 49884794cd0d3d2629deb0c1b0f17a6a
BLAKE2b-256 ececa37f0597c1c723137d0944b4122851f4c0516f865de8adfd954939686f0d


Provenance

The following attestation bundles were made for koios_model_utils-1.0.0-py3-none-any.whl:

Publisher: release.yml on Ai-Ops-Inc/koios-model-utils

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
