
Model Registry Python Client


This library provides a high-level interface for interacting with a Model Registry server.

Alpha

This Kubeflow component has alpha status with limited support. See the Kubeflow versioning policies. The Kubeflow team is interested in your feedback about the usability of the feature.

Installation

In your Python environment, you can install the latest version of the Model Registry Python client with:

pip install --pre model-registry

Installing extras

Some capabilities of this Model Registry Python client, such as importing models from Hugging Face, require additional dependencies.

Installing an extra variant of this package manages the additional dependencies for you automatically, for instance:

pip install --pre "model-registry[hf]"

This step is not required if you have already installed the additional dependencies, for instance with:

pip install huggingface-hub

Basic usage

Connecting to MR

You can connect to a secure Model Registry using the default constructor (recommended):

from model_registry import ModelRegistry

registry = ModelRegistry("https://server-address", author="Ada Lovelace")  # Defaults to a secure connection via port 443

Or you can set the is_secure flag to False to connect without TLS (not recommended):

registry = ModelRegistry("http://server-address", 8080, author="Ada Lovelace", is_secure=False)  # insecure port set to 8080

Registering models

To register your first model, you can use the register_model method:

model = registry.register_model(
    "my-model",  # model name
    "https://storage-place.my-company.com",  # model URI
    version="2.0.0",
    description="lorem ipsum",
    model_format_name="onnx",
    model_format_version="1",
    storage_key="my-data-connection",
    storage_path="path/to/model",
    metadata={
        # can be one of the following types
        "int_key": 1,
        "bool_key": False,
        "float_key": 3.14,
        "str_key": "str_value",
    }
)

model = registry.get_registered_model("my-model")
print(model)

version = registry.get_model_version("my-model", "2.0.0")
print(version)

artifact = registry.get_model_artifact("my-model", "2.0.0")
print(artifact)
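The metadata values in the example above are limited to scalar types (int, bool, float, str). As an illustration, a quick pre-flight check along these lines (a hypothetical helper, not part of the client) can catch unsupported values before registration:

```python
# Scalar metadata value types accepted in the example above.
ALLOWED_TYPES = (bool, int, float, str)

def check_metadata(metadata: dict) -> None:
    """Raise TypeError for any metadata value that is not a plain scalar."""
    for key, value in metadata.items():
        if not isinstance(value, ALLOWED_TYPES):
            raise TypeError(
                f"metadata[{key!r}] has unsupported type {type(value).__name__}"
            )

check_metadata({"int_key": 1, "bool_key": False, "float_key": 3.14, "str_key": "str_value"})
```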

You can also update your models:

# change is not reflected on pushed model version
version.description = "Updated model version"

# you can update it using
registry.update(version)

Importing from S3

When registering models stored on S3-compatible object storage, you should use utils.s3_uri_from to build an unambiguous URI for your artifact.

from model_registry import utils

model = registry.register_model(
    "my-model",  # model name
    uri=utils.s3_uri_from("path/to/model", "my-bucket"),
    version="2.0.0",
    description="lorem ipsum",
    model_format_name="onnx",
    model_format_version="1",
    storage_key="my-data-connection",
    metadata={
        # can be one of the following types
        "int_key": 1,
        "bool_key": False,
        "float_key": 3.14,
        "str_key": "str_value",
    }
)
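For illustration, a minimal sketch of how a bucket and path might be combined into an unambiguous S3 URI (a hypothetical helper, not the library's implementation):

```python
def build_s3_uri(path: str, bucket: str) -> str:
    # Combine bucket and object key into an s3:// URI, normalizing stray
    # slashes so "my-bucket/" and "/path/to/model" still produce a clean result.
    return f"s3://{bucket.strip('/')}/{path.lstrip('/')}"

print(build_s3_uri("path/to/model", "my-bucket"))  # s3://my-bucket/path/to/model
```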

Importing from Hugging Face Hub

To import models from Hugging Face Hub, start by installing the huggingface-hub package, either directly or as an extra (available as model-registry[hf]); see the "Installing extras" section above for more information.

Models can be imported with:

hf_model = registry.register_hf_model(
    "hf-namespace/hf-model",  # HF repo
    "relative/path/to/model/file.onnx",
    version="1.2.3",
    model_name="my-model",
    description="lorem ipsum",
    model_format_name="onnx",
    model_format_version="1",
)

Note the following caveat when using this method:

  • It's only possible to import a single model file per Hugging Face Hub repo right now.

Listing models

To list models, you can use:

for model in registry.get_registered_models():
    ... # your logic using `model` loop variable here

# and versions associated with a model
for version in registry.get_model_versions("my-model"):
    ... # your logic using `version` loop variable here

Advanced usage note: you can also set the page_size() you want the Pager to use when invoking the Model Registry backend. When used as an iterator, the Pager automatically manages pages for you.

Implementation notes

The Pager manages pages for you in order to prevent infinite looping: currently, the Model Registry backend treats model lists as a circular buffer and will not end iteration for you.
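To illustrate why that matters, here is a minimal sketch of page-based iteration that stops on the first short page. Here, fetch_page is a hypothetical stand-in for a backend list call, and the real Pager's stopping logic may differ:

```python
from typing import Iterator, List

def paged_iter(fetch_page, page_size: int) -> Iterator:
    """Yield items page by page until a short (or empty) page signals the end.

    `fetch_page(offset, limit)` is a hypothetical callable standing in for a
    Model Registry list call; the real client's Pager wraps this for you.
    """
    offset = 0
    while True:
        page: List = fetch_page(offset, page_size)
        yield from page
        if len(page) < page_size:
            break
        offset += page_size

# Simulated backend holding 7 models, iterated with a page size of 3.
data = [f"model-{i}" for i in range(7)]
fetch = lambda offset, limit: data[offset:offset + limit]
print(list(paged_iter(fetch, 3)))
```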

Development

Common tasks, such as building documentation and running tests, can be executed using nox sessions.

Use nox -l to list sessions and execute them using nox -s [session].

Alternatively, use make install to set up a local Python virtual environment with poetry.

To run the tests you will need docker (or equivalent) and the compose extension command. This is necessary as the test suite will manage a Model Registry server and an MLMD instance to ensure a clean state on each run. You can use make test to execute pytest.

Running Locally on Mac M1 or M2 (arm64 architecture)

Check out our recommendations on setting up your docker engine on an ARM processor.
