
Model Registry Python Client


This library provides a high-level interface for interacting with a Model Registry server.

Alpha

This Kubeflow component has alpha status with limited support. See the Kubeflow versioning policies. The Kubeflow team is interested in your feedback about the usability of the feature.

Installation

In your Python environment, you can install the latest version of the Model Registry Python client with:

pip install --pre model-registry

Installing extras

Some capabilities of this Model Registry Python client, such as importing models from Hugging Face, require additional dependencies.

Installing an extra variant of this package manages those additional dependencies for you automatically, for instance with:

pip install --pre "model-registry[hf]"

This step is not required if you have already installed the additional dependencies, for instance with:

pip install huggingface-hub

Basic usage

Connecting to MR

You can connect to a secure Model Registry using the default constructor (recommended):

from model_registry import ModelRegistry

registry = ModelRegistry("https://server-address", author="Ada Lovelace")  # Defaults to a secure connection via port 443

Or you can set the is_secure flag to False to connect without TLS (not recommended):

registry = ModelRegistry("http://server-address", 8080, author="Ada Lovelace", is_secure=False)  # insecure port set to 8080
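The two constructor calls above differ only in the URL scheme, the port, and the is_secure flag. As an illustration only (connection_params below is a hypothetical helper, not part of the client), the choice could be derived from the server URL's scheme:

```python
from urllib.parse import urlparse

# Hypothetical helper (not part of model-registry): derive the port and
# is_secure flag from the server URL's scheme, mirroring the two
# constructor calls shown above.
def connection_params(url: str) -> dict:
    secure = urlparse(url).scheme == "https"
    return {"port": 443 if secure else 8080, "is_secure": secure}

print(connection_params("https://server-address"))  # secure defaults, port 443
print(connection_params("http://server-address"))   # insecure, port 8080
```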

Registering models

To register your first model, you can use the register_model method:

model = registry.register_model(
    "my-model",  # model name
    "https://storage-place.my-company.com",  # model URI
    version="2.0.0",
    description="lorem ipsum",
    model_format_name="onnx",
    model_format_version="1",
    storage_key="my-data-connection",
    storage_path="path/to/model",
    metadata={
        # can be one of the following types
        "int_key": 1,
        "bool_key": False,
        "float_key": 3.14,
        "str_key": "str_value",
    }
)

model = registry.get_registered_model("my-model")
print(model)

version = registry.get_model_version("my-model", "2.0.0")
print(version)

artifact = registry.get_model_artifact("my-model", "2.0.0")
print(artifact)
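Metadata values in the register_model call above are limited to scalar types (int, bool, float, str). A quick local check of that constraint might look like the sketch below (validate_metadata is a hypothetical helper, not part of the client API):

```python
# Hypothetical helper mirroring the scalar-only metadata constraint
# shown in the register_model call above; not part of the client API.
ALLOWED_TYPES = (int, bool, float, str)

def validate_metadata(metadata: dict) -> bool:
    return all(isinstance(v, ALLOWED_TYPES) for v in metadata.values())

print(validate_metadata({"int_key": 1, "bool_key": False, "float_key": 3.14, "str_key": "str_value"}))  # True
print(validate_metadata({"bad_key": [1, 2, 3]}))  # False: lists are not allowed
```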

You can also update your models:

# local changes are not reflected on the server
version.description = "Updated model version"

# until you push them with
registry.update(version)

Importing from S3

When registering models stored on S3-compatible object storage, you should use utils.s3_uri_from to build an unambiguous URI for your artifact.

from model_registry import utils

model = registry.register_model(
    "my-model",  # model name
    uri=utils.s3_uri_from("path/to/model", "my-bucket"),
    version="2.0.0",
    description="lorem ipsum",
    model_format_name="onnx",
    model_format_version="1",
    storage_key="my-data-connection",
    metadata={
        # can be one of the following types
        "int_key": 1,
        "bool_key": False,
        "float_key": 3.14,
        "str_key": "str_value",
    }
)
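For intuition only, an s3:// URI combining the bucket and path from the call above might look like the sketch below. Note that sketch_s3_uri is a local stand-in, not the library's utils.s3_uri_from, which may also encode details such as endpoint and region:

```python
# Local stand-in for illustration only; the real utils.s3_uri_from may
# include endpoint/region information beyond this minimal form.
def sketch_s3_uri(path: str, bucket: str) -> str:
    return f"s3://{bucket}/{path}"

print(sketch_s3_uri("path/to/model", "my-bucket"))  # s3://my-bucket/path/to/model
```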

Importing from Hugging Face Hub

To import models from Hugging Face Hub, first install the huggingface-hub package, either directly or as an extra (available as model-registry[hf]). See the "Installing extras" section above for more information.

Models can then be imported with:

hf_model = registry.register_hf_model(
    "hf-namespace/hf-model",  # HF repo
    "relative/path/to/model/file.onnx",
    version="1.2.3",
    model_name="my-model",
    description="lorem ipsum",
    model_format_name="onnx",
    model_format_version="1",
)

Note the following caveats when using this method:

  • It's only possible to import a single model file per Hugging Face Hub repo right now.

Listing models

To list models you can use

for model in registry.get_registered_models():
    ... # your logic using `model` loop variable here

# and versions associated with a model
for version in registry.get_model_versions("my-model"):
    ... # your logic using `version` loop variable here

Advanced usage note: you can also set the page_size() that you want the Pager to use when invoking the Model Registry backend. When used as an iterator, the Pager automatically manages pages for you.

Implementation notes

The pager will manage pages for you in order to prevent infinite looping. Currently, the Model Registry backend treats model lists as a circular buffer, and will not end iteration for you.
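The general shape of such client-side paging can be sketched locally. In the sketch below, fetch_page is a hypothetical stand-in for the backend call; the real Pager's internals may differ:

```python
# Minimal client-side paging sketch: request fixed-size pages until the
# (hypothetical) backend returns an empty page, then stop iterating.
def paged(fetch_page, page_size=2):
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        if not page:
            return
        yield from page
        offset += len(page)

# Stand-in backend: a plain list sliced into pages.
items = ["model-a", "model-b", "model-c", "model-d", "model-e"]
print(list(paged(lambda off, n: items[off:off + n])))
```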

Development

Common tasks, such as building documentation and running tests, can be executed using nox sessions.

Use nox -l to list sessions and execute them using nox -s [session].

Alternatively, use make install to set up a local Python virtual environment with poetry.

To run the tests you will need Docker (or equivalent) and the compose extension. This is necessary because the test suite manages a Model Registry server and an MLMD instance to ensure a clean state on each run. You can use make test to execute pytest.

Running Locally on Mac M1 or M2 (arm64 architecture)

Check out our recommendations on setting up your docker engine on an ARM processor.
