
A KServe Model Wrapper

Project description

kserve-helper

This is a helper for building Docker images for ML models. Here are some basic examples; for more, please visit the GitHub repo.

Implement a Model Class for Serving

To build a Docker image for serving, we only need to implement a single class with load and predict methods:

import base64
import io

from PIL import Image, ImageFilter

# Input, Path and KServeModel are provided by the kservehelper package
# (check the repo for the exact import paths of your version)
from kservehelper.model import KServeModel
from kservehelper.types import Input, Path


class Model:

    def load(self):
        # Load the model
        pass

    def predict(
            self,
            image: str = Input(
                description="Base64 encoded image",
                default=""
            ),
            radius: float = Input(
                description="Standard deviation of the Gaussian kernel",
                default=2
            )
    ) -> Path:
        if image == "":
            raise ValueError("The input image is not set")
        im_binary = base64.b64decode(image)
        input_image = Image.open(io.BytesIO(im_binary))
        output_image = input_image.filter(ImageFilter.GaussianBlur(radius))
        output_path = KServeModel.generate_filepath("image.jpg")
        output_image.save(output_path)
        return Path(output_path)

The load function is called once during initialization, while the predict function is called for each request. The input parameters are specified with the Input class, which allows us to set parameter descriptions, default values and constraints (e.g., 0 <= input value <= 1).
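For example, a parameter constrained to lie between 0 and 1 could be declared as in the sketch below. This is only a minimal illustration: the constraint keyword names ge and le are an assumption based on the description above, so please check the repo for the exact Input API.

    def predict(
            self,
            strength: float = Input(
                description="Blur strength, constrained to 0 <= value <= 1",
                default=0.5,
                ge=0,  # assumed keyword for the lower bound (value >= 0)
                le=1   # assumed keyword for the upper bound (value <= 1)
            )
    ) -> dict:
        return {"strength": strength}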

The return type annotation of the predict function is important. If the return type is Path or List[Path], the upload webhook will be called after predict finishes; in this case, the request must also contain an additional key "upload_webhook" that specifies the webhook server address (see the repo for an example). If the return type is not Path, the results are returned directly without calling the webhook.
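As a rough illustration, a client request for the model above might look like the sketch below. The endpoint path follows the standard KServe v1 protocol with a placeholder model name, and the flat payload layout and webhook address are assumptions; the exact request format depends on the kservehelper version and the deployment, so please refer to the examples in the repo.

import base64

import requests

with open("input.jpg", "rb") as f:
    payload = {
        "image": base64.b64encode(f.read()).decode("utf-8"),
        "radius": 3.0,
        # Webhook server that receives the uploaded output files (placeholder address)
        "upload_webhook": "http://<WEBHOOK-SERVER-ADDRESS>/upload",
    }

# Placeholder host/port and model name; adjust to your deployment
response = requests.post(
    "http://localhost:8080/v1/models/model:predict",
    json=payload,
)
print(response.json())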

Write a Config for Building the Docker Image

To build the corresponding docker image for serving, we only need to write a config file:

build:
  python_version: "3.10"
  cuda: "11.7"

  # a list of commands (optional)
  commands:
    - "apt install -y software-properties-common"

  # a list of ubuntu apt packages to install (optional)
  system_packages:
    - "git"
    - "python3-opencv"

  # choose requirements.txt (optional)
  python_requirements:
    - "requirements.txt"

  # a list of python packages, e.g., <package-name>==<version>, wheel files or git URLs
  python_packages:
    - "kservehelper>=1.1.0"
    - "salesforce_lavis-1.1.0-py3-none-any.whl"
    - "git+https://github.com/huggingface/diffusers.git"
    - "controlnet_aux==0.0.7"
    - "opencv-python==4.8.0.74"
    - "Pillow"
    - "tensorboard"
    - "mediapipe"
    - "accelerate"
    - "bitsandbytes"

# The name given to built Docker images
image: "<DOCKER-IMAGE-NAME:TAG>"

# model.py defines the entrypoint
entrypoint: "model.py"

In the config file, we can choose the Python version, the CUDA version (and whether to use NGC base images), system packages and Python packages. We also need to set the Docker image name and the entrypoint, where the entrypoint is simply the file that defines the model class above.

To build the Docker image, simply run the following command in the folder containing the config file:

kservehelper build .

To push the Docker image, run this command:

kservehelper push .

For more details, please check the implementation in the repo.

Download files

Download the file for your platform.

Source Distribution

kservehelper-1.2.2.tar.gz (27.6 kB, source)

Built Distribution

kservehelper-1.2.2-py3-none-any.whl (35.1 kB, Python 3)

File details

Details for the file kservehelper-1.2.2.tar.gz.

File metadata

  • Download URL: kservehelper-1.2.2.tar.gz
  • Size: 27.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.2

File hashes

Hashes for kservehelper-1.2.2.tar.gz:

  • SHA256: 94f3ce5b7c42884e2be7a3a621bce772f4784fd6fe74f0f55a857536ac2b9d96
  • MD5: 69d5d7db9a68c75b2e6a3a67c1e23c60
  • BLAKE2b-256: ccd670b096b0087a34d58b58d6234c86835652d00b8ea69f28e401ef7e94089c

File details

Details for the file kservehelper-1.2.2-py3-none-any.whl.

File hashes

Hashes for kservehelper-1.2.2-py3-none-any.whl:

  • SHA256: 8c08989f57416388f23ff08fe266b2d7dfe9cd9eeb883926fff10cead4ec1a9b
  • MD5: d95ad1b2e0b20f42921b3f70579d2fe7
  • BLAKE2b-256: bab5c61d09230160e4f0c0fced1e10da0e25a3c0c78318649fc15fc53a6cc8b6
