A KServe Model Wrapper


kserve-helper

kserve-helper is a toolkit for building Docker images for ML models served with KServe. It supports model input validation, uploading generated files to S3 or GCS, building model images, and more. Here are some basic examples.

Implement a Model Class for Serving

To build a Docker image for serving, we only need to implement a model class with load and predict methods:

import base64
import io

from PIL import Image, ImageFilter

# Adjust the import paths to match your kservehelper version
from kservehelper.model import KServeModel
from kservehelper.types import Input, Path


class Model:

    def load(self):
        # Load the model
        pass

    def predict(
            self,
            image: str = Input(
                description="Base64 encoded image",
                default=""
            ),
            radius: float = Input(
                description="Standard deviation of the Gaussian kernel",
                default=2
            )
    ) -> Path:
        if image == "":
            raise ValueError("The input image is not set")
        im_binary = base64.b64decode(image)
        input_image = Image.open(io.BytesIO(im_binary))
        output_image = input_image.filter(ImageFilter.GaussianBlur(radius))
        output_path = KServeModel.generate_filepath("image.jpg")
        output_image.save(output_path)
        return Path(output_path)

The load method is called once, during model initialization. The predict method is called for each request. Input parameter metadata is specified via the Input class, which lets us set parameter descriptions, default values and constraints (e.g., 0 <= input value <= 1).
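The exact keyword names that Input accepts for constraints are documented in the repo. As a standalone illustration of what such a bounded parameter amounts to (this sketch is not kservehelper's actual Input class; the attribute names are hypothetical), the validation logic could look like:

```python
class InputSpec:
    """Standalone sketch of Input-style parameter validation."""

    def __init__(self, description="", default=None, ge=None, le=None):
        self.description = description
        self.default = default
        self.ge = ge  # lower bound (hypothetical keyword name)
        self.le = le  # upper bound (hypothetical keyword name)

    def validate(self, value):
        # Fall back to the default when the request omits the parameter
        value = self.default if value is None else value
        if self.ge is not None and value < self.ge:
            raise ValueError(f"value must be >= {self.ge}")
        if self.le is not None and value > self.le:
            raise ValueError(f"value must be <= {self.le}")
        return value


strength = InputSpec(description="Blur strength", default=0.5, ge=0, le=1)
print(strength.validate(None))  # falls back to the default: 0.5
```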

The output type annotation of the predict function is important. If the output type is Path or List[Path], a webhook for uploading files is called after predict finishes. In this case, the request must also contain an additional key "upload_webhook" specifying the webhook server address. If the output type is not Path, the results are returned directly without calling the webhook.
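For instance, a request body for the blur model above might be built as follows (a flat JSON object with one key per predict parameter is assumed here; the image bytes and webhook URL are placeholders, not a working setup):

```python
import base64
import json

# Placeholder request for the Gaussian-blur model above
payload = {
    # Base64-encoded image content (stand-in bytes here)
    "image": base64.b64encode(b"<raw JPEG bytes>").decode("utf-8"),
    "radius": 2.0,
    # Address of the file-upload webhook server (placeholder URL)
    "upload_webhook": "http://upload-webhook.example.com/upload",
}
body = json.dumps(payload)
print(body)
```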

If streaming outputs are required, predict should return an iterator:

import time

# Adjust the import paths to match your kservehelper version
from kservehelper.model import KServeModel
from kservehelper.types import Input


class Model:

    def load(self):
        pass

    def predict(
            self,
            repeat: int = Input(
                description="The number of repeats",
                default=5
            )
    ):
        def _generator():
            for i in range(repeat):
                yield "Hello World!"
                time.sleep(1)

        return KServeModel.wrap_generator(_generator)

Note that for KServe >= 0.13.1, the streaming and non-streaming APIs are combined in a single predict method, as above. For KServe <= 0.10.2, the streaming and non-streaming APIs are separated into generate and predict respectively, i.e.,

import time

# Adjust the import paths to match your kservehelper version
from kservehelper.model import KServeModel
from kservehelper.types import Input


class Model:

    def load(self):
        pass

    def predict(
            self,
            repeat: int = Input(
                description="The number of repeats",
                default=5
            )
    ):
        time.sleep(repeat)
        return {"output": " ".join(["Hello World!"] * repeat)}

    def generate(
            self,
            repeat: int = Input(
                description="The number of repeats",
                default=5
            )
    ):
        def _generator():
            for i in range(repeat):
                yield "Hello World!"
                time.sleep(1)

        return KServeModel.wrap_generator(_generator)
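Conceptually, wrap_generator turns a plain Python generator into a stream of response chunks. The following standalone sketch (not kservehelper's actual implementation) shows the idea, serializing each yielded value as a newline-delimited JSON chunk:

```python
import json


def wrap_generator_sketch(generator_fn):
    """Hypothetical stand-in for KServeModel.wrap_generator: stream each
    value yielded by generator_fn as one newline-delimited JSON chunk."""
    def stream():
        for chunk in generator_fn():
            yield json.dumps({"output": chunk}) + "\n"
    return stream()


def _generator():
    for _ in range(2):
        yield "Hello World!"


chunks = list(wrap_generator_sketch(_generator))
print(chunks)
```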

Write a Config for Building a Docker Image

To build the corresponding Docker image for serving, we only need to write a config file:

build:
  python_version: "3.10"
  cuda: "11.7"

  # a list of commands (optional)
  commands:
    - "apt install -y software-properties-common"

  # a list of ubuntu apt packages to install (optional)
  system_packages:
    - "git"
    - "python3-opencv"

  # choose requirements.txt (optional)
  python_requirements:
    - "requirements.txt"

  # a list of python packages to install: pinned versions,
  # local wheel files or git URLs (optional)
  python_packages:
    - "kservehelper>=1.1.0"
    - "salesforce_lavis-1.1.0-py3-none-any.whl"
    - "git+https://github.com/huggingface/diffusers.git"
    - "controlnet_aux==0.0.7"
    - "opencv-python==4.8.0.74"
    - "Pillow"
    - "tensorboard"
    - "mediapipe"
    - "accelerate"
    - "bitsandbytes"

# The name given to built Docker images
image: "<DOCKER-IMAGE-NAME:TAG>"

# model.py defines the entrypoint
entrypoint: "model.py"

In the config file, we can choose the Python version, the CUDA version (and whether to use NGC base images), system packages and Python packages. We also need to set the Docker image name and the entrypoint, i.e., the file that defines the model class above.

To build the Docker image, simply run the following in the folder containing the config file:

kservehelper build .

To push the Docker image, run this command:

kservehelper push .

For more details, please check the implementation in the repo.
