# MLflow Knative Deployment Plugin

MLflow plugin adding a Knative deployment client to the MLflow CLI and Python API.

Note: MLServer (V2 Inference API) is enabled for all Docker builds.
## Requirements

- Python 3.10+
- MLflow 2+
- Docker

The target Kubernetes cluster must be running Knative 1.10+.
## Installation

```shell
pip install mlflow-knative
```
## Getting Started

A Kubernetes context is required to define the Knative target. You can list available contexts with `kubectl config get-contexts`.

Make sure Docker is running locally if you intend to create or update a deployment, as this is required to build an image from the MLflow model.
The plugin adds support for a `knative` target scheme to the `mlflow deployments` CLI.

Setting the `image_repository` config key is required to make the Docker image of the model available for deployment by Knative. You may also provide an image tag with the `image_tag` config key (defaults to `latest`).
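Conceptually, the two config keys are combined into the image reference that Knative pulls. A minimal sketch of that assembly (illustrative only, not the plugin's actual code):

```python
def image_reference(image_repository: str, image_tag: str = "latest") -> str:
    """Join repository and tag into the image reference used for deployment."""
    return f"{image_repository}:{image_tag}"

# With only image_repository set, the tag defaults to "latest".
print(image_reference("000000000000.dkr.ecr.eu-west-3.amazonaws.com/model-name"))
# → 000000000000.dkr.ecr.eu-west-3.amazonaws.com/model-name:latest
```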
```shell
mlflow deployments create \
  --target knative:/<context> \
  --name <deployment-name> \
  --model-uri models:/<model-name>/<model-version> \
  --config image_repository=<image-repository-URI> \
  --config image_tag=<image-tag>
```
The plugin provides detailed target help:

```shell
mlflow deployments help --target knative
```
All features are also available through a Python API deployment client.

```python
from mlflow.deployments import get_deploy_client

client = get_deploy_client("knative:/my-cluster")

client.create_deployment(
    "hello-world",
    "models:/hello-world/1",
    config={"image_repository": "hello-world"},
)
```
## Using a Private Image Repository

To use a private Docker image repository, run `docker login` before running the deployment client, then use the full repository URI as the value of the `image_repository` config key.

```shell
docker login --username <username> --password-stdin <private-repository-URL>

# If using AWS ECR:
aws ecr get-login-password | docker login --username AWS --password-stdin <private-repository-URL>
```
```shell
mlflow deployments create \
  --target knative:/<context> \
  --name <deployment-name> \
  --model-uri models:/<model-name>/<model-version> \
  --config image_repository=<image-repository-URI>  # e.g. 000000000000.dkr.ecr.eu-west-3.amazonaws.com/model-name
```
## Using a Remote MLflow Model Registry

Set the `MLFLOW_TRACKING_URI` environment variable (e.g. `export MLFLOW_TRACKING_URI=<tracking-server-uri>`) to use a remote MLflow model registry.

This also works with a private model registry secured with OAuth 2, using the MLflow OIDC Client Plugin.
## Knative Service Configuration

The deployment client can use any available namespace on the target cluster by setting the `namespace` config key. The default value is `default`.
```shell
mlflow deployments create \
  --target knative:/<context> \
  --name <deployment-name> \
  --model-uri models:/<model-name>/<model-version> \
  --config image_repository=<image-repository-URI> \
  --config namespace=<my-namespace>
```
To deploy a Knative service with a custom templated manifest, set the `service_template` config key. The value is a path to the YAML manifest you will be using.

```shell
mlflow deployments create \
  --target knative:/<context> \
  --name <deployment-name> \
  --model-uri models:/<model-name>/<model-version> \
  --config image_repository=<image-repository-URI> \
  --config service_template=<path/to/manifest>
```

The `$name`, `$namespace` and `$image` templated values are respectively the deployment name, the provided namespace (or `default`), and the image determined from the provided image repository and tag.
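The `$`-prefixed placeholders follow the convention of Python's standard `string.Template`. A minimal sketch of how such a manifest could be rendered (the manifest below is a hypothetical example, not the plugin's built-in template):

```python
from string import Template

# Hypothetical Knative Service manifest using the three supported placeholders.
MANIFEST = """\
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: $name
  namespace: $namespace
spec:
  template:
    spec:
      containers:
        - image: $image
"""

# Fill in the placeholders the way the deployment client would:
rendered = Template(MANIFEST).substitute(
    name="hello-world",
    namespace="default",
    image="hello-world:latest",
)
print(rendered)
```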
## License

This project is licensed under the terms of the MIT license.

A yzr Free and Open Source project.