
# mlflow-torchserve

A plugin that integrates [TorchServe](https://github.com/pytorch/serve) with the MLflow pipeline. mlflow_torchserve enables you to deploy models built and trained in an MLflow pipeline to TorchServe without any extra effort. The plugin provides a few command-line APIs, which are also accessible through MLflow's Python package, to make the deployment process seamless.

## Installation

To install and activate the plugin, you only need to install this package, which is available on PyPI:

```bash
pip install mlflow-torchserve
```

## What does it do

Installing this package uses Python's entry-point mechanism to register the plugin in MLflow's plugin registry. This registry is consulted each time you invoke an MLflow script or CLI command.
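For illustration, such an entry-point registration typically looks like the following `setup.py` fragment. This is a sketch based on setuptools and MLflow plugin conventions; the exact group and module names are assumptions, not taken from this package's sources:

```python
# Illustrative setup.py fragment (names are assumptions, not this package's
# actual metadata). MLflow discovers deployment plugins through the
# "mlflow.deployments" entry-point group; the target name on the left of "="
# becomes the -t / target_uri value used on the command line.
from setuptools import setup

setup(
    name="mlflow-torchserve",
    entry_points={
        "mlflow.deployments": [
            "torchserve=mlflow_torchserve",
        ]
    },
)
```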

### Create deployment

Deploy a model to TorchServe. The `create` CLI command and the `create_deployment` Python API deploy a model built with MLflow to TorchServe.

##### CLI

```shell script
mlflow deployments create -t torchserve -m <model uri> --name DEPLOYMENT_NAME -C 'MODEL_FILE=<model file path>' -C 'HANDLER=<handler file path>'
```

##### Python API

```python
from mlflow.deployments import get_deploy_client

target_uri = 'torchserve'
plugin = get_deploy_client(target_uri)
plugin.create_deployment(name=<deployment name>,
                         model_uri=<model uri>,
                         config={"MODEL_FILE": <model file path>,
                                 "HANDLER": <handler file path>})
```

### Update deployment

The update API modifies an already deployed model, for example to increase the number of workers or to set a model as the default version. TorchServe ensures the user experience stays seamless while the model changes in a live environment.

##### CLI

```shell script
mlflow deployments update -t torchserve --name <deployment name> -C "min-worker=<number of workers>"
```

##### Python API

```python
plugin.update_deployment(name=<deployment name>, config={'min-worker': <number of workers>})
```

### Delete deployment

Delete an existing deployment. An error is raised if the model is not already deployed.

##### CLI

```shell script
mlflow deployments delete -t torchserve --name <deployment name / version number>
```

##### Python API

```python
plugin.delete_deployment(name=<deployment name / version number>)
```

### List all deployments

List the names of all deployed models. A name can then be passed to other APIs, such as the get deployment API, to fetch more details about a particular deployment.

##### CLI

```shell script
mlflow deployments list -t torchserve
```

##### Python API

```python
plugin.list_deployments()
```

### Get deployment details

The get API fetches details of the deployed model. By default, it fetches all versions of the deployed model.

##### CLI

```shell script
mlflow deployments get -t torchserve --name <deployment name>
```

##### Python API

```python
plugin.get_deployment(name=<deployment name>)
```

### Run prediction on deployed model

The predict API runs prediction against the deployed model.

The CLI takes a JSON file path as input. The Python plugin, however, accepts input of one of three types: DataFrame, Tensor, or JSON string.

##### CLI

```shell script
mlflow deployments predict -t torchserve --name <deployment name> --input-path <input file path> --output-path <output file path>
```

`output-path` is an optional parameter; if it is omitted, the result is printed to the console.

##### Python API

```python
plugin.predict(name=<deployment name>, df=<prediction input>)
```
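As a sketch, the JSON-string input form mentioned above might be built like this. The payload shape (a `"data"` key holding a flat list of pixel values) is a hypothetical example for an image model, not a schema mandated by the plugin; the actual shape depends on your handler:

```python
import json

# Hypothetical payload for an image model: the "data" key and the
# 784-element flat list (28x28 pixels) are illustrative assumptions,
# not a required schema -- your TorchServe handler defines the format.
sample = {"data": [0.0] * 784}

# json.dumps produces a JSON string, one of the three accepted input types
# (alongside a pandas DataFrame or a torch Tensor).
payload = json.dumps(sample)
```

The same string could be written to a file and passed to the CLI via `--input-path`.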

### Plugin help

Run the following command to get the plugin's help string.

##### CLI

```shell script
mlflow deployments help -t torchserve
```
