# mlctl

mlctl is the control plane for MLOps.

mlctl is the Command Line Interface (CLI)/Software Development Kit (SDK) for MLOps. It allows all ML lifecycle operations, such as training and deployment, to be controlled via a simple-to-use command line interface. Additionally, mlctl provides an SDK for use in a notebook environment and an extensible mechanism for plugging in various back-end providers, such as SageMaker.

The following ML lifecycle operations are currently supported via mlctl:

- `train` - operations related to model training
- `host` - operations related to hosting a model for online inference
- `batch` - operations for running model inference in batch mode

## Getting Started

### Installation

1. (Optional) Create a new virtual environment for mlctl:

   ```
   pip install virtualenv
   virtualenv ~/envs/mlctl
   source ~/envs/mlctl/bin/activate
   ```

2. Install mlctl:

   ```
   pip install mlctl
   ```

3. Upgrade an existing version:

   ```
   pip install --upgrade mlctl
   ```
    

## Usage

### Optional Setup

mlctl requires users to specify a plugin and a profile/credentials file for authenticating operations. These values can either be stored as environment variables, as shown below, or passed as command-line options. Use `--help` for more details.

```
export PLUGIN=
export PROFILE=
```
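For instance, with the SageMaker plugin mentioned above and a local AWS credentials profile (both values are illustrative; substitute your own):

```shell
# Hypothetical values for illustration only
export PLUGIN=sagemaker
export PROFILE=default
```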

### Commands

mlctl CLI commands have the following structure:

```
mlctl <command> <subcommand> [OPTIONS]
```

To view help documentation, run the following:

```
mlctl --help
mlctl <command> --help
mlctl <command> <subcommand> --help
```

#### Initialize ML Model

```
mlctl init [OPTIONS]
```

| Option | Description |
| --- | --- |
| `--template` or `-t` | (Optional) GitHub location of the project template. |
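For example, scaffolding a project from a template repository (the URL below is hypothetical):

```
mlctl init -t https://github.com/example-org/mlctl-project-template
```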

#### Training Commands

```
mlctl train <subcommand> [OPTIONS]
```

| Subcommand | Description |
| --- | --- |
| `start` | Train a model |
| `stop` | Stop an ongoing training job |
| `info` | Get training job information |
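A typical sequence with these subcommands might look like the following sketch (options omitted; the exact flags depend on the chosen plugin, see `mlctl train --help`):

```
mlctl train start   # launch a training job
mlctl train info    # check the job's status
mlctl train stop    # cancel the job if needed
```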

#### Hosting Commands

```
mlctl hosting <subcommand> [OPTIONS]
```

| Subcommand | Description |
| --- | --- |
| `create` | Create a model from a trained model artifact |
| `deploy` | Deploy a model to create an endpoint for inference |
| `undeploy` | Undeploy a model |
| `info` | Get endpoint information |
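A sketch of the hosting lifecycle using these subcommands (options omitted; see `mlctl hosting --help` for plugin-specific flags):

```
mlctl hosting create    # register a model from a trained artifact
mlctl hosting deploy    # create an inference endpoint
mlctl hosting info      # inspect the endpoint
mlctl hosting undeploy  # tear the endpoint down
```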

#### Batch Inference Commands

```
mlctl batch <subcommand> [OPTIONS]
```

| Subcommand | Description |
| --- | --- |
| `start` | Perform batch inference |
| `stop` | Stop an ongoing batch inference job |
| `info` | Get batch inference job information |
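As with training, a batch inference job can be sketched as (options omitted; see `mlctl batch --help`):

```
mlctl batch start   # kick off a batch inference job
mlctl batch info    # check its progress
mlctl batch stop    # cancel it if needed
```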

## Examples
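A possible end-to-end flow combining the commands above (all options omitted for brevity; the exact flags depend on the chosen plugin):

```
mlctl init              # scaffold a new project
mlctl train start       # train a model
mlctl train info        # wait for training to finish
mlctl hosting create    # register the trained model
mlctl hosting deploy    # stand up an inference endpoint
mlctl batch start       # or run batch inference instead
mlctl hosting undeploy  # clean up the endpoint
```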

## Contributing

For information on how to contribute to mlctl, please read through the contributing guidelines.
