Azure Machine Learning Parallel Run Step - Deprecated
Project description
Note
This package has been deprecated and moved to azureml-pipeline-steps. Please refer to the azureml-pipeline-steps documentation for more information.
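For code that still imports from this package, migration is typically just an import change. The sketch below assumes your code uses ParallelRunStep and ParallelRunConfig, which live under azureml.pipeline.steps in the successor package:

```python
# Deprecated import from this package (azureml-contrib-pipeline-steps):
# from azureml.contrib.pipeline.steps import ParallelRunConfig, ParallelRunStep

# Replacement import after `pip install azureml-pipeline-steps`:
from azureml.pipeline.steps import ParallelRunConfig, ParallelRunStep
```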
Azure Machine Learning Batch Inference
Azure Machine Learning Batch Inference targets large inference jobs that are not time-sensitive. It provides cost-effective, scalable inference compute and is optimized for high-throughput, fire-and-forget inference over large collections of data.
Getting Started with Batch Inference Public Preview
The batch inference public preview offers a platform for running large-scale inference jobs or generic parallel map-style operations. Please visit the Azure Machine Learning Notebooks repository for tutorials on how to use this service.
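The typical workflow is to wrap a scoring script in a ParallelRunConfig and submit it as a ParallelRunStep in a pipeline. Below is a minimal sketch using the successor package azureml-pipeline-steps; the workspace config file, the "cpu-cluster" compute target, the "AzureML-Minimal" curated environment, the "images/**" datastore path, and the batch_score.py entry script are all illustrative assumptions, not part of this package's documentation:

```python
from azureml.core import Dataset, Environment, Workspace
from azureml.pipeline.core import Pipeline, PipelineData
from azureml.pipeline.steps import ParallelRunConfig, ParallelRunStep

# Assumes a config.json workspace file and an existing AmlCompute
# cluster named "cpu-cluster" (both hypothetical for this sketch).
ws = Workspace.from_config()
datastore = ws.get_default_datastore()

# Input files to score; the "images/**" path is a placeholder.
input_dataset = Dataset.File.from_files(path=(datastore, "images/**"))

# How each node runs the scoring script. "batch_score.py" is assumed to
# define init() and run(mini_batch), the contract ParallelRunStep expects.
parallel_run_config = ParallelRunConfig(
    source_directory="scripts",
    entry_script="batch_score.py",
    mini_batch_size="10",            # files handed to each run() call
    error_threshold=10,              # tolerated failed items before aborting
    output_action="append_row",      # concatenate run() results into one file
    environment=Environment.get(ws, name="AzureML-Minimal"),
    compute_target=ws.compute_targets["cpu-cluster"],
    node_count=2,
)

output = PipelineData(name="inference_output", datastore=datastore)

step = ParallelRunStep(
    name="batch-inference",
    parallel_run_config=parallel_run_config,
    inputs=[input_dataset.as_named_input("input_files")],
    output=output,
)

pipeline = Pipeline(workspace=ws, steps=[step])
run = pipeline.submit(experiment_name="batch-inference-demo")
run.wait_for_completion(show_output=True)
```

Because output_action="append_row" concatenates the values returned by each run(mini_batch) call into a single output file, the entry script only needs to return a list of result rows per mini-batch; this is what makes the pattern fire-and-forget at scale.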
Hashes for azureml_contrib_pipeline_steps-1.45.0-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 24ca88da0734e4735bd992b03f7b6311e477a42d667ea456709ec81f807f78b0
MD5 | 943b92c9fd85bf5107721bb5112f08f6
BLAKE2b-256 | 269cfbcb5c2b987fa39d1396f2d07b6466ce23022c652fd011f160a48327d6f3