Azure Machine Learning Parallel Run Step - Deprecated
Project description
Note
This package has been deprecated and its functionality moved to azureml-pipeline-steps. Please refer to the azureml-pipeline-steps documentation for more information.
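For code that imported the parallel-run classes from this package, the replacements live under azureml.pipeline.steps. A minimal migration sketch, assuming azureml-pipeline-steps is installed (e.g. via pip install azureml-pipeline-steps); the deprecated import path in the comment is shown for comparison and may vary by SDK version:

```python
# Deprecated import path (this contrib package):
#   from azureml.contrib.pipeline.steps import ParallelRunConfig, ParallelRunStep
# Replacement import path (azureml-pipeline-steps):
from azureml.pipeline.steps import ParallelRunConfig, ParallelRunStep
```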
Azure Machine Learning Batch Inference
Azure Machine Learning Batch Inference targets large inference jobs that are not time-sensitive. Batch Inference provides cost-effective inference compute scaling, with unparalleled throughput for asynchronous applications. It is optimized for high-throughput, fire-and-forget inference over large collections of data.
Getting Started with Batch Inference Public Preview
The Batch Inference public preview offers a platform for large-scale inference and generic parallel map-style operations. Please visit the Azure Machine Learning Notebooks repository for tutorials on how to leverage this service; a minimal pipeline sketch follows below.
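For orientation, here is a minimal, illustrative sketch of a batch-inference pipeline built with the replacement azureml-pipeline-steps package. The workspace config, dataset name ("my-dataset"), compute target ("cpu-cluster"), curated environment ("AzureML-Minimal"), and entry script ("score.py") are all placeholder assumptions, not values defined by this package:

```python
from azureml.core import Dataset, Environment, Workspace
from azureml.pipeline.core import Pipeline, PipelineData
from azureml.pipeline.steps import ParallelRunConfig, ParallelRunStep

ws = Workspace.from_config()  # assumes a config.json in the working directory

# Placeholder dataset and output location; substitute your own resources.
input_ds = Dataset.get_by_name(ws, name="my-dataset")
output_dir = PipelineData(name="inferences", datastore=ws.get_default_datastore())

parallel_run_config = ParallelRunConfig(
    source_directory=".",
    entry_script="score.py",     # must define init() and run(mini_batch)
    mini_batch_size="5",         # files per mini-batch for a FileDataset; e.g. "1MB" for tabular data
    error_threshold=10,          # failed items tolerated before the job aborts
    output_action="append_row",  # concatenate run() results into a single output file
    environment=Environment.get(ws, name="AzureML-Minimal"),
    compute_target=ws.compute_targets["cpu-cluster"],
    node_count=2,
    process_count_per_node=2,
)

batch_step = ParallelRunStep(
    name="batch-inference",
    parallel_run_config=parallel_run_config,
    inputs=[input_ds.as_named_input("input_data")],
    output=output_dir,
    allow_reuse=True,
)

pipeline = Pipeline(workspace=ws, steps=[batch_step])
run = pipeline.submit(experiment_name="batch-inference-demo")
run.wait_for_completion(show_output=True)
```

The entry script named in ParallelRunConfig supplies the per-mini-batch logic: init() runs once per worker process (typically to load a model), and run(mini_batch) is called repeatedly, returning a list or DataFrame of results that the output_action setting aggregates.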
Download files
Built Distribution
Hashes for azureml_contrib_pipeline_steps-1.50.0-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6763d2fd87bf104d064c18b99116edf79f57c4a9282a9849e577962cac4ec856 |
| MD5 | a4f66b6d17969c2f9a71d47d74254a9c |
| BLAKE2b-256 | e6aa6eec896c199e3c749ae6a0d99c2d9ca61f5556d0c7eb77aa6171a63075e7 |
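These digests can be used to check the integrity of a downloaded wheel. A short sketch using Python's standard hashlib, assuming the wheel file is in the current directory:

```python
import hashlib

# SHA256 digest published above for this release's wheel.
EXPECTED_SHA256 = "6763d2fd87bf104d064c18b99116edf79f57c4a9282a9849e577962cac4ec856"

with open("azureml_contrib_pipeline_steps-1.50.0-py3-none-any.whl", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED_SHA256, f"hash mismatch: {digest}"
print("hash OK")
```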