
Azure Machine Learning Parallel Run Step - Deprecated

Project description

Note

This package has been deprecated; its functionality has moved to azureml-pipeline-steps. Refer to the azureml-pipeline-steps documentation for more information.

Azure Machine Learning Batch Inference

Azure Machine Learning Batch Inference targets large inference jobs that are not time-sensitive. Batch Inference provides cost-effective, scalable inference compute with high throughput for asynchronous applications. It is optimized for high-throughput, fire-and-forget inference over large collections of data.

Getting Started with Batch Inference Public Preview

The Batch Inference public preview offers a platform for running large-scale inference or generic parallel map-style operations. Visit Azure Machine Learning Notebooks to find tutorials on how to use this service.

Project details



Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distribution


azureml_contrib_pipeline_steps-1.62.0-py3-none-any.whl (17.7 kB)

Uploaded Python 3

File details

Details for the file azureml_contrib_pipeline_steps-1.62.0-py3-none-any.whl.

File metadata

File hashes

Hashes for azureml_contrib_pipeline_steps-1.62.0-py3-none-any.whl
Algorithm Hash digest
SHA256 00cf94e41dc13a7917c0194cd2bc1fe9573a9da357903112d9a5fce2a3f3b0e5
MD5 68c874f2a398564e00d56769851a15c4
BLAKE2b-256 e40ac349cccc2924edebe6fde373e5882eda568f436f15831c3d4a1156b159e1

These hashes can be used to verify the integrity of the downloaded file.
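As a minimal sketch of how the published SHA256 digest above can be checked after downloading the wheel, using only Python's standard hashlib module (the local file path is an assumption for illustration):

```python
import hashlib

# Published SHA256 for azureml_contrib_pipeline_steps-1.62.0-py3-none-any.whl
EXPECTED_SHA256 = "00cf94e41dc13a7917c0194cd2bc1fe9573a9da357903112d9a5fce2a3f3b0e5"

def sha256_of(path):
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_wheel(path, expected=EXPECTED_SHA256):
    """Return True if the file's SHA-256 digest matches the published hash."""
    return sha256_of(path) == expected
```

The same pattern works for the MD5 or BLAKE2b digests by swapping in `hashlib.md5()` or `hashlib.blake2b(digest_size=32)`.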
