
Microsoft Azure Machine Learning Client Library for Python

Project description

Azure ML Package client library for Python

We are excited to announce the general availability (GA) of the Azure Machine Learning Python SDK v2. The Python SDK v2 introduces new capabilities such as standalone local jobs, reusable components for pipelines, and managed online/batch inferencing. It lets you move from simple to complex tasks easily and incrementally, enabled by a common object model that brings concept reuse and consistency of actions across various tasks. The SDK v2 shares its foundation with the CLI v2, which is also GA.

Source code | Package (PyPI) | API reference documentation | Product documentation | Samples

This package has been tested with Python 3.7, 3.8, 3.9 and 3.10.

For a more complete set of Azure libraries, see https://aka.ms/azsdk/python/all

Getting started

Prerequisites

Install the package

Install the Azure ML client library for Python with pip:

pip install azure-ai-ml

Authenticate the client

from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Replace the placeholders with the details of your Azure subscription
# and Azure Machine Learning workspace.
subscription_id = "<SUBSCRIPTION_ID>"
resource_group = "<RESOURCE_GROUP>"
workspace = "<AML_WORKSPACE_NAME>"

ml_client = MLClient(
    DefaultAzureCredential(), subscription_id, resource_group, workspace
)

Key concepts

Azure Machine Learning Python SDK v2 comes with many new features like standalone local jobs, reusable components for pipelines and managed online/batch inferencing. The SDK v2 brings consistency and ease of use across all assets of the platform. The Python SDK v2 offers the following capabilities:

  • Run Standalone Jobs - run a discrete ML activity as a Job. Jobs can be run locally or in the cloud. We currently support the following types of jobs:
    • Command - run a command (Python, R, Windows Command, Linux Shell, etc.)
    • Sweep - run a hyperparameter sweep on your Command
  • Run multiple jobs using our improved Pipelines
    • Run a series of commands stitched into a pipeline (New)
    • Components - run pipelines using reusable components (New)
  • Use your models for Managed Online inferencing (New)
  • Use your models for Managed batch inferencing
  • Manage AML resources - workspace, compute, datastores
  • Manage AML assets - datasets, environments, models
  • AutoML - run standalone AutoML training for various ML tasks:
    • Classification (Tabular data)
    • Regression (Tabular data)
    • Time Series Forecasting (Tabular data)
    • Image Classification (Multi-class) (New)
    • Image Classification (Multi-label) (New)
    • Image Object Detection (New)
    • Image Instance Segmentation (New)
    • NLP Text Classification (Multi-class) (New)
    • NLP Text Classification (Multi-label) (New)
    • NLP Text Named Entity Recognition (NER) (New)

Examples

Troubleshooting

General

Azure ML clients raise exceptions defined in Azure Core.

from azure.core.exceptions import HttpResponseError

try:
    ml_client.compute.get("cpu-cluster")
except HttpResponseError as error:
    print("Request failed: {}".format(error.message))

Logging

This library uses the standard logging library for logging. Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level.

Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the logging_enable argument.
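For example, DEBUG-level output can be wired up with the standard logging module; "azure" is the root logger name used by Azure SDK client libraries:

```python
import logging
import sys

# Route Azure SDK log records to stdout at DEBUG level.
logger = logging.getLogger("azure")
logger.setLevel(logging.DEBUG)
handler = logging.StreamHandler(stream=sys.stdout)
handler.setFormatter(logging.Formatter("%(name)s [%(levelname)s]: %(message)s"))
logger.addHandler(handler)

# Pass logging_enable=True when constructing the client (or on a single
# operation) to include request/response bodies and unredacted headers:
# ml_client = MLClient(credential, subscription_id, resource_group, workspace,
#                      logging_enable=True)
```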

See full SDK logging documentation with examples here.

Telemetry

The Azure ML Python SDK includes a telemetry feature that collects usage and failure data about the SDK and sends it to Microsoft when you use the SDK in a Jupyter Notebook only. Telemetry will not be collected for any use of the Python SDK outside of a Jupyter Notebook.

Telemetry data helps the SDK team understand how the SDK is used so it can be improved and the information about failures helps the team resolve problems and fix bugs. The SDK telemetry feature is enabled by default for Jupyter Notebook usage. To opt out of the telemetry feature, set the AZUREML_SDKV2_TELEMETRY_OPTOUT environment variable to '1' or 'true'.
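For example, to opt out for the current session before launching a notebook (shown for a POSIX shell; setting the variable to '1' or 'true' has the same effect):

```shell
# Disable SDK v2 telemetry collection for this shell session.
export AZUREML_SDKV2_TELEMETRY_OPTOUT=1
```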

Next steps

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Release History

1.1.2 (2022-11-21)

Features Added

  • Restored idle_time_before_shutdown property for Compute Instances.
  • Deprecated idle_time_before_shutdown property in favor of idle_time_before_shutdown_minutes.

Bugs Fixed

  • Fixed idle_time_before_shutdown appearing as None for Compute Instances returned by show or list.
  • Fixed idle_time_before_shutdown_minutes preventing creation of Compute Instances when set to None.

1.1.1 (2022-11-15)

Breaking Changes

  • Renamed idle_time_before_shutdown to idle_time_before_shutdown_minutes and changed input type to int.

Bugs Fixed

  • Fixed idle_time_before_shutdown_minutes not appearing in GET calls for Compute Instances.

1.1.0 (2022-11-07)

Features Added

  • Registry list operation now accepts scope value to allow subscription-only based requests.
  • Most configuration classes from the entity package now implement the standard mapping protocol.
  • Add registry delete operation.
  • The values of JobService.job_service_type now use snake case, e.g. jupyter_lab, ssh, tensor_board, vs_code.
  • The command function now accepts a services param of type Dict[str, JobService] instead of dict.

Bugs Fixed

  • MLClient.from_config can now find the default config.json on Compute Instance when running sample notebooks.
  • Fixed job inputs not accepting datastores or job inputs.
  • Registries now assign managed tags to match registry's tags.
  • Adjust registry experimental tags and imports to avoid warning printouts for unrelated operations.
  • Make registry delete operation return an LROPoller, and change name to begin_delete.
  • Prevent registering an already existing environment that references a conda file.

Other Changes

  • Switched compute operations to go through 2022-10-01-preview API version.

1.0.0 (2022-10-10)

  • GA release
  • Dropped support for Python 3.6. The Python versions supported for this release are 3.7-3.10.

Features Added

Breaking Changes

  • OnlineDeploymentOperations.delete has been renamed to begin_delete.
  • Datastore credentials are switched to use unified credential configuration classes.
  • UserAssignedIdentity is replaced by ManagedIdentityConfiguration
  • Endpoint and Job use unified identity classes.
  • Workspace ManagedServiceIdentity has been replaced by IdentityConfiguration.

Other Changes

  • Switched Compute operations to use Oct preview API version.
  • Updated batch deployment/endpoint invoke and list-jobs function signatures with curated BatchJob class.

0.1.0b8 (2022-10-07)

Features Added

  • Support passing JobService as argument to Command()
  • Added support for custom setup scripts on compute instances.
  • Added a show_progress parameter to MLClient to enable/disable progress bars for long-running operations.
  • Support month_days in RecurrencePattern when using RecurrenceSchedule.
  • Support ml_client.schedules.list with list_view_type, default to ENABLED_ONLY.
  • Add support for model sweeping and hyperparameter tuning in AutoML NLP jobs.
  • Added ml_client.jobs.show_services() operation.

Breaking Changes

  • ComputeOperations.attach has been renamed to begin_attach.
  • Deprecated parameter path has been removed from load and dump methods.
  • JobOperations.cancel() is renamed to JobOperations.begin_cancel() and returns an LROPoller.
  • Workspace.list_keys renamed to Workspace.get_keys.

Bugs Fixed

  • Fix identity passthrough job with single file code

Other Changes

  • Removed declaration on Python 3.6 support
  • Updated dependencies upper bounds to be major versions.

0.1.0b7 (2022-09-22)

Features Added

  • Spark job submission.
  • Command and sweep job docker config (shmSize and dockerArgs) spec support.
  • Entity load and dump now also accept a file pointer as input.
  • Load and dump input names changed from 'path' to 'source' and 'dest', respectively.
  • Load and dump 'path' input still works, but is deprecated and emits a warning.
  • Managed Identity Support for Compute Instance (experimental).
  • Enable using @dsl.pipeline without brackets when there are no additional parameters.
  • Expose Azure subscription Id and resource group name from MLClient objects.
  • Added Idle Shutdown support for Compute Instances, allowing instances to shutdown after a set period of inactivity.
  • Online Deployment Data Collection for eventhub and data storage will be supported.
  • Syntax validation on scoring scripts of Batch Deployment and Online Deployment will prevent the user from submitting bad deployments.

Breaking Changes

  • Change (begin_)create_or_update typehints to use generics.
  • Remove invalid option from create_or_update typehints.
  • Change error returned by (begin_)create_or_update invalid input to TypeError.
  • Rename set_image_model APIs for all vision tasks to set_training_parameters
  • JobOperations.download defaults to "." instead of Path.cwd()

Bugs Fixed

Other Changes

  • Show 'properties' on data assets

0.1.0b6 (2022-08-09)

Features Added

  • Support for AutoML Component
  • Added skip_validation for Job/Component create_or_update

Breaking Changes

  • Dataset removed from public interface.

Bugs Fixed

  • Fixed mismatch errors when updating scale_settings for KubernetesOnlineDeployment.
  • Removed az CLI command that was printed when deleting OnlineEndpoint

0.1.0b5 (2022-07-15)

Features Added

  • Allow Input/Output objects to be used by CommandComponent.
  • Added MoonCake cloud support.
  • Unified inputs/outputs building and validation logic in BaseNode.
  • Allow Git repo URLs to be used as code for jobs and components.
  • Updated AutoML YAML schema to use InputSchema.
  • Added end_time to job schedule.
  • MIR and pipeline job now support registry assets.

Bugs Fixed

  • Have mldesigner use argparser to parse incoming args.
  • Bumped pyjwt version to <3.0.0.
  • Reverted "upload support for symlinks".
  • Error message improvement when a YAML UnionField fails to match.
  • Reintroduced support for symlinks when uploading.
  • Hard coded registry base URL to eastus region to support preview.

0.1.0b4 (2022-06-16)

0.1.0b3 (2022-05-24)

Features Added

  • First preview.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

azure-ai-ml-1.1.2.zip (4.6 MB)


Built Distribution

azure_ai_ml-1.1.2-py3-none-any.whl (4.0 MB)


File details

Details for the file azure-ai-ml-1.1.2.zip.

File metadata

  • Download URL: azure-ai-ml-1.1.2.zip
  • Size: 4.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.28.1 setuptools/58.1.0 requests-toolbelt/0.10.1 tqdm/4.64.1 CPython/3.9.15

File hashes

Hashes for azure-ai-ml-1.1.2.zip:

  • SHA256: b688434ef2865e3116ea191041425570d97acb0710b95915e0be605f0676478f
  • MD5: 7a8d0418c88ca0ac018e73417f06f8ab
  • BLAKE2b-256: 0344a603abdb664397ce72f458145a50e6fa9690b2319f6545bbbbf594ab98c2

See more details on using hashes here.

File details

Details for the file azure_ai_ml-1.1.2-py3-none-any.whl.

File metadata

  • Download URL: azure_ai_ml-1.1.2-py3-none-any.whl
  • Size: 4.0 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.28.1 setuptools/58.1.0 requests-toolbelt/0.10.1 tqdm/4.64.1 CPython/3.9.15

File hashes

Hashes for azure_ai_ml-1.1.2-py3-none-any.whl:

  • SHA256: cb5bc960efe3a558b0825671ef0a318774daccbacd98e4a6f41bcd33d81a474e
  • MD5: 3650f1810f45e069bb2bc00df751d6a0
  • BLAKE2b-256: ec7021a4994e23d9dd3ba6351f99d8c9092353282d616184ab43879b2317fe4c

See more details on using hashes here.
