Microsoft Azure Machine Learning Client Library for Python

Project description

Azure ML Package client library for Python

We are excited to introduce the public preview of the Azure Machine Learning Python SDK v2. The Python SDK v2 introduces new capabilities such as standalone local jobs, reusable components for pipelines, and managed online/batch inferencing. It lets you move from simple to complex tasks easily and incrementally, enabled by a common object model that brings concept reuse and consistency of actions across tasks. The SDK v2 shares its foundation with the CLI v2, which is also currently in public preview.

Source code | Package (PyPI) | API reference documentation | Product documentation | Samples

This package has been tested with Python 3.6, 3.7, 3.8, 3.9 and 3.10.

For a more complete set of Azure libraries, see https://aka.ms/azsdk/python/all

Getting started

Prerequisites

Install the package

Install the Azure ML client library for Python with pip:

pip install azure-ai-ml

Authenticate the client

from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Replace these placeholders with your own Azure details.
subscription_id = "<SUBSCRIPTION_ID>"
resource_group = "<RESOURCE_GROUP>"
workspace = "<AML_WORKSPACE_NAME>"

ml_client = MLClient(
    DefaultAzureCredential(), subscription_id, resource_group, workspace
)

Key concepts

Azure Machine Learning Python SDK v2 comes with many new features, such as standalone local jobs, reusable components for pipelines, and managed online/batch inferencing. The SDK v2 brings consistency and ease of use across all assets of the platform, and offers the following capabilities:

  • Run Standalone Jobs - run a discrete ML activity as a Job. Jobs can be run locally or in the cloud. We currently support the following job types:
    • Command - run a command (Python, R, Windows Command, Linux Shell etc.)
    • Sweep - run a hyperparameter sweep on your Command
  • Run multiple jobs using our improved Pipelines
    • Run a series of commands stitched into a pipeline (New)
    • Components - run pipelines using reusable components (New)
  • Use your models for Managed Online inferencing (New)
  • Use your models for Managed batch inferencing
  • Manage AML resources – workspace, compute, datastores
  • Manage AML assets - Datasets, environments, models
  • AutoML - run standalone AutoML training for various ML tasks:
    • Classification (Tabular data)
    • Regression (Tabular data)
    • Time Series Forecasting (Tabular data)
    • Image Classification (Multi-class) (New)
    • Image Classification (Multi-label) (New)
    • Image Object Detection (New)
    • Image Instance Segmentation (New)
    • NLP Text Classification (Multi-class) (New)
    • NLP Text Classification (Multi-label) (New)
    • NLP Text Named Entity Recognition (NER) (New)

Examples

Troubleshooting

General

Azure ML clients raise exceptions defined in Azure Core.

from azure.core.exceptions import HttpResponseError

try:
    ml_client.compute.get("cpu-cluster")
except HttpResponseError as error:
    print("Request failed: {}".format(error.message))

Logging

This library uses the standard logging library for logging. Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level.

Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the logging_enable argument.
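As a sketch of that setup using only the standard library (the "azure" logger name is the root logger shared by Azure SDK client libraries; the MLClient line is a hypothetical usage, shown commented out):

```python
import logging
import sys

# Route Azure SDK log output to stdout.
logger = logging.getLogger("azure")
handler = logging.StreamHandler(stream=sys.stdout)
handler.setFormatter(logging.Formatter("%(name)s [%(levelname)s]: %(message)s"))
logger.addHandler(handler)

# INFO gives URLs and redacted headers; DEBUG adds request/response
# bodies and unredacted headers, but only for clients constructed
# with logging_enable=True.
logger.setLevel(logging.DEBUG)
# ml_client = MLClient(credential, subscription_id, resource_group,
#                      workspace, logging_enable=True)
```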

See the full Azure SDK logging documentation for more examples.

Telemetry

The Azure ML Python SDK includes a telemetry feature that collects usage and failure data about the SDK and sends it to Microsoft when you use the SDK. Telemetry data helps the SDK team understand how the SDK is used so it can be improved, and the failure information helps the team resolve problems and fix bugs. Telemetry is enabled by default; to opt out, set the AZUREML_SDKV2_TELEMETRY_OPTOUT environment variable to 1 or true.
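For example, the opt-out can be set from Python before the SDK is used (exporting the variable in your shell profile works equally well):

```python
import os

# Disable Azure ML SDK v2 telemetry for this process and its children.
# Must be set before the SDK sends any telemetry.
os.environ["AZUREML_SDKV2_TELEMETRY_OPTOUT"] = "1"
```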

Next steps

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Release History

0.1.0b8 (In Progress)

Features Added

  • Support passing JobService as argument to Command()
  • Added support for custom setup scripts on compute instances.
  • Added a show_progress parameter to MLClient to enable/disable progress bars for long-running operations.
  • Support month_days in RecurrencePattern when using RecurrenceSchedule.
  • Support ml_client.schedules.list with list_view_type, default to ENABLED_ONLY.

Breaking Changes

  • ComputeOperations.attach has been renamed to begin_attach.
  • Deprecated parameter path has been removed from load and dump methods.
  • JobOperations.cancel() has been renamed to JobOperations.begin_cancel(); it now returns an LROPoller.
  • Workspace.list_keys renamed to Workspace.get_keys.

Bugs Fixed

  • Fix identity passthrough job with single file code

Other Changes

  • Removed declaration on Python 3.6 support
  • Updated dependencies upper bounds to be major versions.

0.1.0b7 (2022-09-22)

Features Added

  • Spark job submission.
  • Command and sweep job docker config (shmSize and dockerArgs) spec support.
  • Entity load and dump now also accept a file pointer as input.
  • Load and dump input names changed from 'path' to 'source' and 'dest', respectively.
  • The 'path' input to load and dump still works, but is deprecated and emits a warning.
  • Managed Identity Support for Compute Instance (experimental).
  • Enable using @dsl.pipeline without brackets when no additional parameters.
  • Expose Azure subscription Id and resource group name from MLClient objects.
  • Added Idle Shutdown support for Compute Instances, allowing instances to shutdown after a set period of inactivity.
  • Online Deployment Data Collection for eventhub and data storage will be supported.
  • Syntax validation on scoring scripts of Batch Deployment and Online Deployment will prevent the user from submitting bad deployments.

Breaking Changes

  • Change (begin_)create_or_update typehints to use generics.
  • Remove invalid option from create_or_update typehints.
  • Change error returned by (begin_)create_or_update invalid input to TypeError.
  • Rename set_image_model APIs for all vision tasks to set_training_parameters
  • JobOperations.download defaults to "." instead of Path.cwd()

Bugs Fixed

Other Changes

  • Show 'properties' on data assets

0.1.0b6 (2022-08-09)

Features Added

  • Support for AutoML Component
  • Added skip_validation for Job/Component create_or_update

Breaking Changes

  • Dataset removed from public interface.

Bugs Fixed

  • Fixed mismatch errors when updating scale_settings for KubernetesOnlineDeployment.
  • Removed az CLI command that was printed when deleting OnlineEndpoint

Other Changes

0.1.0b5 (2022-07-15)

Features Added

  • Allow Input/Output objects to be used by CommandComponent.
  • Added MoonCake cloud support.
  • Unified inputs/outputs building and validation logic in BaseNode.
  • Allow Git repo URLs to be used as code for jobs and components.
  • Updated AutoML YAML schema to use InputSchema.
  • Added end_time to job schedule.
  • MIR and pipeline job now support registry assets.

Breaking Changes

Bugs Fixed

  • Have mldesigner use argparser to parse incoming args.
  • Bumped pyjwt version to <3.0.0.
  • Reverted "upload support for symlinks".
  • Error message improvement when a YAML UnionField fails to match.
  • Reintroduced support for symlinks when uploading.
  • Hard coded registry base URL to eastus region to support preview.

0.1.0b4 (2022-06-16)

0.1.0b3 (2022-05-24)

Features Added

  • First preview.

Download files

Download the file for your platform.

Source Distribution

azure-ai-ml-0.1.0b8.zip (4.4 MB)


Built Distribution

azure_ai_ml-0.1.0b8-py3-none-any.whl (3.9 MB)


File details

Details for the file azure-ai-ml-0.1.0b8.zip.

File metadata

  • Download URL: azure-ai-ml-0.1.0b8.zip
  • Size: 4.4 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.28.1 setuptools/58.1.0 requests-toolbelt/0.10.0 tqdm/4.64.1 CPython/3.9.14

File hashes

Hashes for azure-ai-ml-0.1.0b8.zip:

  • SHA256: afdae22f13bb99a89f404b7a77c3f7a0e479c29b9a629c7c43b92c6e00b0fc4d
  • MD5: adb498a98f5ccedbef639e50c9320efb
  • BLAKE2b-256: 7f8ab1a94ae4f1b4f95a433b1945069f0bd9c1939452435c98a9f8bacdde7cff


File details

Details for the file azure_ai_ml-0.1.0b8-py3-none-any.whl.

File metadata

  • Download URL: azure_ai_ml-0.1.0b8-py3-none-any.whl
  • Size: 3.9 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.28.1 setuptools/58.1.0 requests-toolbelt/0.10.0 tqdm/4.64.1 CPython/3.9.14

File hashes

Hashes for azure_ai_ml-0.1.0b8-py3-none-any.whl:

  • SHA256: 22ef5a3bfc96873d30cb695b4d31ded0c9a1e682098476971f8fd433ba6952fa
  • MD5: 3b7795f7217199b5d591b59f6af6fe31
  • BLAKE2b-256: 5b35325623f07972c8db2f073f574ba8367469b082c8419614d18d9f402b30c4

