
Generic Azure integration layer for dl-core.

Project description

deep-learning-azure

Public Azure integration layer for deep-learning-core.

deep-learning-azure adds Azure ML execution, Azure storage helpers, and Azure-oriented dataset wrappers on top of deep-learning-core.

Install it directly or through the deep-learning-core[azure] extra. The package is kept separate so Azure-specific dependencies and scaffold wiring do not leak into plain deep-learning-core installations.

Install

Install from PyPI through the core extra:

pip install "deep-learning-core[azure]"

Install the package directly:

pip install deep-learning-azure

Install in a uv project:

uv add "deep-learning-core[azure]" deep-learning-azure

Scope

  • Azure ML executor
  • Azure storage helpers and AzCopy wrappers
  • Azure dataset wrappers
  • Azure experiment scaffold integration through dl-init --with-azure

Out Of Scope

  • Generic trainer, dataset, and metric abstractions
  • Public framework defaults
  • Concrete experiment repositories

Quick Start

Install it into an experiment repository through the Azure extra:

uv add "deep-learning-core[azure]" deep-learning-azure

If the repository was scaffolded with dl-init --with-azure, the experiment package imports dl_azure automatically so its executor and generic dataset wrappers register at runtime; the scaffold also creates azure-config.json.

The Azure executor is sweep-oriented. Use uv run dl-sweep experiments/lr_sweep.yaml --dry-run before the first real submission in a new repository.

Concrete experiment flow:

uv init
uv add deep-learning-azure
uv run dl-init --root-dir . --with-azure
uv run dl-core add dataset AzureSeq --base azure_compute_multiframe
uv run dl-sweep experiments/lr_sweep.yaml --dry-run

Tracker naming defaults to the repository root name. If you want Azure job submission and Azure MLflow to use a different destination name, set tracking.experiment_name in your sweep config.
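
A minimal sweep-config fragment showing that override might look like the following. Only the tracking.experiment_name key comes from this package's docs; the filename, surrounding keys, and value are illustrative:

```yaml
# experiments/lr_sweep.yaml (illustrative fragment)
tracking:
  experiment_name: my-azure-experiment   # overrides the repo-root-name default
```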

Azure submissions automatically rewrite the default local runtime.output_dir from artifacts to outputs/artifacts inside the remote job. That keeps checkpoints, plots, metrics, and other run files under Azure ML's managed output directory without changing the local default artifact layout.
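
As a sketch of that rewrite (an illustrative reimplementation, not the package's actual code; the nested-dict config shape is an assumption):

```python
# Sketch of the output-dir rewrite applied to a run config before Azure
# submission. Only the default local value is rewritten; explicit custom
# paths are left untouched.

def rewrite_output_dir_for_azure(config: dict) -> dict:
    """Route the default local artifact dir under Azure ML's managed outputs/."""
    runtime = config.setdefault("runtime", {})
    if runtime.get("output_dir", "artifacts") == "artifacts":
        runtime["output_dir"] = "outputs/artifacts"
    return config

cfg = {"runtime": {"output_dir": "artifacts"}}
print(rewrite_output_dir_for_azure(cfg)["runtime"]["output_dir"])  # outputs/artifacts
```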

When you analyze an Azure-backed sweep with dl-analyze, the Azure metrics source fetches only the metric histories requested on the CLI, for example:

uv run dl-analyze --sweep experiments/lr_sweep.yaml \
  --metric test/eer --mode min \
  --metric test/accuracy --mode max \
  --rank-method rank-sum

Those fetched metric histories are cached in analysis_cache.json next to sweep_tracking.json. Use --force to refresh them.
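
The caching behavior can be pictured with a small sketch. The file layout comes from the text above; the fetch callable and the metric-name cache key are assumptions for illustration:

```python
import json
from pathlib import Path


def load_metric_history(metric: str, cache_path: Path, fetch, force: bool = False):
    """Return a cached metric history, calling fetch() only on a miss or --force."""
    cache = {}
    if cache_path.exists():
        cache = json.loads(cache_path.read_text())
    if force or metric not in cache:
        cache[metric] = fetch(metric)  # e.g. a call into the Azure metrics source
        cache_path.write_text(json.dumps(cache))
    return cache[metric]
```

A second invocation with the same metric is then served from analysis_cache.json without touching Azure.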

If you want the tracked Azure job outputs locally after the sweep finishes, run:

uv run dl-sync --sweep experiments/lr_sweep.yaml --artifacts

That downloads the Azure job bundle for each tracked run and patches sweep_tracking.json with the resolved local artifact paths.
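
The patching step can be sketched as follows; the sweep_tracking.json schema shown here is an assumption for illustration, not the file's actual format:

```python
import json
from pathlib import Path


def patch_local_artifacts(tracking_path: Path, local_paths: dict) -> None:
    """Record resolved local artifact paths for each tracked run."""
    tracking = json.loads(tracking_path.read_text())
    for run in tracking.get("runs", []):  # assumed schema: {"runs": [{"id": ...}]}
        if run["id"] in local_paths:
            run["local_artifacts"] = local_paths[run["id"]]
    tracking_path.write_text(json.dumps(tracking, indent=2))
```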

Concrete dataset scaffold examples:

uv run dl-core add dataset AzureImages --base azure_compute
uv run dl-core add dataset AzureFrames --base azure_compute_frame
uv run dl-core add dataset AzureSeq --base azure_compute_multiframe
uv run dl-core add dataset AzureStream --base azure_streaming
uv run dl-core add dataset AzureStreamSeq --base azure_streaming_multiframe

What You Get

  • the Azure executor
  • Azure storage helpers and AzCopy wrappers
  • generic Azure dataset foundations: AzureComputeWrapper, AzureStreamingWrapper, AzureComputeFrameWrapper, AzureStreamingFrameWrapper, AzureComputeMultiFrameWrapper, and AzureStreamingMultiFrameWrapper
  • dl-init --with-azure scaffold integration
  • a managed .amlignore block that preserves user content while excluding common local-only outputs from Azure submissions
  • Azure job output routing to outputs/artifacts for automatic artifact persistence in Azure ML

Companion Packages

Documentation

License

MIT. See LICENSE.

Download files

Download the file for your platform.

Source Distribution

deep_learning_azure-0.0.12.tar.gz (229.4 kB)

Uploaded Source

Built Distribution

deep_learning_azure-0.0.12-py3-none-any.whl (44.4 kB)

Uploaded Python 3

File details

Details for the file deep_learning_azure-0.0.12.tar.gz.

File metadata

  • Download URL: deep_learning_azure-0.0.12.tar.gz
  • Upload date:
  • Size: 229.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for deep_learning_azure-0.0.12.tar.gz
Algorithm Hash digest
SHA256 635153e0deec26958cb85799a92039416da95fba27db12e230bcfe915e7b9fd4
MD5 3db563e48abdbcc57660c433b029ec4f
BLAKE2b-256 b9dddedd97328817645b6b108784ea93a178a569bd21e16e0d00ce032fccf436

Provenance

The following attestation bundles were made for deep_learning_azure-0.0.12.tar.gz:

Publisher: publish.yml on Blazkowiz47/dl-azure

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file deep_learning_azure-0.0.12-py3-none-any.whl.

File metadata

File hashes

Hashes for deep_learning_azure-0.0.12-py3-none-any.whl
Algorithm Hash digest
SHA256 6c186bb120b33cf6a6598f4f6aff5f137660f1239699466c6908dabc419ba737
MD5 d9fb230ecf68ad2fc4c8a3955c8470b9
BLAKE2b-256 c284752cfdc79ff55a4afe6a3dd67040c2c2fdc441b0d48ce4aaf545af9b8cbd

Provenance

The following attestation bundles were made for deep_learning_azure-0.0.12-py3-none-any.whl:

Publisher: publish.yml on Blazkowiz47/dl-azure

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
