deep-learning-azure
Public Azure integration layer for deep-learning-core.
deep-learning-azure adds Azure ML execution, Azure storage helpers, and
Azure-oriented dataset wrappers on top of deep-learning-core.
Install it directly or through the deep-learning-core[azure] extra. The
package is kept separate so Azure-specific dependencies and scaffold wiring do
not leak into plain deep-learning-core installations.
Install
Install from PyPI through the core extra:
pip install "deep-learning-core[azure]"
Install the package directly:
pip install deep-learning-azure
Install in a uv project:
uv add "deep-learning-core[azure]" deep-learning-azure
Scope
- Azure ML executor
- Azure storage helpers and AzCopy wrappers
- Azure dataset wrappers
- Azure experiment scaffold integration through
dl-init --with-azure
Out Of Scope
- Generic trainer, dataset, and metric abstractions
- Public framework defaults
- Concrete experiment repositories
Quick Start
Install it into an experiment repository through the Azure extra:
uv add "deep-learning-core[azure]" deep-learning-azure
If the repository was scaffolded with dl-init --with-azure, the
experiment package will import dl_azure automatically so its executor
and generic dataset wrappers register at runtime, and the scaffold will also
create azure-config.json.
The Azure executor is sweep-oriented. Use
uv run dl-sweep experiments/lr_sweep.yaml --dry-run before the first real
submission in a new repository.
Concrete experiment flow:
uv init
uv add deep-learning-azure
uv run dl-init --root-dir . --with-azure
uv run dl-core add dataset AzureSeq --base azure_compute_multiframe
uv run dl-sweep experiments/lr_sweep.yaml --dry-run
Tracker naming defaults to the repository root name. If you want Azure job
submission and Azure MLflow to use a different destination name, set
tracking.experiment_name in your sweep config.
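For example, a sweep-config fragment overriding the destination name might look like this (only tracking.experiment_name is documented here; the surrounding layout is illustrative):

```yaml
# Only tracking.experiment_name is a documented key; use your repository's
# actual sweep-config layout around it.
tracking:
  experiment_name: face-eer-sweeps
```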
Azure submissions automatically rewrite the default local runtime.output_dir
from artifacts to outputs/artifacts inside the remote job. That keeps
checkpoints, plots, metrics, and other run files under Azure ML's managed
output directory without changing the local default artifact layout.
When you analyze an Azure-backed sweep with dl-analyze, the Azure metrics
source fetches only the metric histories requested on the CLI, for example:
uv run dl-analyze --sweep experiments/lr_sweep.yaml \
--metric test/eer --mode min \
--metric test/accuracy --mode max \
--rank-method rank-sum
Those fetched metric histories are cached in analysis_cache.json next to
sweep_tracking.json. Use --force to refresh them.
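A sketch of that cache-or-fetch behaviour, assuming a flat JSON cache keyed by run and metric (the real cache layout inside analysis_cache.json is not documented):

```python
import json
import os


def load_metric_history(cache_path, run_id, metric, fetch, force=False):
    """Return the metric history for (run_id, metric), fetching from Azure
    only on a cache miss or when force=True. `fetch` stands in for the
    Azure metrics source; the flat key scheme is an assumption."""
    cache = {}
    if os.path.exists(cache_path) and not force:
        with open(cache_path) as f:
            cache = json.load(f)
    key = f"{run_id}:{metric}"
    if key not in cache:
        cache[key] = fetch(run_id, metric)  # hits Azure only on a miss
        with open(cache_path, "w") as f:
            json.dump(cache, f)
    return cache[key]
```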
If you want the tracked Azure job outputs locally after the sweep finishes, run:
uv run dl-sync --sweep experiments/lr_sweep.yaml --artifacts
That downloads the Azure job bundle for each tracked run and patches
sweep_tracking.json with the resolved local artifact paths.
Concrete dataset scaffold examples:
uv run dl-core add dataset AzureImages --base azure_compute
uv run dl-core add dataset AzureFrames --base azure_compute_frame
uv run dl-core add dataset AzureSeq --base azure_compute_multiframe
uv run dl-core add dataset AzureStream --base azure_streaming
uv run dl-core add dataset AzureStreamSeq --base azure_streaming_multiframe
Dataset Wrapper Notes
Use the compute wrappers when the dataset is already mounted into the Azure ML job or available locally through a compatible directory layout:
- AzureComputeWrapper
- AzureComputeFrameWrapper
- AzureComputeMultiFrameWrapper
Compute wrappers resolve the dataset root in this order:
1. dataset.root_dir
2. the AZURE_ML_INPUT_<input_name> environment variable
3. dataset.local_fallback_root, when dataset.allow_local_fallback is true
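That resolution order can be sketched as follows. The helper name and the plain-dict config access are assumptions; the compute wrappers implement this internally:

```python
import os


def resolve_dataset_root(cfg: dict, env=os.environ) -> str:
    """Sketch of the documented root resolution order for compute wrappers."""
    if cfg.get("root_dir"):
        return cfg["root_dir"]  # 1. explicit dataset.root_dir
    mounted = env.get(f"AZURE_ML_INPUT_{cfg['input_name']}")
    if mounted:
        return mounted  # 2. Azure ML input mount
    if cfg.get("allow_local_fallback") and cfg.get("local_fallback_root"):
        return cfg["local_fallback_root"]  # 3. opt-in local fallback
    raise RuntimeError("could not resolve dataset root")
```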
Use the streaming wrappers when you want to read directly from blob storage instead of relying on an Azure ML input mount:
- AzureStreamingWrapper
- AzureStreamingFrameWrapper
- AzureStreamingMultiFrameWrapper
Streaming wrappers require dataset.container_name and an Azure storage config
that provides account_name, either in azure-config.json or inline in the
dataset config.
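A hypothetical dataset-config fragment for the inline variant (the storage key name and nesting are assumptions, not a documented schema):

```yaml
dataset:
  name: AzureStream
  container_name: my-container       # required by streaming wrappers
  storage:                           # inline storage config; key name assumed
    account_name: mystorageaccount   # otherwise read from azure-config.json
```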
Frame wrappers share a few image-specific settings:
- height/width for the output tensor shape
- resize_height/resize_width for pre-augmentation resizing
- use_face_detection to enable metadata-driven face crops
- margin as an int, two-item sequence, or {height, width} mapping
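An illustrative frame-wrapper config using those settings (values are examples only):

```yaml
dataset:
  name: AzureFrames
  height: 224
  width: 224
  resize_height: 256                # pre-augmentation resize
  resize_width: 256
  use_face_detection: true
  margin: {height: 20, width: 10}   # also accepts an int or a two-item sequence
```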
If you enable face_detected_and_resized_cache, processed frame images are
stored in the wrapper cache when a cache backend is available. That is most
useful for the streaming frame wrappers, where blob reads can be cached locally.
Multiframe wrappers add one multiframe block:
dataset:
name: AzureSeq
input_name: dataset_path
allow_local_fallback: true
local_fallback_root: data/my_dataset
height: 224
width: 224
use_face_detection: true
face_detected_and_resized_cache: true
multiframe:
mode: consecutive
num_frames: 5
frame_stride: 2
multiframe.mode: random draws num_frames unique frames per sample.
multiframe.mode: consecutive walks each video in fixed windows and uses
frame_stride to skip frames between windows. Videos with fewer than
num_frames frames are skipped.
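One plausible reading of the consecutive mode, as a sketch (the actual frame selection lives inside the multiframe wrappers; the function name is hypothetical):

```python
def consecutive_windows(frame_ids, num_frames, frame_stride):
    """Walk a video in fixed windows of num_frames consecutive frames,
    skipping frame_stride frames between windows. Videos with fewer than
    num_frames frames yield no windows and are skipped entirely."""
    if len(frame_ids) < num_frames:
        return []
    windows = []
    start = 0
    while start + num_frames <= len(frame_ids):
        windows.append(frame_ids[start:start + num_frames])
        start += num_frames + frame_stride  # skip frame_stride between windows
    return windows
```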
What You Get
- the azure executor
- Azure storage helpers and AzCopy wrappers
- generic Azure dataset foundations: AzureComputeWrapper, AzureStreamingWrapper, AzureComputeFrameWrapper, AzureStreamingFrameWrapper, AzureComputeMultiFrameWrapper, and AzureStreamingMultiFrameWrapper
- dl-init --with-azure scaffold integration
- a managed .amlignore block that preserves user content while excluding common local-only outputs from Azure submissions
- Azure job output routing to outputs/artifacts for automatic artifact persistence in Azure ML
License
MIT. See LICENSE.