Project description

Azula - Diffusion models in PyTorch

Azula is a Python package that implements diffusion models in PyTorch. Its goal is to unify the different formalisms and notations of the generative diffusion models literature into a single, convenient and hackable interface.

In the Avatar cartoon, Azula is a powerful fire and lightning bender ⚡️

Installation

The azula package is available on PyPI, which means it is installable via pip.

pip install azula

Alternatively, if you need the latest features, you can install it from the repository.

pip install git+https://github.com/probabilists/azula

Getting started

In Azula's formalism, a diffusion model is the composition of three elements: a noise schedule, a denoiser and a sampler.

  • A noise schedule is a mapping from a time $t \in [0, 1]$ to the signal scale $\alpha_t$ and the noise scale $\sigma_t$ in a perturbation kernel $p(X_t \mid X) = \mathcal{N}(X_t \mid \alpha_t X, \sigma_t^2 I)$ from a "clean" random variable $X \sim p(X)$ to a "noisy" random variable $X_t$ (see the sketch after this list).

  • A denoiser is a neural network trained to predict $X$ given $X_t$.

  • A sampler defines a series of transition kernels $q(X_s \mid X_t)$ from $t$ to $s < t$ based on a noise schedule and a denoiser. Simulating these transitions from $t = 1$ to $0$ samples approximately from $p(X)$.
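
Concretely, the perturbation kernel amounts to scaling a clean sample and adding Gaussian noise. The sketch below illustrates this with plain PyTorch; the function name and the way the scales are passed are illustrative rather than part of Azula's API.

import torch

def perturb(x: torch.Tensor, alpha_t: torch.Tensor, sigma_t: torch.Tensor) -> torch.Tensor:
    """Draw x_t ~ N(alpha_t * x, sigma_t^2 I) given the schedule scales at time t."""
    return alpha_t * x + sigma_t * torch.randn_like(x)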

This formalism is closely followed by Azula's API.

import torch

from azula.denoise import PreconditionedDenoiser
from azula.noise import VPSchedule
from azula.sample import DDPMSampler

# Choose the variance preserving (VP) noise schedule
schedule = VPSchedule()

# Initialize a denoiser around a user-defined backbone network
denoiser = PreconditionedDenoiser(
    backbone=CustomNN(in_features=5, out_features=5),
    schedule=schedule,
)

# Train the denoiser to predict x given x_t
optimizer = torch.optim.Adam(denoiser.parameters(), lr=1e-3)

for x in train_loader:
    t = torch.rand((x.shape[0],))  # one random time per sample in the batch

    loss = denoiser.loss(x, t).mean()
    loss.backward()

    optimizer.step()
    optimizer.zero_grad()

# Generate 64 points in 1000 steps
sampler = DDPMSampler(denoiser.eval(), steps=1000)

x1 = sampler.init((64, 5))
x0 = sampler(x1)
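
In the snippet above, CustomNN stands in for a user-defined backbone network and train_loader for a standard PyTorch data loader. A minimal sketch of such a backbone is given below; it assumes the backbone is called with the noisy sample and the time, which may need to be adapted to the exact calling convention expected by PreconditionedDenoiser (see Azula's documentation).

import torch
import torch.nn as nn

class CustomNN(nn.Module):
    """Illustrative MLP backbone, assumed to receive the noisy sample x_t and the time t."""

    def __init__(self, in_features: int, out_features: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features + 1, hidden),
            nn.SiLU(),
            nn.Linear(hidden, hidden),
            nn.SiLU(),
            nn.Linear(hidden, out_features),
        )

    def forward(self, x_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # Append the time as an extra input feature (a simple conditioning choice).
        return self.net(torch.cat((x_t, t.unsqueeze(-1)), dim=-1))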

Alternatively, Azula's plugin interface allows you to load pre-trained models and use them through the same convenient interface.

import sys
import torch

sys.path.append("path/to/guided-diffusion")

from azula.plugins import adm
from azula.sample import DDIMSampler

# Download weights from openai/guided-diffusion
denoiser = adm.load_model("imagenet_256x256")

# Generate a batch of 4 images
sampler = DDIMSampler(denoiser, steps=64).cuda()

x1 = sampler.init((4, 3 * 256 * 256))
x0 = sampler(x1)

# Map the samples from [-1, 1] to [0, 1] and reshape them as 256x256 RGB images
images = torch.clip((x0 + 1) / 2, min=0, max=1).reshape(4, 3, 256, 256)
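
To inspect the result, the generated batch can be written to disk, for instance with torchvision. This is a usage sketch rather than part of the original example; it assumes torchvision is installed and uses an arbitrary file name.

from torchvision.utils import save_image

# Save the four generated images as a 2x2 grid
save_image(images, "samples.png", nrow=2)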

For more information, check out the documentation and tutorials at azula.readthedocs.io.

Contributing

If you have a question, an issue or would like to contribute, please read our contributing guidelines.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

azula-0.2.4.tar.gz (28.1 kB)

Built Distribution

azula-0.2.4-py3-none-any.whl (32.9 kB)

File details

Details for the file azula-0.2.4.tar.gz.

File metadata

  • Download URL: azula-0.2.4.tar.gz
  • Upload date:
  • Size: 28.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.14

File hashes

Hashes for azula-0.2.4.tar.gz

  • SHA256: 02cd241c7f68ef88eb91fe2c16be66f6ced5ff42ff2b4156192e8cc3da8ea2ce
  • MD5: 9eed71d524d4e9aa779357cb26f73c7e
  • BLAKE2b-256: 8d2447c557ed2eb63ef4bfea61c5786fbdb8b64c3c655b49ed291e7694105004

See more details on using hashes here.
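
If you download the archive manually, you can verify its integrity against the SHA256 digest listed above. The sketch below uses Python's standard hashlib and assumes the archive sits in the current directory.

import hashlib

def sha256sum(path: str) -> str:
    """Compute the SHA256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as file:
        for chunk in iter(lambda: file.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

expected = "02cd241c7f68ef88eb91fe2c16be66f6ced5ff42ff2b4156192e8cc3da8ea2ce"
assert sha256sum("azula-0.2.4.tar.gz") == expected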

File details

Details for the file azula-0.2.4-py3-none-any.whl.

File metadata

  • Download URL: azula-0.2.4-py3-none-any.whl
  • Upload date:
  • Size: 32.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.14

File hashes

Hashes for azula-0.2.4-py3-none-any.whl

  • SHA256: 806cee6f822fbd4091243f62e0092f416dde84d9059e270157fcc3a0e611a3f9
  • MD5: 5ee9d1407afb764386ae25975a7c39e5
  • BLAKE2b-256: 8a6bc261d139ed57c9825c2d9a240d3438858c4e40a64852ecc8e47d57696053

See more details on using hashes here.
