Equiareal batch sampler

Standard practice in deep learning is to train models on batches of data, keeping the number of samples in a batch (the "batch size") constant. However, what you really need is a constant memory footprint per batch, e.g. to match it to the memory of your GPU. If your samples have different sizes (common with text and time series data), a constant batch size leads to a highly variable memory footprint. This package provides a batch sampler that instead keeps the "batch area" constant: the sum of the lengths of the samples in a batch. Batch area largely determines a batch's memory footprint, although padding increases it slightly.
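
Conceptually, the sampler packs greedily: it keeps adding samples to the current batch until the next one would exceed either limit, then starts a new batch. A minimal sketch of that idea (a simplification for illustration, not the package's actual implementation):

# Greedy "equiareal" packing sketch (illustrative only): close the current
# batch as soon as adding the next sample would exceed either limit.
def equiareal_batches(lengths, max_size, max_footprint):
    batch, area = [], 0
    for ix, length in enumerate(lengths):
        if batch and (len(batch) == max_size or area + length > max_footprint):
            yield batch
            batch, area = [], 0
        batch.append(ix)
        area += length
    if batch:
        yield batch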

Installation

pip install equibatch

Usage

from equibatch import EquiarealBatchSampler

data = [
    'London',
    'Birmingham',
    'Glasgow',
    'Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogoch',
    'Liverpool',
    'Bristol',
    'Manchester'
]

batch_sampler = EquiarealBatchSampler(
    sampler=range(len(data)),              # the order in which the dataset is traversed
    len_checker=lambda ix: len(data[ix]),  # the "footprint" of a sample
    max_size=10,                           # maximum number of samples in a batch
    max_footprint=60,                      # maximum cumulative footprint of a batch
)

for batch in batch_sampler:
    samples = [data[ix] for ix in batch]
    print(samples)

This will print

['London', 'Birmingham', 'Glasgow']
['Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogoch']
['Liverpool', 'Bristol', 'Manchester']
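
These batches follow directly from the footprints: 'London' (6), 'Birmingham' (10) and 'Glasgow' (7) sum to 23, and adding the 58-character Welsh name would push the batch area past max_footprint = 60, so it starts a fresh batch; it then occupies a batch on its own, and the remaining three names (9 + 7 + 10 = 26) form the last batch.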

PyTorch support

If you have PyTorch installed, EquiarealBatchSampler will be a subclass of torch.utils.data.sampler.BatchSampler, and you can use it in your torch.utils.data.DataLoader like this:

from torch.utils.data import DataLoader

dataloader = DataLoader(
    dataset=data,
    batch_sampler=batch_sampler,
)

for batch in dataloader:
    optimizer.zero_grad()  # reset gradients from the previous step
    output = model(batch)
    loss = loss_fn(output)
    loss.backward()
    optimizer.step()
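
Because the samples have different lengths, the DataLoader will typically also need a collate_fn that pads each batch into a tensor. A minimal sketch, assuming a character-code encoding chosen purely for illustration:

import torch
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader

def collate(batch):
    # Encode each string as a tensor of character codes, then pad to the
    # longest sample in the batch; this padding is the small overhead that
    # makes a batch's memory footprint slightly larger than its area.
    tensors = [torch.tensor([ord(c) for c in s]) for s in batch]
    return pad_sequence(tensors, batch_first=True, padding_value=0)

dataloader = DataLoader(
    dataset=data,
    batch_sampler=batch_sampler,
    collate_fn=collate,
)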

Alternatives

If you are using torchtext, similar results can be achieved with the batch_size_fn parameter of torchtext.data.Iterator.
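
A rough sketch of that approach, assuming the legacy torchtext API (torchtext.data.Iterator has been removed from recent releases); the batch_size_fn signature and the .text field below are assumptions based on that legacy interface:

# Hypothetical sketch against the legacy torchtext API: batch_size_fn
# receives the next example, the current sample count, and the running
# "size so far", and returns the new running size; torchtext closes the
# batch once this reaches batch_size. Using cumulative length as the
# size turns batch_size into a footprint budget rather than a sample count.
def batch_size_fn(new_example, count, size_so_far):
    return size_so_far + len(new_example.text)  # .text is an assumed field

iterator = torchtext.data.Iterator(
    dataset,
    batch_size=60,  # interpreted here as a maximum batch area
    batch_size_fn=batch_size_fn,
)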
