
annbatch

[!CAUTION] This package does not have a stable API. However, we do not anticipate changing the on-disk format in a fully incompatible manner. Small changes to how we store the shuffled data may occur, but you will always be able to load your data somehow, i.e., changes will never be fully breaking. We will always provide lower-level APIs that make this guarantee possible.


A data loader and I/O utilities for minibatching on-disk AnnData, co-developed by Lamin Labs and scverse.

Getting started

Please refer to the documentation, in particular, the API documentation.

Installation

You need to have Python 3.12 or newer installed on your system. If you don't have Python installed, we recommend installing uv.

To install the latest release of annbatch from PyPI:

```
pip install annbatch
```

We provide extras for `torch`, `cupy-cuda12`, `cupy-cuda13`, and `zarrs-python`. The `cupy` extras provide accelerated handling of the data via `preload_to_gpu` once it has been read off disk, and they do not need to be used in conjunction with `torch`.

[!IMPORTANT] `zarrs-python` provides the performance boost necessary for the sharded data produced by our preprocessing functions to be usable when loading data off a local filesystem.
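For example, to install with sharding and GPU support in one step (assuming the extra names listed above, and picking the `cupy` extra that matches your CUDA version):

```
pip install "annbatch[zarrs,cupy-cuda12]"
```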

Detailed tutorial

For a detailed tutorial, please see the in-depth section of our docs.

Basic usage example

Basic preprocessing:

```python
from annbatch import DatasetCollection

import zarr

# Using zarrs is necessary for local filesystem performance.
# Ensure you installed it via our `[zarrs]` extra, i.e., `pip install annbatch[zarrs]`, to get a compatible version.
zarr.config.set(
    {"codec_pipeline.path": "zarrs.ZarrsCodecPipeline"}
)

# Create a collection at the given path. The subgroups will all be anndata stores.
collection = DatasetCollection("path/to/output/collection.zarr")
collection.add_adatas(
    adata_paths=[
        "path/to/your/file1.h5ad",
        "path/to/your/file2.h5ad"
    ],
    shuffle=True,  # shuffling is required for chunked access; True is the default
)
```

Data loading:

[!IMPORTANT] Without custom loading via `Loader.load_adata`, all obs columns will be loaded and yielded, potentially degrading performance.

```python
from annbatch import Loader
import anndata as ad
import zarr

# Using zarrs is necessary for local filesystem performance.
# Ensure you installed it via our `[zarrs]` extra, i.e., `pip install annbatch[zarrs]`, to get a compatible version.
zarr.config.set(
    {"codec_pipeline.path": "zarrs.ZarrsCodecPipeline"}
)

# WARNING: Without custom loading, *all* obs columns will be loaded and yielded,
# potentially degrading performance.
def custom_load_func(g: zarr.Group) -> ad.AnnData:
    # `some_subset_of_columns_useful_for_training` is a placeholder for your list of obs columns.
    return ad.AnnData(
        X=ad.io.sparse_dataset(g["layers"]["counts"]),
        obs=ad.io.read_elem(g["obs"])[some_subset_of_columns_useful_for_training],
    )

# A non-empty collection
collection = DatasetCollection("path/to/output/collection.zarr")
# This settings override ensures that you don't lose or alter your categorical codes when reading the data in!
with ad.settings.override(remove_unused_categories=False):
    ds = Loader(
        batch_size=4096,      # rows per yielded minibatch
        chunk_size=32,        # contiguous rows read per on-disk chunk
        preload_nchunks=256,  # chunks to preload (and shuffle) in memory
    )
    # `use_collection` automatically uses the on-disk `X` and full `obs` in the `Loader`
    # but the `load_adata` arg can override this behavior
    # (see `custom_load_func` above for an example of customization).
    ds = ds.use_collection(collection, load_adata=custom_load_func)

# Iterate over the loader (a drop-in replacement for torch.utils.data.DataLoader)
for batch in ds:
    data, obs = batch["X"], batch["obs"]
```

[!IMPORTANT] For usage of our loader inside of torch, please see this note for more info. At a minimum, be aware that deadlocking will occur on Linux unless you pass `multiprocessing_context="spawn"` to the `torch.utils.data.DataLoader` class.
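As a minimal sketch of what that looks like (assuming `ds` from the example above can be passed directly to `DataLoader`; `num_workers=2` is an arbitrary illustrative choice):

```python
import torch

torch_loader = torch.utils.data.DataLoader(
    ds,
    batch_size=None,  # annbatch already yields full minibatches
    num_workers=2,
    multiprocessing_context="spawn",  # avoids deadlocks on Linux
)

for batch in torch_loader:
    data, obs = batch["X"], batch["obs"]
```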

Release notes

See the changelog.

Contact

For questions and help requests, you can reach out in the scverse discourse. If you found a bug, please use the issue tracker.
