
# Tensorstore Extension for AnnData 

This extension provides support for reading and writing [AnnData](https://anndata.readthedocs.io/en/latest/) objects using [Tensorstore](https://google.github.io/tensorstore/).

## Design Goals

- **The challenge**: Large single-cell AnnData objects are often too large to fit in [RAM](https://en.wikipedia.org/wiki/Random-access_memory) and are unwieldy to save to and load from disk.
- **The solution**: [Tensorstore](https://google.github.io/tensorstore/) provides a way to read and write data in a variety of formats, including [Zarr](https://zarr.dev/). This extension uses it to read specific rows (cells) and columns (genes) of an AnnData object from a Zarr store, so **you do not need to load the entire AnnData object into memory to access a subset of the data** (a minimal sketch follows this list).
- **Caveats**: This extension is still in development and may not support all features of AnnData objects. It is also not optimized for read/write speed.
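
A minimal sketch of the difference, using placeholder paths and the `ats.load` options documented under Usage below:

```python
import anndata
import anndata_tensorstore as ats

# Conventional route: read_h5ad loads the whole object into RAM,
# even when only a small subset is needed afterwards.
adata_full = anndata.read_h5ad("path/to/large_anndata.h5ad")
subset = adata_full[:1000, :50]

# Tensorstore route: only the requested rows (cells) and
# columns (genes) are read from the .ats store.
adata_subset = ats.load(
    "path/to/large_anndata.ats",
    obs_indices=slice(0, 1000),
    var_indices=slice(0, 50),
)
```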



## Installation

```bash
pip install anndata-tensorstore
```

## Usage

### Writing an AnnData object to a Tensorstore

```python
import anndata
import anndata_tensorstore as ats

# Read an existing AnnData object into memory (note: do not name the
# variable "anndata", which would shadow the module)
adata = anndata.read_h5ad("path/to/large_anndata.h5ad")

# Write it to a Tensorstore-backed (.ats) directory
ats.save(adata, "path/to/large_anndata.ats", is_raw_count=True)
```
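
If you want to try the writer without a large `.h5ad` file at hand, a self-contained sketch with a small in-memory AnnData (the toy data and paths here are hypothetical) could look like this:

```python
import numpy as np
import anndata
import anndata_tensorstore as ats

# Build a toy AnnData: 100 cells x 50 genes of simulated counts.
adata = anndata.AnnData(X=np.random.poisson(1.0, size=(100, 50)).astype(np.float32))
adata.obs_names = [f"barcode{i}" for i in range(adata.n_obs)]
adata.var_names = [f"gene{i}" for i in range(adata.n_vars)]

# Write it to a Tensorstore-backed (.ats) directory, as above.
ats.save(adata, "path/to/toy_anndata.ats", is_raw_count=True)
```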


### Reading an AnnData object from a Tensorstore

```python
import os
import pandas as pd
import anndata_tensorstore as ats

# Load the entire object from the store
adata = ats.load("path/to/large_anndata.ats")

# Load the anndata object from the storage, specifying the rows and columns to load
var = pd.read_parquet(os.path.join("path/to/large_anndata.ats", ats.ATS_FILE_NAME.var))
obs = pd.read_parquet(os.path.join("path/to/large_anndata.ats", ats.ATS_FILE_NAME.obs))

# Option 1: select rows and columns with a slice object or a boolean array
adata = ats.load(
    "path/to/large_anndata.ats",
    obs_indices=slice(0, 1000),                      # slice object
    var_indices=var.index.isin(["gene1", "gene2"])   # boolean array
)

# Option 2: select rows and columns by the index names of the obs and
# var dataframes
adata = ats.load(
    "path/to/large_anndata.ats",
    obs_names=["barcode1", "barcode2"],
    var_names=["gene1", "gene2"]
)

# Option 3: select rows and columns by lists of integer indices
adata = ats.load(
    "path/to/large_anndata.ats",
    obs_indices=[0, 1],
    var_indices=[0, 1]
)

# Option 4: select rows and columns by column-value selections: each is
# a list of tuples whose first element is a column name and whose second
# element is a list of values to match
adata = ats.load(
    "path/to/large_anndata.ats",
    obs_selection=[("obs_column_name", ["cell_type_1", "cell_type_2"])],
    var_selection=[("var_column_name", ["gene1", "gene2"])]
)
```
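
Because `obs_indices` and `var_indices` accept boolean arrays, the metadata tables read above can drive arbitrary filters. A sketch, assuming hypothetical columns named `cell_type` in `obs` and `highly_variable` in `var`:

```python
import os
import pandas as pd
import anndata_tensorstore as ats

store = "path/to/large_anndata.ats"

# Read only the metadata tables (cheap compared to the matrix).
obs = pd.read_parquet(os.path.join(store, ats.ATS_FILE_NAME.obs))
var = pd.read_parquet(os.path.join(store, ats.ATS_FILE_NAME.var))

# Build boolean masks from the hypothetical metadata columns.
obs_mask = (obs["cell_type"] == "T cell").to_numpy()
var_mask = var["highly_variable"].to_numpy()

# Load only the matching rows and columns into memory.
adata = ats.load(store, obs_indices=obs_mask, var_indices=var_mask)
```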

## Development and Future Work

- [ ] reduce storage size
- [ ] support more AnnData features
- [ ] support more Tensorstore features
