A high-performance data loader for Python, written in Rust

datago

A data loader written in Rust, usable as a Python module. It handles several data sources, from local files to webdataset, or a VectorDB-focused HTTP stack which is soon to be open sourced. It is focused on image data at the moment, but could easily be made more generic.

Datago handles, outside of the Python GIL:

  • per-sample IO
  • deserialization (jpg and png decompression)
  • some optional vision processing (aligning different image payloads)
  • optional serialization

Samples are exposed in the Python scope as native Python objects, using PIL and NumPy base types. Throughput is network dependent, but GB/s is typical. Depending on the front end, datago can be rank and world-size aware, in which case the samples are dispatched according to their hash.
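As an illustration, this kind of rank-aware dispatch boils down to hashing a stable sample identifier and keeping only the samples that map to the local rank. A minimal Python sketch of the idea (not datago's actual Rust implementation):

import hashlib

def belongs_to_rank(sample_id: str, rank: int, world_size: int) -> bool:
    # Hash a stable identifier, then shard by modulo: each sample lands
    # on exactly one rank, with no coordination needed between workers
    digest = hashlib.sha256(sample_id.encode()).digest()
    return int.from_bytes(digest[:8], "big") % world_size == rank

# Every rank scans the same sample list but keeps a disjoint subset
samples = [f"sample_{i}" for i in range(10)]
mine = [s for s in samples if belongs_to_rank(s, rank=0, world_size=2)]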

[Figure: Datago organization]

Use it

You can simply install datago with pip install datago (or uv pip install datago).

Use the package from Python

Please note that in all of the following cases, you can directly get an IterableDataset (torch-compatible) with the following code snippet:

from dataset import DatagoIterDataset
client_config = {} # See below for examples
datago_dataset = DatagoIterDataset(client_config, return_python_types=True)

return_python_types ensures for instance that images are returned as PIL.Image objects; datago being an external binary module should stay transparent.
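Because DatagoIterDataset is a torch-compatible IterableDataset, it can be fed straight into a standard torch DataLoader. A hedged sketch, assuming torch is installed and reusing the file-source config documented below:

from torch.utils.data import DataLoader
from dataset import DatagoIterDataset

# Example config; see the "Local files" section below for the fields
client_config = {
    "source_type": "file",
    "source_config": {"root_path": "myPath", "rank": 0, "world_size": 1},
    "samples_buffer_size": 32,
}

dataset = DatagoIterDataset(client_config, return_python_types=True)

# batch_size=None passes samples through one by one; datago already does
# its own buffering and threading outside of the GIL, hence num_workers=0
loader = DataLoader(dataset, batch_size=None, num_workers=0)
for sample in loader:
    break  # images in the sample are PIL.Image instances here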

Dataroom
from datago import DatagoClient, initialize_logging
import os
import json

# Respects RUST_LOG=INFO env var for setting log level
# If omitted the logger will be initialized when the client starts.
initialize_logging()

config = {
    "source_config": {
        "sources": os.environ.get("DATAROOM_TEST_SOURCE", ""),
        "page_size": 500,
        "rank": 0,
        "world_size": 1,
    },
    "limit": 200,
    "samples_buffer_size": 32,
}

client = DatagoClient(json.dumps(config))

for _ in range(10):
    sample = client.get_sample()

Please note that the image buffers are passed around as raw pointers; see below (we provide Python utils to convert them to PIL types).

Local files

To test datago while serving local files (jpg, png, ...), the code would look like the following. Note that datago serves files with many concurrent threads, which means that even if random_sampling is not set, there will be some randomness in the sample ordering.

from datago import DatagoClient, initialize_logging
import os
import json

# Can also set the log level directly instead of using RUST_LOG env var
initialize_logging(log_level="warn")

config = {
    "source_type": "file",
    "source_config": {
        "root_path": "myPath",
        "random_sampling": False, # True if used directly for training
        "rank": 0,
        "world_size": 1,
    },
    "limit": 200,
    "samples_buffer_size": 32,
}

client = DatagoClient(json.dumps(config))

for _ in range(10):
    sample = client.get_sample()

[experimental] Webdataset

Please note that this implementation is very new and probably still has significant limitations; it has not yet been tested at scale. Please also note that you can find a more complete example in /python/benchmark_webdataset.py, which shows how to convert everything to more pythonic types (PIL images).

from datago import DatagoClient, initialize_logging
import os
import json

# Can also set the log level directly instead of using RUST_LOG env var
initialize_logging(log_level="warn")

# URL of the test bucket
bucket = "https://storage.googleapis.com/webdataset/fake-imagenet"
dataset = "/imagenet-train-{000000..001281}.tar"
url = bucket + dataset

client_config = {
    "source_type": "webdataset",
    "source_config": {
        "url": url,
        "random_sampling": False,
        "max_concurrency": 8, # The number of TarballSamples which should be handled concurrently
        "rank": 0,
        "world_size": 1,
    },
    # Optional pre-processing of the images, placing them in an aspect ratio bucket to preserve as much as possible of the original content
    "image_config": {
        "crop_and_resize": True, # False to turn it off, or just omit this part of the config
        "default_image_size": 1024,
        "downsampling_ratio": 32,
        "min_aspect_ratio": 0.5,
        "max_aspect_ratio": 2.0,
        "pre_encode_images": False,
    },
    "prefetch_buffer_size": 128,
    "samples_buffer_size": 64,
    "limit": 1_000_000, # Dummy example, max number of samples you would like to serve
}

client = DatagoClient(json.dumps(client_config))

for _ in range(10):
    sample = client.get_sample()
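To make the image_config above more concrete: aspect ratio bucketing snaps each image to a width/height pair whose sides are multiples of downsampling_ratio, whose aspect ratio is clamped to [min_aspect_ratio, max_aspect_ratio], and whose area stays close to default_image_size squared. A rough sketch of that computation, as an illustration of the general technique rather than datago's exact algorithm:

def bucket_size(width, height, default_image_size=1024, downsampling_ratio=32,
                min_aspect_ratio=0.5, max_aspect_ratio=2.0):
    def snap(x):
        # Round to the nearest multiple of the downsampling ratio
        return max(downsampling_ratio, round(x / downsampling_ratio) * downsampling_ratio)

    # Clamp the aspect ratio to the configured range
    ar = max(min_aspect_ratio, min(max_aspect_ratio, width / height))
    # Target an area of default_image_size**2 while preserving the ratio
    target_h = (default_image_size**2 / ar) ** 0.5
    return snap(ar * target_h), snap(target_h)

print(bucket_size(1920, 1080))  # landscape input -> (1376, 768)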

Match the raw exported buffers with typical Python types

See the helper functions provided in raw_types.py; they should be self-explanatory. Check the Python benchmarks for examples. As mentioned above, we also provide a wrapper so that you get a dataset directly.
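For reference, turning a raw 8-bit image buffer into a PIL image is a one-liner with Image.frombuffer. A hypothetical sketch, assuming the sample exposes a tightly packed, row-major buffer along with its width, height and channel count (the actual field names live in raw_types.py):

from PIL import Image

def to_pil(data: bytes, width: int, height: int, channels: int) -> Image.Image:
    # Map the channel count to a PIL mode; assumes 8 bits per channel
    mode = {1: "L", 3: "RGB", 4: "RGBA"}[channels]
    return Image.frombuffer(mode, (width, height), data, "raw", mode, 0, 1)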

Logging

We are using the log crate with env_logger. You can set the log level using the RUST_LOG environment variable. E.g. RUST_LOG=INFO.

When using the library from Python, env_logger is initialized automatically when creating a DatagoClient. There is also an initialize_logging function in the datago module which, if called before using a client, allows you to customize the log level. This only works if RUST_LOG is not set.
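In other words, the log_level argument is only a fallback; RUST_LOG always wins if it is set. A short sketch of that precedence rule:

import os
from datago import initialize_logging

# RUST_LOG takes precedence; clear it so the argument below applies
os.environ.pop("RUST_LOG", None)
initialize_logging(log_level="debug")  # would be ignored if RUST_LOG were set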

Build it

Preamble

Just install the Rust toolchain via rustup.

[Apple Silicon macOS only]

If you are using an Apple Silicon macOS machine, create a .cargo/config file and paste the following:

[target.x86_64-apple-darwin]
rustflags = [
  "-C", "link-arg=-undefined",
  "-C", "link-arg=dynamic_lookup",
]

[target.aarch64-apple-darwin]
rustflags = [
  "-C", "link-arg=-undefined",
  "-C", "link-arg=dynamic_lookup",
]

Build a benchmark CLI

cargo run --release -- -h will give you all the information; it should be fairly straightforward.

Run the rust test suite

From the datago folder

cargo test

Generate the python package binaries manually

Build a wheel useable locally

maturin build -i python3.11 --release --target "x86_64-unknown-linux-gnu"

Build a wheel which can be uploaded to pypi or related

  • either use a manylinux docker image

  • or cross compile using zig

maturin build -i python3.11 --release --target "x86_64-unknown-linux-gnu" --manylinux 2014 --zig

Then you can pip install the resulting wheel from target/wheels.

Update the pypi release (maintainers)

Create a new tag and a new release in this repo; a new package will be pushed automatically.

License

MIT License

Copyright (c) 2025 Photoroom

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
