Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a tabular dataframe structure for easy querying.

Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage

About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimization accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.
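The lazy, optimize-then-execute model behind principles 1 and 2 can be illustrated with a minimal pure-Python sketch. This is not Daft's actual internals; it only shows the idea that operations are recorded into a plan and executed (after a chance to optimize) only when results are requested:

```python
# Minimal sketch of lazy evaluation (illustrative only; not Daft's real internals).
# Operations are recorded into a plan; nothing runs until .collect(), which is
# where a real engine can rewrite the whole query before executing it.

class LazyFrame:
    def __init__(self, rows, plan=None):
        self.rows = rows
        self.plan = plan or []  # list of (op_name, fn) pairs

    def where(self, predicate):
        # Record a filter step; do not execute yet.
        return LazyFrame(self.rows, self.plan + [("filter", predicate)])

    def select(self, fn):
        # Record a map step; do not execute yet.
        return LazyFrame(self.rows, self.plan + [("map", fn)])

    def collect(self):
        # A real engine would optimize self.plan here before executing.
        out = self.rows
        for op, fn in self.plan:
            out = [fn(r) for r in out] if op == "map" else [r for r in out if fn(r)]
        return out

df = LazyFrame([1, 2, 3, 4])
df = df.where(lambda x: x % 2 == 0).select(lambda x: x * 10)
print(df.collect())  # [20, 40]
```

Because the whole plan is visible before execution, an optimizer can, for example, push filters ahead of expensive maps, which is the kind of rewrite Daft's query optimizer performs on real queries.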

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.
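Optional integrations are typically pulled in via pip extras. A sketch of what that looks like (the extra names below are assumptions; check the Installation Guide for the exact list):

```shell
# Basic installation
pip install getdaft

# With optional extras (names assumed; see the Installation Guide)
pip install "getdaft[ray]"   # distributed execution on Ray
pip install "getdaft[aws]"   # AWS utilities such as S3 I/O helpers
```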

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from an AWS S3 bucket’s URLs and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image into 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmark results for TPC-H at scale factor 100 (SF100)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
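Because the session ID is generated when Daft is imported, the variable must be set before the import happens, either in the shell or at the top of the script. A minimal sketch:

```python
# Disable Daft's telemetry for this process.
# The variable must be set BEFORE daft is imported, since the
# analytics session ID is generated at import time.
import os

os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

# import daft  # imported afterwards, with telemetry disabled

print(os.environ["DAFT_ANALYTICS_ENABLED"])  # 0
```

Setting it in the shell (`export DAFT_ANALYTICS_ENABLED=0`) before launching Python achieves the same thing for every script in that session.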

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details


Release history

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.3.1.tar.gz (3.6 MB, Source)

Built Distributions

getdaft-0.3.1-cp38-abi3-win_amd64.whl (26.7 MB, CPython 3.8+, Windows x86-64)

getdaft-0.3.1-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (29.3 MB, CPython 3.8+, manylinux: glibc 2.17+ x86-64)

getdaft-0.3.1-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (28.0 MB, CPython 3.8+, manylinux: glibc 2.17+ ARM64)

getdaft-0.3.1-cp38-abi3-macosx_11_0_arm64.whl (24.4 MB, CPython 3.8+, macOS 11.0+ ARM64)

getdaft-0.3.1-cp38-abi3-macosx_10_12_x86_64.whl (26.4 MB, CPython 3.8+, macOS 10.12+ x86-64)

File details

Details for the file getdaft-0.3.1.tar.gz.

File metadata

  • Download URL: getdaft-0.3.1.tar.gz
  • Upload date:
  • Size: 3.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for getdaft-0.3.1.tar.gz
SHA256: 3e6d4441927979c20737b41beb6f8f108282d8c585e3ad1fcc1c13c00a541c30
MD5: cc067d5c884ca96fd1b44059522ba902
BLAKE2b-256: 375229deb9324c8be756d1fb8cbcd685ceb92d8a86927c2825f7fb7a1a2420b8

See more details on using hashes here.
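A downloaded file can be checked against a published digest using only Python's standard library. A sketch with a streaming SHA256 helper (the commented usage line reuses the tar.gz digest from the table above):

```python
# Verify a downloaded file against a published SHA256 digest (stdlib only).
import hashlib


def sha256_of(path: str) -> str:
    """Stream a file through SHA256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):  # 64 KiB chunks
            h.update(chunk)
    return h.hexdigest()


# Usage against the release file (digest from the table above):
# assert sha256_of("getdaft-0.3.1.tar.gz") == (
#     "3e6d4441927979c20737b41beb6f8f108282d8c585e3ad1fcc1c13c00a541c30"
# )

# Self-contained demo on a temporary file:
import os
import tempfile

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello")
    tmp = f.name
print(sha256_of(tmp) == hashlib.sha256(b"hello").hexdigest())  # True
os.remove(tmp)
```

Streaming in chunks keeps memory flat even for the multi-megabyte wheels listed below; `pip` performs an equivalent check automatically when hashes are pinned in a requirements file.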

File details

Details for the file getdaft-0.3.1-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.3.1-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 26.7 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for getdaft-0.3.1-cp38-abi3-win_amd64.whl
SHA256: 8b41e6ac7c36cb86cc2965a99e49b5e22ab9a4e61172b144c67cef562c8269a5
MD5: 95970044c87d19a3255bcb9f65c4091e
BLAKE2b-256: 8b4f3596b76b83c529682947cece2a104f3f5fdd2680cf21a8abfb63b1e0feb5

See more details on using hashes here.

File details

Details for the file getdaft-0.3.1-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File hashes

Hashes for getdaft-0.3.1-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
SHA256: 30b044102d30b633fe081545e83171f2403c94f0cefdc7cbfc636b0070b80ddc
MD5: 5a1ff52fc2e8899931800af5d0cd7ea8
BLAKE2b-256: 24d51ae74d130f2e673b08353f25d673e463d5860af7532074ce6f5077240af9

See more details on using hashes here.

File details

Details for the file getdaft-0.3.1-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File hashes

Hashes for getdaft-0.3.1-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
SHA256: 1bcbc62864922380c5bede3ccd6b472e1d9831037df59376dc158a89400f8f39
MD5: 2f2ebbde1606deed1579afa42ab961ad
BLAKE2b-256: c8d22dbc482b1c60e42e75ee289d45c4293604d8c964248936661972a9027623

See more details on using hashes here.

File details

Details for the file getdaft-0.3.1-cp38-abi3-macosx_11_0_arm64.whl.

File hashes

Hashes for getdaft-0.3.1-cp38-abi3-macosx_11_0_arm64.whl
SHA256: a8123206967e394b9098bb9f94e30074f631a97ddfe3ec5215a13d9a7c0d7046
MD5: bed62588bc9bfeba3e5a4e2351ece9f7
BLAKE2b-256: c4fb996170cbc613ef4fc1b121128de3d30d0d606bdb253705fd94bb0d6f799d

See more details on using hashes here.

File details

Details for the file getdaft-0.3.1-cp38-abi3-macosx_10_12_x86_64.whl.

File hashes

Hashes for getdaft-0.3.1-cp38-abi3-macosx_10_12_x86_64.whl
SHA256: 9fd346e504426ee0b1b1a0a148859eba0d6c88111560792c182c20535aa0a76a
MD5: 1239bb1f260d3fc162fb569cc33a053c
BLAKE2b-256: 9723ce40193278a35f5366040635d961d8ee1a9f89e36a3b765d879a893dcc47

See more details on using hashes here.
