
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a tabular dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Distributed dataframes for multimodal data

Daft is a distributed query engine for large-scale data processing in Python and is implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage

About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimization accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your laptop's computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source, or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)


Benchmarks

(Chart: benchmark results for TPC-H at scale factor 100.)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft's full range of capabilities, including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.


License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details


Release history

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.3.4.tar.gz (3.6 MB)

Uploaded Source

Built Distributions

getdaft-0.3.4-cp38-abi3-win_amd64.whl (27.9 MB)

Uploaded CPython 3.8+ Windows x86-64

getdaft-0.3.4-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (30.5 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ x86-64

getdaft-0.3.4-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (29.2 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ ARM64

getdaft-0.3.4-cp38-abi3-macosx_11_0_arm64.whl (25.4 MB)

Uploaded CPython 3.8+ macOS 11.0+ ARM64

getdaft-0.3.4-cp38-abi3-macosx_10_12_x86_64.whl (27.5 MB)

Uploaded CPython 3.8+ macOS 10.12+ x86-64

File details

Details for the file getdaft-0.3.4.tar.gz.

File metadata

  • Download URL: getdaft-0.3.4.tar.gz
  • Upload date:
  • Size: 3.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.4.tar.gz

  • SHA256: df31014ffbf5a6441103015ee5659f129e40bab8847f610e280c3a1ecef74150
  • MD5: 1296458b2d206e550a6f29735390fffe
  • BLAKE2b-256: acd038ce29793df926e202711b2a6ddf88f7c6fb0aebd1cd63d698ba28779272

See more details on using hashes here.

File details

Details for the file getdaft-0.3.4-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.3.4-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 27.9 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.4-cp38-abi3-win_amd64.whl

  • SHA256: 6c976e1450de0b9e70c842c8282046ed07822c501478d6444bc9ed65dbe00413
  • MD5: 92cfa71ad9bcc7cb7cc27303aa3af671
  • BLAKE2b-256: 129d3b750e758fd861703d587975d5a6330f7947681cb2ae9195ea935f48cb3d


File details

Details for the file getdaft-0.3.4-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.4-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

  • SHA256: 1898d689736f7113894d50b98130dc43e2aa25941da6832a5c2d1757a0230559
  • MD5: d10746ab53374e4b91843ce80c25e600
  • BLAKE2b-256: 386351d19874197b0816be919eef1f6c5134385adb7cf6d421fb55c5627c6063


File details

Details for the file getdaft-0.3.4-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.4-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl

  • SHA256: 733f350c9162a413b2a1343017eeffc86c7c67036b6c770044a342c496e9f15c
  • MD5: 0aae3876d18a89b4edcabc190c386226
  • BLAKE2b-256: aa5ffc1f8753e2d730f5258868fcaef24c4d1c5e1d9ece8c685dc9b5ea987639


File details

Details for the file getdaft-0.3.4-cp38-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.4-cp38-abi3-macosx_11_0_arm64.whl

  • SHA256: 8ecb2ca95b89e2717166793c291f0b91e6c4bd60343afe9db20aeb069e194a4d
  • MD5: 2ff92dd9e3aa2593e6208fe0b086adde
  • BLAKE2b-256: c66db8ee12774e7f95a505f0666d03b235a72972a0ac9f4f7c95d1fd740c565b


File details

Details for the file getdaft-0.3.4-cp38-abi3-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.4-cp38-abi3-macosx_10_12_x86_64.whl

  • SHA256: 5efbf3ada7c43aacfd80a8b2e40d4f3ff875dea8ae99668f70a1fe705870de13
  • MD5: 4a6a1483d1bcd1bc9541aa661b6d926e
  • BLAKE2b-256: 145bf4859dd3b89dc634b33d63be1169d5b124d52ccbed486666ce7be5478f2b

