
Distributed Dataframes for Multimodal Data

Reason this release was yanked:

daft.context.set_runner_ray is broken, which leads to a bad experience when using Daft on Ray.

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Unified Engine for Data Analytics, Engineering & ML/AI

Daft is a distributed query engine for large-scale data processing using Python or SQL, implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration, or SQL for analytical queries

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage


About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.
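The Ray integration above is selected at runtime via daft.context.set_runner_ray. A minimal sketch follows; the configure_runner fallback wrapper is illustrative and not part of Daft's API, and since set_runner_ray is broken in this yanked 0.3.11 release, a later release is assumed:

```python
def configure_runner(address=None):
    """Use the distributed Ray runner when available, else the default local runner.

    address may point at an existing cluster, e.g. "ray://head-node:10001";
    with address=None, Ray is started locally when the query executes.
    """
    try:
        import daft
        daft.context.set_runner_ray(address=address)
        return "ray"
    except Exception:
        # daft/ray unavailable (or the runner was already set): stay local.
        return "local"

runner = configure_runner()
```

The runner must be chosen before any dataframes are materialized, which is why this is typically the first Daft call in a script.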

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmarks for TPC-H at scale factor 100 (SF100)

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including data loading from URLs, joins, user-defined functions (UDFs), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md.

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
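Because the session ID is generated when Daft is first imported, the environment variable must be set before that import (or exported in the shell). A minimal sketch:

```python
import os

# Disable Daft's telemetry for this process. This must happen before the
# first `import daft`, since the session ID is generated at import time.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

# import daft  # any later import now runs with analytics disabled
```

Alternatively, export DAFT_ANALYTICS_ENABLED=0 in your shell profile so every process inherits it.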

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.


License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.3.11.tar.gz (3.8 MB) - Source

Built Distributions

  • getdaft-0.3.11-cp38-abi3-win_amd64.whl (27.5 MB) - CPython 3.8+, Windows x86-64
  • getdaft-0.3.11-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (30.5 MB) - CPython 3.8+, manylinux: glibc 2.17+ x86-64
  • getdaft-0.3.11-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (28.9 MB) - CPython 3.8+, manylinux: glibc 2.17+ ARM64
  • getdaft-0.3.11-cp38-abi3-macosx_11_0_arm64.whl (25.3 MB) - CPython 3.8+, macOS 11.0+ ARM64
  • getdaft-0.3.11-cp38-abi3-macosx_10_12_x86_64.whl (27.4 MB) - CPython 3.8+, macOS 10.12+ x86-64

File details

Details for the file getdaft-0.3.11.tar.gz.

File metadata

  • Download URL: getdaft-0.3.11.tar.gz
  • Upload date:
  • Size: 3.8 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.11.tar.gz

  • SHA256: 00d08c7880570a7d6b2481f87cb3e89505965eb4c22474f5b003f7d2a71ddeb7
  • MD5: 12d0a17c6606e58e07a881d94284e3be
  • BLAKE2b-256: 6f06bbe6ea0731945ddf26466076e381de61f6c9da931fc61815ba727988f7b2

See more details on using hashes here.
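For example, a downloaded artifact can be checked against the SHA256 digest published above using Python's hashlib; the filename below assumes the source distribution listed on this page:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex-encoded SHA-256 digest of raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Digest published above for getdaft-0.3.11.tar.gz:
EXPECTED = "00d08c7880570a7d6b2481f87cb3e89505965eb4c22474f5b003f7d2a71ddeb7"

# with open("getdaft-0.3.11.tar.gz", "rb") as f:
#     assert sha256_hex(f.read()) == EXPECTED
```

pip can enforce the same check automatically via a requirements file with --require-hashes.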

File details

Details for the file getdaft-0.3.11-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.3.11-cp38-abi3-win_amd64.whl
  • Upload date:
  • Size: 27.5 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.11-cp38-abi3-win_amd64.whl

  • SHA256: 6c13357b338cb35e1aaf952da7f4194992703631d51164772de3ddd3236cfd9b
  • MD5: 4ea4c7f383fba8edc38188bdec7e227c
  • BLAKE2b-256: 88664b0ff367889c97e29d91f3644754954475a55c9a161cf329ee2a8a680e6f


File details

Details for the file getdaft-0.3.11-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.11-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

  • SHA256: 1d9e97846aa927fe88764bd9b2eb161081dcdb050fdb110ef78f5508ca0d1eb3
  • MD5: 7851e3704d1f8160eef05b05da8d62ad
  • BLAKE2b-256: 9399aa9c5c03982770992e11086b0b26028253419abe0bfda7e4b9e2fa964b81


File details

Details for the file getdaft-0.3.11-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.11-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl

  • SHA256: 0f6b1436c4ca4d8852113067c703718014a0230b5035efdece403a4ce73f300b
  • MD5: 116aba2f9294f48a22a55f1d547a5acf
  • BLAKE2b-256: a2aa5ec120a9cf66bf1fda54acc2b550a0b2dd9d402ed9bd997e01b968ae7e2f


File details

Details for the file getdaft-0.3.11-cp38-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.11-cp38-abi3-macosx_11_0_arm64.whl

  • SHA256: 6731791b6ce180d5b39d3e29199d1324ddf665964f7ea885d3a598910e6b6a4c
  • MD5: 2956b428cc343c71dfb1cb9f37352ca8
  • BLAKE2b-256: ddaf7cc0dcbeb57576bb2f5c7b47ff1fe9aafd03710bb4e8ae82552d9a020e37


File details

Details for the file getdaft-0.3.11-cp38-abi3-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.11-cp38-abi3-macosx_10_12_x86_64.whl

  • SHA256: 6a555ba357d0d31801f829f891b96cdaa929f1a5da3194ed4e3b4f7af0d2d0af
  • MD5: be02cc65148da05da319934fdd617083
  • BLAKE2b-256: 91e5e6ddb46857a8da7576967390d4b1b06e46f7ba647fc8266a259fb7d43f6e

