
Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying.


Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Unified Engine for Data Analytics, Engineering & ML/AI

Daft is a distributed query engine for large-scale data processing using Python or SQL, implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration, or SQL for analytical queries

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage

About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimizations accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source, or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Dataframe code to load a folder of images from AWS S3 and create thumbnails

Benchmarks

Benchmarks for SF100 TPC-H

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
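For example, to opt out for a single shell session before launching Python (the variable name is the one given above):

```shell
# Opt out of Daft analytics for this shell session
export DAFT_ANALYTICS_ENABLED=0
python -c "import daft"
```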

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.


License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details


Release history

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

getdaft-0.3.13.tar.gz (3.9 MB)

Uploaded Source

Built Distributions

getdaft-0.3.13-cp38-abi3-win_amd64.whl (30.1 MB)

Uploaded CPython 3.8+ Windows x86-64

getdaft-0.3.13-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (33.1 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ x86-64

getdaft-0.3.13-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (31.5 MB)

Uploaded CPython 3.8+ manylinux: glibc 2.17+ ARM64

getdaft-0.3.13-cp38-abi3-macosx_11_0_arm64.whl (27.7 MB)

Uploaded CPython 3.8+ macOS 11.0+ ARM64

getdaft-0.3.13-cp38-abi3-macosx_10_12_x86_64.whl (29.9 MB)

Uploaded CPython 3.8+ macOS 10.12+ x86-64

File details

Details for the file getdaft-0.3.13.tar.gz.

File metadata

  • Download URL: getdaft-0.3.13.tar.gz
  • Size: 3.9 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.13.tar.gz
Algorithm Hash digest
SHA256 d0cbb2e463af5b628c18cd7e182c21dee7f50f5f9d4fe93b4a2ebeb52593e928
MD5 7590afcfa7664f4847525b64704d9158
BLAKE2b-256 39189660891c077d3b771cb57e93408fa475014018cd6c7335aa3abcbb79619d

See more details on using hashes here.

File details

Details for the file getdaft-0.3.13-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.3.13-cp38-abi3-win_amd64.whl
  • Size: 30.1 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.13-cp38-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 1a1cef3bf3fdffaa752f3f05994db9eda52a4d97097768aeaeb9abca1d062960
MD5 bf702319daf891331a2427b9a14f2e85
BLAKE2b-256 f6614a97e2f823e538918384a31ade99c3da47c3354f30a664af39378df2d0ff


File details

Details for the file getdaft-0.3.13-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.13-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 58f7bfd1ed4915af020975ba9a97074c8852a0f4d55ebac5ceaced4a784b61ca
MD5 54d624714986e33cb75100c89455a3d5
BLAKE2b-256 1c20dab88809595219135ac5f6d59d8b37deb50e06f58e3fbad7031e53c858b8


File details

Details for the file getdaft-0.3.13-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.13-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
Algorithm Hash digest
SHA256 e7ff3e8a09c8647a2e6fc38bf94eabf9b7c05b8e999ffb7fb02a97bee51049f5
MD5 1c13e946620a7b39db9d654c2812dbb9
BLAKE2b-256 91ebc08e633a917e2a4c5f2e895912782216ae616a0ccecc741935d0f203a9e4


File details

Details for the file getdaft-0.3.13-cp38-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.13-cp38-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 88afa12e888bd408dcb9a6b2cda139c73b8b201baac1eddb4d25eaaefd2804a5
MD5 3a5605967ab4add3407050927396eb2a
BLAKE2b-256 dfa1d1def7b38ca8fb2d45348c12409e60566a012fe977633c1a4142dedb9cb9


File details

Details for the file getdaft-0.3.13-cp38-abi3-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.13-cp38-abi3-macosx_10_12_x86_64.whl
Algorithm Hash digest
SHA256 3c267a563b41c0997b897c7b354f97e932bf56bf8096fb1040860d629b529cde
MD5 7205010037051395ddd30f79522544fc
BLAKE2b-256 8017dd2f9edf01dfb124d64edbb0833afdb767961728b3b0e01ec1328e92a95e

