Distributed Dataframes for Multimodal Data

Project description

Daft dataframes can load any data, such as PDF documents, images, protobufs, CSV, Parquet, and audio files, into a table dataframe structure for easy querying.

Website | Docs | Installation | 10-minute tour of Daft | Community and Support

Daft: Unified Engine for Data Analytics, Engineering & ML/AI

Daft is a distributed query engine for large-scale data processing using Python or SQL, implemented in Rust.

  • Familiar interactive API: Lazy Python Dataframe for rapid and interactive iteration, or SQL for analytical queries

  • Focus on the what: Powerful Query Optimizer that rewrites queries to be as efficient as possible

  • Data Catalog integrations: Full integration with data catalogs such as Apache Iceberg

  • Rich multimodal type-system: Supports multimodal types such as Images, URLs, Tensors and more

  • Seamless Interchange: Built on the Apache Arrow In-Memory Format

  • Built for the cloud: Record-setting I/O performance for integrations with S3 cloud storage

About Daft

Daft was designed with the following principles in mind:

  1. Any Data: Beyond the usual strings/numbers/dates, Daft columns can also hold complex or nested multimodal data such as Images, Embeddings and Python objects efficiently with its Arrow-based memory representation. Ingestion and basic transformations of multimodal data are extremely easy and performant in Daft.

  2. Interactive Computing: Daft is built for the interactive developer experience through notebooks or REPLs - intelligent caching and query optimization accelerate your experimentation and data exploration.

  3. Distributed Computing: Some workloads can quickly outgrow your local laptop’s computational resources - Daft integrates natively with Ray for running dataframes on large clusters of machines with thousands of CPUs/GPUs.

Getting Started

Installation

Install Daft with pip install getdaft.

For more advanced installations (e.g. installing from source or with extra dependencies such as Ray and AWS utilities), please see our Installation Guide.

Quickstart

Check out our 10-minute quickstart!

In this example, we load images from URLs in an AWS S3 bucket and resize each image in the dataframe:

import daft

# Load a dataframe from filepaths in an S3 bucket
df = daft.from_glob_path("s3://daft-public-data/laion-sample-images/*")

# 1. Download column of image URLs as a column of bytes
# 2. Decode the column of bytes into a column of images
df = df.with_column("image", df["path"].url.download().image.decode())

# Resize each image to 32x32
df = df.with_column("resized", df["image"].image.resize(32, 32))

df.show(3)

Benchmarks

[Chart: benchmark results for TPC-H at scale factor 100 (SF100)]

To see the full benchmarks, detailed setup, and logs, check out our benchmarking page.

More Resources

  • 10-minute tour of Daft - learn more about Daft’s full range of capabilities including dataloading from URLs, joins, user-defined functions (UDF), groupby, aggregations and more.

  • User Guide - take a deep-dive into each topic within Daft

  • API Reference - API reference for public classes/functions of Daft

Contributing

To start contributing to Daft, please read CONTRIBUTING.md

Here’s a list of good first issues to get yourself warmed up with Daft. Comment in the issue to pick it up, and feel free to ask any questions!

Telemetry

To help improve Daft, we collect non-identifiable data.

To disable this behavior, set the following environment variable: DAFT_ANALYTICS_ENABLED=0
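Since the session ID is generated when Daft is imported, the safe order is to set the variable before the first import; a minimal sketch:

```python
import os

# Disable Daft's analytics for this process. Set this before the first
# `import daft`, since telemetry is initialized at import time.
os.environ["DAFT_ANALYTICS_ENABLED"] = "0"

# import daft  # subsequent imports in this process see the variable
```

Setting the variable in your shell profile or container environment has the same effect for every process.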

The data that we collect is:

  1. Non-identifiable: events are keyed by a session ID which is generated on import of Daft

  2. Metadata-only: we do not collect any of our users’ proprietary code or data

  3. For development only: we do not buy or sell any user data

Please see our documentation for more details.

License

Daft has an Apache 2.0 license - please see the LICENSE file.

Project details


Release history

This version: 0.3.6

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • getdaft-0.3.6.tar.gz (3.7 MB): Source

Built Distributions

  • getdaft-0.3.6-cp38-abi3-win_amd64.whl (26.6 MB): CPython 3.8+, Windows x86-64

  • getdaft-0.3.6-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (29.6 MB): CPython 3.8+, manylinux (glibc 2.17+), x86-64

  • getdaft-0.3.6-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (28.1 MB): CPython 3.8+, manylinux (glibc 2.17+), ARM64

  • getdaft-0.3.6-cp38-abi3-macosx_11_0_arm64.whl (24.5 MB): CPython 3.8+, macOS 11.0+, ARM64

  • getdaft-0.3.6-cp38-abi3-macosx_10_12_x86_64.whl (26.6 MB): CPython 3.8+, macOS 10.12+, x86-64

File details

Details for the file getdaft-0.3.6.tar.gz.

File metadata

  • Download URL: getdaft-0.3.6.tar.gz
  • Size: 3.7 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.6.tar.gz

  • SHA256: 01e70e629b458fb84b9c0195e55fce3344eb664682380f7f2bdbd5ed6dde9fd9
  • MD5: 217f17890d3cc802f70521f147d9ff59
  • BLAKE2b-256: cac18962d17c2ac163f3c611aae1666cea5f36276a375ed708752baafb64d3a5

See more details on using hashes here.
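Published digests like the ones above can be checked locally with Python's standard library; a minimal sketch (the file path and expected value are placeholders for a real download and its listed SHA256):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Placeholder usage; substitute the actual downloaded file and the
# SHA256 digest published for it on this page:
# expected = "01e70e629b458fb84b9c0195e55fce3344eb664682380f7f2bdbd5ed6dde9fd9"
# assert sha256_of("getdaft-0.3.6.tar.gz") == expected
```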

File details

Details for the file getdaft-0.3.6-cp38-abi3-win_amd64.whl.

File metadata

  • Download URL: getdaft-0.3.6-cp38-abi3-win_amd64.whl
  • Size: 26.6 MB
  • Tags: CPython 3.8+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for getdaft-0.3.6-cp38-abi3-win_amd64.whl

  • SHA256: ac32e888aa5b6cde9788eee4fb7f7c7a8848867f4804d00125f9e2144a6d8615
  • MD5: d60f1afa59b93e0345c1a0cbd4845fec
  • BLAKE2b-256: 12e3fb34af572f264de57195cc3dc5aefe49b1a802e661f564cf20fcd85aa4ad

File details

Details for the file getdaft-0.3.6-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.6-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

  • SHA256: f41d871865cd4d1684021a3fd89507cb06506791883eb472d02150e13172fa4f
  • MD5: 53adad54b8502d4d35360bb6d96dd95c
  • BLAKE2b-256: 851969b9a873a02d95ba4aa0191b3b3165612c9090f9ecc92d236d520f689cbf

File details

Details for the file getdaft-0.3.6-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.6-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl

  • SHA256: bf77b7bc81b7431b240545a714f8bdb0062e7239f21cc56d6e2dff2e1421b177
  • MD5: c7c9ea32cde319725e7c6e47e88ea3de
  • BLAKE2b-256: 9d994e5205923be79765bfb63a1988a0a24bdaf0f7a2e2485174d92404eb7e49

File details

Details for the file getdaft-0.3.6-cp38-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.6-cp38-abi3-macosx_11_0_arm64.whl

  • SHA256: c3376050063e1b2c64c76619f4c278de1f81c924ac8d817dbbb5b3dae50a1d76
  • MD5: 2c6f6431e3d26695808d4d7f6da70dab
  • BLAKE2b-256: 30c90de96f30539ed04452f149e696c5307d4f1181d5c7fef8f20305127d9367

File details

Details for the file getdaft-0.3.6-cp38-abi3-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for getdaft-0.3.6-cp38-abi3-macosx_10_12_x86_64.whl

  • SHA256: 444e117a1e9774c3e5c445664b1edbc1d6d6dd175481d70c56eeec5e0ed7eb77
  • MD5: 4aacd0ba63a9661d1d5f1a87af0579fb
  • BLAKE2b-256: ab5a85ca3305a9c6354b33b8099d75678e456f4459fc8e46fb6bc251b5fd4e6d
