
DataFusion in Python

This is a Python library that binds to the Apache Arrow in-memory query engine DataFusion.

Like PySpark, it allows you to build a plan through SQL or a DataFrame API against in-memory data, Parquet, or CSV files, run it in a multi-threaded environment, and obtain the result back in Python.

It also allows you to use user-defined functions (UDFs) and user-defined aggregate functions (UDAFs) for complex operations.

The major advantage of this library over other execution engines is that it achieves zero-copy data exchange between Python and its execution engine: there is no cost in using UDFs or UDAFs, nor in collecting results back into Python, apart from having to lock the GIL while running those operations.

Its query engine, DataFusion, is written in Rust, which provides strong guarantees around thread safety and memory safety.

Technically, zero-copy is achieved via the Arrow C data interface.

How to use it

Simple usage:

import datafusion
import pyarrow

# an alias
f = datafusion.functions

# create a context
ctx = datafusion.ExecutionContext()

# create a RecordBatch and a new DataFrame from it
batch = pyarrow.RecordBatch.from_arrays(
    [pyarrow.array([1, 2, 3]), pyarrow.array([4, 5, 6])],
    names=["a", "b"],
)
df = ctx.create_dataframe([[batch]])

# build a new plan: project two derived columns
df = df.select(
    f.col("a") + f.col("b"),
    f.col("a") - f.col("b"),
)

# execute and collect the first (and only) batch
result = df.collect()[0]

assert result.column(0) == pyarrow.array([5, 7, 9])
assert result.column(1) == pyarrow.array([-3, -3, -3])
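
The same computation can also be expressed through SQL. As a sketch, assuming a table has been registered from a file; the method name register_parquet mirrors the Rust API, and the file path is hypothetical:

# register a Parquet file (hypothetical path) as the table "t"
ctx.register_parquet("t", "data.parquet")

# build the equivalent plan through SQL; execution still happens on collect
df = ctx.sql("SELECT a + b, a - b FROM t")
result = df.collect()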

UDFs

A user-defined function receives pyarrow Arrays as arguments and must return a pyarrow Array; DataFusion applies it to each record batch:

def is_null(array: pyarrow.Array) -> pyarrow.Array:
    return array.is_null()

# wrap the function together with its input types and return type
udf = f.udf(is_null, [pyarrow.int64()], pyarrow.bool_())

df = df.select(udf(f.col("a")))
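
For reference, here is the UDF above run end to end against a fresh context; a minimal sketch reusing the pattern from the first example:

ctx = datafusion.ExecutionContext()
batch = pyarrow.RecordBatch.from_arrays(
    [pyarrow.array([1, None, 3])],
    names=["a"],
)
df = ctx.create_dataframe([[batch]])

result = df.select(udf(f.col("a"))).collect()[0]

# only the second input value is null
assert result.column(0) == pyarrow.array([False, True, False])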

UDAF

A user-defined aggregation is a class with update, merge, to_scalars, and evaluate methods: update consumes input values, to_scalars exports the partial state, merge combines partial states coming from other partitions, and evaluate produces the final result:

import pyarrow
import pyarrow.compute
from typing import List


class Accumulator:
    """
    Interface of a user-defined accumulation.
    """
    def __init__(self):
        self._sum = pyarrow.scalar(0.0)

    def to_scalars(self) -> List[pyarrow.Scalar]:
        return [self._sum]

    def update(self, values: pyarrow.Array) -> None:
        # not ideal: pyarrow scalars cannot be summed directly yet, so this
        # round-trips through Python and breaks on `None`
        self._sum = pyarrow.scalar(self._sum.as_py() + pyarrow.compute.sum(values).as_py())

    def merge(self, states: pyarrow.Array) -> None:
        # same caveat as `update`
        self._sum = pyarrow.scalar(self._sum.as_py() + pyarrow.compute.sum(states).as_py())

    def evaluate(self) -> pyarrow.Scalar:
        return self._sum


df = ...

# arguments: the accumulator class, the input type, the return type,
# and the list of intermediate state types
udaf = f.udaf(Accumulator, pyarrow.float64(), pyarrow.float64(), [pyarrow.float64()])

df = df.aggregate(
    [],                    # no grouping expressions
    [udaf(f.col("a"))],    # aggregate column "a" with the UDAF
)
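
To make the lifecycle concrete, here is how the engine conceptually drives the accumulator. This is a plain-Python sketch of the calling convention, not DataFusion API:

acc = Accumulator()

# the engine feeds input batches to `update` ...
acc.update(pyarrow.array([1.0, 2.0]))
acc.update(pyarrow.array([3.0]))

# ... exchanges partial states between partitions via `to_scalars` and `merge` ...
other = Accumulator()
other.update(pyarrow.array([4.0]))
acc.merge(pyarrow.array([s.as_py() for s in other.to_scalars()]))

# ... and produces the final value with `evaluate`
assert acc.evaluate().as_py() == 10.0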

How to install (from pip)

pip install datafusion
# or
python -m pip install datafusion
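
To verify the installation, a quick smoke test is simply importing the module:

python -c "import datafusion"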

How to develop

This assumes that you have Rust and Cargo installed. We use the workflow recommended by PyO3 and maturin.

Bootstrap:

# fetch this repo
git clone git@github.com:apache/arrow-datafusion.git
# change to python directory
cd arrow-datafusion/python
# prepare development environment (used to build wheel / install in development)
python3 -m venv venv
# activate the venv
source venv/bin/activate
# update pip itself if necessary
python -m pip install -U pip
# if python -V gives python 3.7
python -m pip install -r requirements-37.txt
# if python -V gives python 3.8/3.9/3.10
python -m pip install -r requirements.txt

Whenever the Rust code changes (your own changes or via git pull):

# make sure you activate the venv using "source venv/bin/activate" first
maturin develop
python -m pytest
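
To produce a distributable wheel instead of installing into the active venv, maturin can also build in release mode; a sketch, since exact flags may vary between maturin versions:

# writes an optimized wheel into target/wheels/
maturin build --release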

How to update dependencies

To change test dependencies, edit requirements.in and run

# install pip-tools (needed only once); consider running this inside the venv
python -m pip install pip-tools

# change requirements.in and then run
python -m piptools compile --generate-hashes -o requirements-37.txt
# or run this if you are on python 3.8/3.9/3.10
python -m piptools compile --generate-hashes -o requirements.txt

To update dependencies to their latest versions, run the same command with -U

python -m piptools compile -U --generate-hashes -o requirements.txt

More details are available in the pip-tools documentation.

