
Read the data of an ODBC data source as a sequence of Apache Arrow record batches.

Project description

arrow-odbc-py


Fill Apache Arrow arrays from ODBC data sources. This package is built on top of the pyarrow Python package and the arrow-odbc Rust crate and enables you to read the data of an ODBC data source as a sequence of Apache Arrow record batches.

This package can also be used to insert data from Arrow record batches into database tables.

This package has been designed to be easily deployable, so it provides a prebuilt manylinux wheel which is independent of the specific version of your Python interpreter and of the specific Arrow version you want to use. It will dynamically link against the ODBC driver manager provided by your system.

Users looking for more features than just bulk fetching/inserting data from/into ODBC data sources in Python should also take a look at turbodbc, which has a helpful community and has seen a lot of battle testing. This Python package is narrower in scope (which is a fancy way of saying it has fewer features), as it is only concerned with bulk fetching Arrow arrays. arrow-odbc may have fewer features than turbodbc, but it is easier to install and more resilient to version changes in pyarrow, since it is independent of the C++ ABI, of system dependencies (with the exception of your ODBC driver manager, of course) and of your specific Python ABI. It also offers prebuilt wheels for Windows, Linux and macOS on PyPI. In addition to that there is also a conda-forge recipe (thanks to @timkpaine).

About Arrow

Apache Arrow defines a language-independent columnar memory format for flat and hierarchical data, organized for efficient analytic operations on modern hardware like CPUs and GPUs. The Arrow memory format also supports zero-copy reads for lightning-fast data access without serialization overhead.

About ODBC

ODBC (Open Database Connectivity) is a standard which enables you to access data from a wide variety of data sources using SQL.

Usage

Query

from arrow_odbc import read_arrow_batches_from_odbc

connection_string="Driver={ODBC Driver 17 for SQL Server};Server=localhost;"

reader = read_arrow_batches_from_odbc(
    query=f"SELECT * FROM MyTable WHERE a=?",
    connection_string=connection_string,
    batch_size=1000,
    parameters=["I'm a positional query parameter"],
    user="SA",
    password="My@Test@Password",
)

for batch in reader:
    # Process arrow batches
    df = batch.to_pandas()
    # ...
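
If you want to materialize an entire result set at once instead of streaming it batch by batch, the fetched batches can be combined into a single pyarrow Table. The helper below is a minimal sketch (the name fetch_table is made up for illustration); it assumes the result fits into memory and that the query yields at least one batch:

import pyarrow as pa

from arrow_odbc import read_arrow_batches_from_odbc


def fetch_table(query, connection_string, **kwargs):
    # Stream all record batches and combine them into one pyarrow Table.
    # Table.from_batches requires at least one batch (or an explicit schema).
    reader = read_arrow_batches_from_odbc(
        query=query,
        connection_string=connection_string,
        batch_size=1000,
        **kwargs,
    )
    return pa.Table.from_batches(list(reader))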

Insert

from arrow_odbc import insert_into_table
import pyarrow as pa
import pandas

connection_string="Driver={ODBC Driver 17 for SQL Server};Server=localhost;"


def dataframe_to_table(df):
    table = pa.Table.from_pandas(df)
    reader = pa.RecordBatchReader.from_batches(table.schema, table.to_batches())
    insert_into_table(
        connection_string=connection_string,
        user="SA",
        password="My@Test@Password",
        chunk_size=1000,
        table="MyTable",
        reader=reader,
    )
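
A possible invocation of the helper above (illustrative only; the table MyTable must already exist with a schema compatible with the DataFrame, and the column names and values below are made up):

df = pandas.DataFrame({"id": [1, 2, 3], "name": ["a", "b", "c"]})
dataframe_to_table(df)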

Installation

Installing ODBC driver manager

The provided wheels dynamically link against the driver manager, which must be provided by the system.

Windows

Nothing to do. The ODBC driver manager is preinstalled.

Ubuntu

sudo apt-get install unixodbc-dev

macOS

You can use Homebrew to install unixODBC:

brew install unixodbc

Installing Rust toolchain

Note: Only required if building from source

To build from source you need to install the Rust toolchain. Installation instructions can be found here: https://www.rust-lang.org/tools/install

Installing the wheel

Wheels have been uploaded to PyPI and can be installed using pip. The wheel (including the manylinux wheel) will link against your system's ODBC driver manager at runtime. If there are no prebuilt wheels for your platform, you can build the wheel from source. For this, the Rust toolchain must be installed.

pip install arrow-odbc
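
If no prebuilt wheel matches your platform, one way to ask pip for a source build is sketched below (assuming the Rust toolchain and the unixODBC headers are installed):

# Sketch: force building from the source distribution instead of using a wheel
pip install --no-binary arrow-odbc arrow-odbc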

arrow-odbc utilizes cffi and the Arrow C Data Interface to glue Rust and Python code together. Therefore the wheel does not need to be built against a precise version of either Python or Arrow.

Matching of ODBC to Arrow types when querying

ODBC                       Arrow
-------------------------  --------------------
Numeric(p <= 38)           Decimal128
Decimal(p <= 38, s >= 0)   Decimal128
Integer                    Int32
SmallInt                   Int16
Real                       Float32
Float(p <= 24)             Float32
Double                     Float64
Float(p > 24)              Float64
Date                       Date32
LongVarbinary              Binary
Timestamp(p = 0)           TimestampSecond
Timestamp(p: 1..3)         TimestampMilliSecond
Timestamp(p: 4..6)         TimestampMicroSecond
Timestamp(p >= 7)          TimestampNanoSecond
BigInt                     Int64
TinyInt                    Int8
Bit                        Boolean
Varbinary                  Binary
Binary                     FixedSizedBinary
All others                 Utf8
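
To see how the columns of a particular result set were mapped, you can inspect the schema of a fetched record batch. A minimal sketch, reusing the connection details from the query example above:

from arrow_odbc import read_arrow_batches_from_odbc

reader = read_arrow_batches_from_odbc(
    query="SELECT * FROM MyTable",
    connection_string=connection_string,
    user="SA",
    password="My@Test@Password",
)
first_batch = next(iter(reader), None)
if first_batch is not None:
    # Each pyarrow RecordBatch carries its schema, which reflects the mapping above.
    print(first_batch.schema)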

Matching of Arrow to ODBC types when inserting

Arrow                   ODBC
----------------------  ------------------
Utf8                    VarChar
Decimal128(p, s = 0)    VarChar(p + 1)
Decimal128(p, s != 0)   VarChar(p + 2)
Decimal128(p, s < 0)    VarChar(p - s + 1)
Decimal256(p, s = 0)    VarChar(p + 1)
Decimal256(p, s != 0)   VarChar(p + 2)
Decimal256(p, s < 0)    VarChar(p - s + 1)
Int8                    TinyInt
Int16                   SmallInt
Int32                   Integer
Int64                   BigInt
Float16                 Real
Float32                 Real
Float64                 Double
Timestamp s             Timestamp(7)
Timestamp ms            Timestamp(7)
Timestamp us            Timestamp(7)
Timestamp ns            Timestamp(7)
Date32                  Date
Date64                  Date
Time32 s                Time
Time32 ms               VarChar(12)
Time64 us               VarChar(15)
Time64 ns               VarChar(16)
Binary                  Varbinary
FixedBinary(l)          Varbinary(l)
All others              Unsupported
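
Since the ODBC column types are derived from the Arrow types, it can help to pick the Arrow schema explicitly before inserting. The snippet below is a sketch with made-up column names and values; the resulting reader can be passed to insert_into_table as shown in the Insert example:

import pyarrow as pa

schema = pa.schema(
    [
        ("id", pa.int64()),       # inserted as BigInt
        ("price", pa.float64()),  # inserted as Double
        ("name", pa.string()),    # inserted as VarChar
    ]
)
table = pa.table({"id": [1, 2], "price": [9.99, 4.50], "name": ["a", "b"]}, schema=schema)
reader = pa.RecordBatchReader.from_batches(table.schema, table.to_batches())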

Project details


Release history

This version

1.0.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

arrow_odbc-1.0.0.tar.gz (41.8 kB)

Uploaded Source

Built Distributions

arrow_odbc-1.0.0-py3-none-win_amd64.whl (329.7 kB)

Uploaded Python 3 Windows x86-64

arrow_odbc-1.0.0-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (827.6 kB)

Uploaded Python 3 manylinux: glibc 2.17+ x86-64

arrow_odbc-1.0.0-py3-none-macosx_10_7_x86_64.whl (473.4 kB)

Uploaded Python 3 macOS 10.7+ x86-64

File details

Details for the file arrow_odbc-1.0.0.tar.gz.

File metadata

  • Download URL: arrow_odbc-1.0.0.tar.gz
  • Upload date:
  • Size: 41.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.2

File hashes

Hashes for arrow_odbc-1.0.0.tar.gz
Algorithm Hash digest
SHA256 0eb13ff78b2cebf22c3b70c49bc221324095223ec4f647a600d5ae32c5e6e006
MD5 235675f5d690465aced7161dd81a1327
BLAKE2b-256 553aea3ea81cc4c9a85eca3de294df2d16cc5b084f397c33e6de0fbb9c3c281b


File details

Details for the file arrow_odbc-1.0.0-py3-none-win_amd64.whl.

File metadata

  • Download URL: arrow_odbc-1.0.0-py3-none-win_amd64.whl
  • Upload date:
  • Size: 329.7 kB
  • Tags: Python 3, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.2

File hashes

Hashes for arrow_odbc-1.0.0-py3-none-win_amd64.whl
Algorithm Hash digest
SHA256 35133b359f8453fd0fd749b565dfed412fd841aafaa1af59f8eaede3a8b3da21
MD5 3132820276bbaa6bf4f73505fd305e8b
BLAKE2b-256 a6d6e1a1e5537c9b39176183c1d1a09ab633e647810bf6ff085823a15c57e091


File details

Details for the file arrow_odbc-1.0.0-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for arrow_odbc-1.0.0-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Algorithm Hash digest
SHA256 5e7a4e1eb921879f9399d77650bdbd5bcc602c2952cc2f43be20e6de0184068f
MD5 377e13800ba777f741e756eae1a766b2
BLAKE2b-256 5c165fa0b11a67c69855bf49db4d3ef6e3c90be53dbca09f1be5564fc20315bb


File details

Details for the file arrow_odbc-1.0.0-py3-none-macosx_10_7_x86_64.whl.

File metadata

File hashes

Hashes for arrow_odbc-1.0.0-py3-none-macosx_10_7_x86_64.whl
Algorithm Hash digest
SHA256 97554be9c82a18623fa86d7999162f0dec617bf2a0bfecee31a862ce02dad32c
MD5 856011bd639ad42d071591b4709bdbac
BLAKE2b-256 3d2ecfd8d86f506377980c7d0aa5a70c1e04ad0deb65cb63c3a301d752828c2e

