
A DBAPI 2.0 interface and SQLAlchemy dialect for Databricks interactive clusters.


A thin wrapper around pyhive and pyodbc for creating a DBAPI connection to Databricks Workspace and SQL Analytics clusters. SQL Analytics clusters require the Simba ODBC driver.

Also provides SQLAlchemy Dialects using pyhive and pyodbc for Databricks clusters. Databricks SQL Analytics clusters only support the pyodbc-driven dialect.

Installation

Install using pip:

pip install databricks-dbapi

For SQLAlchemy support, install with:

pip install databricks-dbapi[sqlalchemy]

Usage

PyHive

The connect() function returns a pyhive Hive connection object, which internally wraps a Thrift connection.

Connecting with http_path, host, and a token:

import os

from databricks_dbapi import hive


token = os.environ["DATABRICKS_TOKEN"]
host = os.environ["DATABRICKS_HOST"]
http_path = os.environ["DATABRICKS_HTTP_PATH"]


connection = hive.connect(
    host=host,
    http_path=http_path,
    token=token,
)
cursor = connection.cursor()

cursor.execute("SELECT * FROM some_table LIMIT 100")

print(cursor.fetchone())
print(cursor.fetchall())
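
Because the cursor follows DBAPI 2.0, cursor.description holds column metadata after a query executes. A small helper (a sketch, not part of this package) can pair it with fetched rows to produce dictionaries; the description and rows below are illustrative stand-ins for real cursor output:

```python
def rows_to_dicts(cursor_description, rows):
    """Map DBAPI result rows to dicts keyed by column name.

    DBAPI 2.0 specifies that each entry of cursor.description is a
    7-item sequence whose first element is the column name.
    """
    columns = [col[0] for col in cursor_description]
    return [dict(zip(columns, row)) for row in rows]


# Illustrative values standing in for cursor.description / cursor.fetchall():
description = [
    ("id", "int", None, None, None, None, None),
    ("name", "string", None, None, None, None, None),
]
rows = [(1, "alpha"), (2, "beta")]
print(rows_to_dicts(description, rows))
```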

The pyhive connection also supports asynchronous query execution:

import os

from databricks_dbapi import hive
from TCLIService.ttypes import TOperationState


token = os.environ["DATABRICKS_TOKEN"]
host = os.environ["DATABRICKS_HOST"]
cluster = os.environ["DATABRICKS_CLUSTER"]


connection = hive.connect(
    host=host,
    cluster=cluster,
    token=token,
)
cursor = connection.cursor()

cursor.execute("SELECT * FROM some_table LIMIT 100", async_=True)

status = cursor.poll().operationState
while status in (TOperationState.INITIALIZED_STATE, TOperationState.RUNNING_STATE):
    logs = cursor.fetch_logs()
    for message in logs:
        print(message)

    # If needed, an asynchronous query can be cancelled at any time with:
    # cursor.cancel()

    status = cursor.poll().operationState

print(cursor.fetchall())
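
The polling loop above can be factored into a reusable helper. This is a sketch of the same pattern, using a stub poll function in place of cursor.poll().operationState; the helper name and arguments are hypothetical, not part of databricks_dbapi:

```python
import time


def wait_until_done(poll, running_states, interval=1.0, on_logs=None):
    """Poll an operation until its state leaves `running_states`.

    `poll` returns the current state; `on_logs`, if given, is called on
    each iteration so progress logs can be surfaced while waiting.
    Returns the terminal state.
    """
    state = poll()
    while state in running_states:
        if on_logs is not None:
            on_logs()
        time.sleep(interval)
        state = poll()
    return state


# Stub poll standing in for lambda: cursor.poll().operationState:
states = iter(["RUNNING", "RUNNING", "FINISHED"])
final = wait_until_done(lambda: next(states), {"INITIALIZED", "RUNNING"}, interval=0.0)
print(final)
```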

ODBC

The ODBC DBAPI requires the Simba ODBC driver.

Connecting with http_path, host, and a token:

import os

from databricks_dbapi import odbc


token = os.environ["DATABRICKS_TOKEN"]
host = os.environ["DATABRICKS_HOST"]
http_path = os.environ["DATABRICKS_HTTP_PATH"]


connection = odbc.connect(
    host=host,
    http_path=http_path,
    token=token,
    driver_path="/path/to/simba/driver",
)
cursor = connection.cursor()

cursor.execute("SELECT * FROM some_table LIMIT 100")

print(cursor.fetchone())
print(cursor.fetchall())
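
Since the Simba driver's install location varies by platform and version, it can help to resolve and validate driver_path before connecting. A minimal sketch; the SIMBA_DRIVER_PATH variable and the default path shown are assumptions for illustration, not conventions of this package:

```python
import os


def resolve_driver_path(default="/opt/simba/spark/lib/64/libsparkodbc_sb64.so"):
    """Pick the Simba ODBC driver path from the environment, else a default.

    The default path here is illustrative; actual install locations
    differ by platform and driver version.
    """
    path = os.environ.get("SIMBA_DRIVER_PATH", default)
    if not os.path.exists(path):
        raise FileNotFoundError(f"Simba ODBC driver not found at {path}")
    return path
```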

SQLAlchemy Dialects

databricks+pyhive

Installing the package registers the databricks+pyhive dialect/driver with SQLAlchemy. Fill in the required information when building the engine URL.

from sqlalchemy import MetaData, Table, create_engine, func, select


engine = create_engine(
    "databricks+pyhive://token:<databricks_token>@<host>:<port>/<database>",
    connect_args={"http_path": "<cluster_http_path>"}
)

logs = Table("my_table", MetaData(bind=engine), autoload=True)
print(select([func.count("*")], from_obj=logs).scalar())

databricks+pyodbc

Installing the package registers the databricks+pyodbc dialect/driver with SQLAlchemy. Fill in the required information when building the engine URL.

from sqlalchemy import MetaData, Table, create_engine, func, select


engine = create_engine(
    "databricks+pyodbc://token:<databricks_token>@<host>:<port>/<database>",
    connect_args={"http_path": "<cluster_http_path>", "driver_path": "/path/to/simba/driver"}
)

logs = Table("my_table", MetaData(bind=engine), autoload=True)
print(select([func.count("*")], from_obj=logs).scalar())
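
The engine URLs above can also be assembled programmatically. One point worth noting: the token sits in the userinfo part of the URL, so percent-encoding keeps any special characters intact. A sketch with illustrative host, database, and token values (databricks_url is a hypothetical helper, not part of this package):

```python
from urllib.parse import quote


def databricks_url(driver, host, port, database, token):
    """Build a databricks+<driver> SQLAlchemy engine URL.

    The token is percent-encoded so characters like '/' or '=' survive
    the userinfo portion of the URL.
    """
    return f"databricks+{driver}://token:{quote(token, safe='')}@{host}:{port}/{database}"


# Illustrative values only:
url = databricks_url("pyhive", "example.cloud.databricks.com", 443, "default", "dapi/abc==")
print(url)
```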

Refer to the Databricks documentation for more details on the hostname, cluster name, and HTTP path.

Files for databricks-dbapi, version 0.5.0:

- databricks_dbapi-0.5.0-py2.py3-none-any.whl (9.7 kB, wheel, py2.py3)
- databricks_dbapi-0.5.0.tar.gz (8.9 kB, source)
