Python interface to Hive

Project description

PyHive

PyHive is a collection of Python DB-API and SQLAlchemy interfaces for Presto, Hive, and Trino.

Usage

DB-API

from pyhive import presto  # or import hive or import trino
cursor = presto.connect('localhost').cursor()
cursor.execute('SELECT * FROM my_awesome_data LIMIT 10')
print(cursor.fetchone())
print(cursor.fetchall())

DB-API (asynchronous)

from pyhive import hive
from TCLIService.ttypes import TOperationState
cursor = hive.connect('localhost').cursor()
cursor.execute('SELECT * FROM my_awesome_data LIMIT 10', async=True)

status = cursor.poll().operationState
while status in (TOperationState.INITIALIZED_STATE, TOperationState.RUNNING_STATE):
    logs = cursor.fetch_logs()
    for message in logs:
        print(message)

    # If needed, an asynchronous query can be cancelled at any time with:
    # cursor.cancel()

    status = cursor.poll().operationState

print(cursor.fetchall())

In Python 3.7 async became a keyword; you can use async_ instead:

cursor.execute('SELECT * FROM my_awesome_data LIMIT 10', async_=True)

SQLAlchemy

First install this package to register it with SQLAlchemy (see setup.py).

from sqlalchemy import *
from sqlalchemy.engine import create_engine
from sqlalchemy.schema import *
# Presto
engine = create_engine('presto://localhost:8080/hive/default')
# Trino
engine = create_engine('trino://localhost:8080/hive/default')
# Hive
engine = create_engine('hive://localhost:10000/default')
logs = Table('my_awesome_data', MetaData(bind=engine), autoload=True)
print(select([func.count('*')], from_obj=logs).scalar())

# Hive + HTTPS + LDAP or basic Auth
engine = create_engine('hive+https://username:password@localhost:10000/')
logs = Table('my_awesome_data', MetaData(bind=engine), autoload=True)
print(select([func.count('*')], from_obj=logs).scalar())

Note: query generation functionality is not exhaustive or fully tested, but there should be no problem with raw SQL.
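Because raw SQL is the better-tested path, statements can also be sent straight through the engine with sqlalchemy.text(). A minimal sketch, assuming the my_awesome_data table from the examples above and a local HiveServer2 (SQLAlchemy 1.x style, matching the snippets above):

from sqlalchemy import create_engine, text

engine = create_engine('hive://localhost:10000/default')

# Raw SQL skips the (partially tested) query generation layer entirely.
with engine.connect() as connection:
    result = connection.execute(text('SELECT * FROM my_awesome_data LIMIT 10'))
    for row in result:
        print(row)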

Passing session configuration

# DB-API
hive.connect('localhost', configuration={'hive.exec.reducers.max': '123'})
presto.connect('localhost', session_props={'query_max_run_time': '1234m'})
trino.connect('localhost', session_props={'query_max_run_time': '1234m'})
# SQLAlchemy
create_engine(
    'presto://user@host:443/hive',
    connect_args={'protocol': 'https',
                  'session_props': {'query_max_run_time': '1234m'}}
)
create_engine(
    'trino://user@host:443/hive',
    connect_args={'protocol': 'https',
                  'session_props': {'query_max_run_time': '1234m'}}
)
create_engine(
    'hive://user@host:10000/database',
    connect_args={'configuration': {'hive.exec.reducers.max': '123'}},
)
# SQLAlchemy with LDAP
create_engine(
    'hive://user:password@host:10000/database',
    connect_args={'auth': 'LDAP'},
)

Requirements

Install using

  • pip install 'pyhive[hive]' for the Hive interface,

  • pip install 'pyhive[presto]' for the Presto interface, and

  • pip install 'pyhive[trino]' for the Trino interface.

PyHive works with

  • Python 2.7 / Python 3

  • For Presto: Presto install

  • For Trino: Trino install

  • For Hive: HiveServer2 daemon

Changelog

See https://github.com/dropbox/PyHive/releases.

Contributing

  • Please fill out the Dropbox Contributor License Agreement at https://opensource.dropbox.com/cla/ and note this in your pull request.

  • Changes must come with tests, with the exception of trivial things like fixing comments. See .travis.yml for the test environment setup.

  • Notes on project scope:

    • This project is intended to be a minimal Hive/Presto client that does that one thing and nothing else. Features that can be implemented on top of PyHive, such as integration with your favorite data analysis library, are likely out of scope.

    • We prefer having a small number of generic features over a large number of specialized, inflexible features. For example, the Presto code takes an arbitrary requests_session argument for customizing HTTP calls, rather than a separate parameter/branch for each requests option (see the sketch after this list).
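As an illustration of that design choice, HTTP behaviour can be tuned by handing the Presto connector your own requests.Session. A rough sketch, assuming a Presto coordinator on localhost; the header and retry settings are purely illustrative:

import requests
from requests.adapters import HTTPAdapter
from pyhive import presto

# Build a session with a custom header and simple connection-level retries.
session = requests.Session()
session.headers.update({'X-My-Header': 'example'})  # hypothetical header
session.mount('http://', HTTPAdapter(max_retries=3))

# The session is reused for every HTTP call the cursor makes.
cursor = presto.connect('localhost', requests_session=session).cursor()
cursor.execute('SELECT 1')
print(cursor.fetchall())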

Testing

Run the following in an environment with Hive/Presto:

./scripts/make_test_tables.sh
virtualenv --no-site-packages env
source env/bin/activate
pip install -e .
pip install -r dev_requirements.txt
py.test

WARNING: This drops/creates tables named one_row, one_row_complex, and many_rows, plus a database called pyhive_test_database.

Updating TCLIService

The TCLIService module is autogenerated from a TCLIService.thrift file. To update it, run python generate.py <TCLIServiceURL>. If no URL is given, the TCLIService.thrift file for Hive 2.3 is downloaded.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

acryl-PyHive-0.6.14rc1.tar.gz (49.2 kB, source)

Built Distribution

acryl_PyHive-0.6.14rc1-py3-none-any.whl (54.9 kB, Python 3)

File details

Details for the file acryl-PyHive-0.6.14rc1.tar.gz.

File metadata

  • Download URL: acryl-PyHive-0.6.14rc1.tar.gz
  • Upload date:
  • Size: 49.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.8

File hashes

Hashes for acryl-PyHive-0.6.14rc1.tar.gz

  • SHA256: aa3151ed71d10deccbb629d251721574f15d3135ff835341e91eee19d28dd3ce
  • MD5: c3fa55acc5e924b6f454d04a80a46b24
  • BLAKE2b-256: fc16551a6529aa7c98050f7340d99a450d7f3bf878eacca2b076b90874635836


File details

Details for the file acryl_PyHive-0.6.14rc1-py3-none-any.whl.

File metadata

File hashes

Hashes for acryl_PyHive-0.6.14rc1-py3-none-any.whl

  • SHA256: d06b2f7771f4b931ae93bbbce8dff748f27f061015c3bc01a1f97c8b91d87bbc
  • MD5: cafe9c00f5e7f9d7763233a05a1bda09
  • BLAKE2b-256: 64667df8c5814472a9e9b8d17311bbfcac53b3b211625dac7b80b191ff149a22

