PyAthenaJDBC

PyAthenaJDBC is a Python DB API 2.0 (PEP 249) compliant wrapper for Amazon Athena JDBC driver.

Requirements

  • Python

    • CPython 2.6, 2.7, 3.4, 3.5

  • Java

    • Java >= 8

Installation

$ pip install PyAthenaJDBC

Extra packages:

Package     Install command                       Version
Pandas      pip install PyAthenaJDBC[Pandas]      >=0.19.0
SQLAlchemy  pip install PyAthenaJDBC[SQLAlchemy]  >=1.0.0

Usage

Basic usage

from pyathenajdbc import connect

conn = connect(s3_staging_dir='s3://YOUR_S3_BUCKET/path/to/',
               region_name='us-west-2')
try:
    with conn.cursor() as cursor:
        cursor.execute("""
        SELECT * FROM one_row
        """)
        print(cursor.description)
        print(cursor.fetchall())
finally:
    conn.close()

Cursor iteration

from pyathenajdbc import connect

conn = connect(s3_staging_dir='s3://YOUR_S3_BUCKET/path/to/',
               region_name='us-west-2')
try:
    with conn.cursor() as cursor:
        cursor.execute("""
        SELECT * FROM many_rows LIMIT 10
        """)
        for row in cursor:
            print(row)
finally:
    conn.close()

Query with parameter

The only supported DB API paramstyle is pyformat. pyformat supports only named placeholders in the old % operator style, and parameters must be passed as a dictionary.

from pyathenajdbc import connect

conn = connect(s3_staging_dir='s3://YOUR_S3_BUCKET/path/to/',
               region_name='us-west-2')
try:
    with conn.cursor() as cursor:
        cursor.execute("""
        SELECT col_string FROM one_row_complex
        WHERE col_string = %(param)s
        """, {'param': 'a string'})
        print(cursor.fetchall())
finally:
    conn.close()

If your query contains the % character, it must be escaped as %%, as in the following:

SELECT col_string FROM one_row_complex
WHERE col_string = %(param)s OR col_string LIKE 'a%%'
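
The escaped query is executed like any other parameterized query; a minimal sketch reusing the connection and tables from the earlier examples:

from pyathenajdbc import connect

conn = connect(s3_staging_dir='s3://YOUR_S3_BUCKET/path/to/',
               region_name='us-west-2')
try:
    with conn.cursor() as cursor:
        # %% reaches Athena as a literal %; %(param)s is substituted as usual.
        cursor.execute("""
        SELECT col_string FROM one_row_complex
        WHERE col_string = %(param)s OR col_string LIKE 'a%%'
        """, {'param': 'a string'})
        print(cursor.fetchall())
finally:
    conn.close()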

SQLAlchemy

Install SQLAlchemy with pip install "SQLAlchemy>=1.0.0" or pip install PyAthenaJDBC[SQLAlchemy]. SQLAlchemy 1.0.0 or higher is supported.

import contextlib
from urllib.parse import quote_plus  # PY2: from urllib import quote_plus
from sqlalchemy.engine import create_engine
from sqlalchemy.sql.expression import select
from sqlalchemy.sql.functions import func
from sqlalchemy.sql.schema import Table, MetaData

conn_str = 'awsathena+jdbc://{access_key}:{secret_key}@athena.{region_name}.amazonaws.com:443/'\
           '{schema_name}?s3_staging_dir={s3_staging_dir}'
engine = create_engine(conn_str.format(
    access_key=quote_plus('YOUR_ACCESS_KEY'),
    secret_key=quote_plus('YOUR_SECRET_ACCESS_KEY'),
    region_name='us-west-2',
    schema_name='default',
    s3_staging_dir=quote_plus('s3://YOUR_S3_BUCKET/path/to/')))
try:
    with contextlib.closing(engine.connect()) as conn:
        many_rows = Table('many_rows', MetaData(bind=engine), autoload=True)
        print(select([func.count('*')], from_obj=many_rows).scalar())
finally:
    engine.dispose()

The connection string has the following format:

awsathena+jdbc://{access_key}:{secret_key}@athena.{region_name}.amazonaws.com:443/{schema_name}?s3_staging_dir={s3_staging_dir}&driver_path={driver_path}&...

NOTE: s3_staging_dir must be URL-quoted. If access_key, secret_key, or other parameters contain special characters, they must be quoted as well.
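
For instance, quoting with the standard library looks like this (the secret value here is illustrative):

from urllib.parse import quote_plus  # PY2: from urllib import quote_plus

# The staging directory always needs quoting; so does any key or
# parameter containing special characters.
quote_plus('s3://YOUR_S3_BUCKET/path/to/')  # 's3%3A%2F%2FYOUR_S3_BUCKET%2Fpath%2Fto%2F'
quote_plus('secret/with+special=chars')     # 'secret%2Fwith%2Bspecial%3Dchars'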

Pandas

Minimal example for Pandas DataFrame:

from pyathenajdbc import connect
import pandas as pd

conn = connect(access_key='YOUR_ACCESS_KEY_ID',
               secret_key='YOUR_SECRET_ACCESS_KEY',
               s3_staging_dir='s3://YOUR_S3_BUCKET/path/to/',
               region_name='us-west-2',
               jvm_path='/path/to/jvm')  # optional, as used by JPype
df = pd.read_sql("SELECT * FROM many_rows LIMIT 10", conn)

As Pandas DataFrame:

import contextlib
from pyathenajdbc import connect
from pyathenajdbc.util import as_pandas

with contextlib.closing(
        connect(s3_staging_dir='s3://YOUR_S3_BUCKET/path/to/',
                region_name='us-west-2')) as conn:
    with conn.cursor() as cursor:
        cursor.execute("""
        SELECT * FROM many_rows
        """)
        df = as_pandas(cursor)
print(df.describe())

Examples

Redash query runner example

See examples/redash/athena.py

Credential

Supports AWS CLI credentials, instance profile credentials, and properties file credentials.

Credential Files

~/.aws/credentials

[default]
aws_access_key_id=YOUR_ACCESS_KEY_ID
aws_secret_access_key=YOUR_SECRET_ACCESS_KEY

~/.aws/config

[default]
region=us-west-2
output=json

Environment variables

$ export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY_ID
$ export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_ACCESS_KEY
$ export AWS_DEFAULT_REGION=us-west-2

Additional environment variable:

$ export AWS_ATHENA_S3_STAGING_DIR=s3://YOUR_S3_BUCKET/path/to/
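
With the credential files and environment variables above in place, the connection arguments can be omitted entirely; a minimal sketch, assuming PyAthenaJDBC picks up the staging directory from AWS_ATHENA_S3_STAGING_DIR and the keys and region from the AWS CLI configuration:

from pyathenajdbc import connect

# Assumes AWS_ATHENA_S3_STAGING_DIR, AWS_DEFAULT_REGION and the AWS CLI
# credentials above are set; nothing is passed explicitly.
conn = connect()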

Instance profile credentials

If you create an EC2 instance profile with a policy like the following and attach it to the EC2 instance, PyAthenaJDBC accesses Amazon Athena using temporary credentials.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "athena:*"
      ],
      "Resource": [
        "*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetBucketLocation",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads",
        "s3:ListMultipartUploadParts",
        "s3:AbortMultipartUpload",
        "s3:CreateBucket",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::aws-athena-query-results-*",
        "arn:aws:s3:::YOUR_S3_STAGING_DIR",
        "arn:aws:s3:::YOUR_S3_AWESOME_LOG_DATA"
      ]
    }
  ]
}

You can then connect by specifying at least s3_staging_dir and region_name in the connect method or connection object; access_key and secret_key are not necessary.

from pyathenajdbc import connect

conn = connect(s3_staging_dir='s3://YOUR_S3_BUCKET/path/to/',
               region_name='us-west-2')

Terraform Instance profile example:

See examples/terraform/

Properties file credentials

If you create a properties file in the following format and pass its path as credential_file to the connect method or connection object, PyAthenaJDBC accesses Amazon Athena using that file.

accessKeyId:YOUR_ACCESS_KEY_ID
secretKey:YOUR_SECRET_ACCESS_KEY

from pyathenajdbc import connect

conn = connect(credential_file='/path/to/AWSCredentials.properties',
               s3_staging_dir='s3://YOUR_S3_BUCKET/path/to/',
               region_name='us-west-2')

Testing

Depends on the AWS CLI credentials and the following environment variables:

~/.aws/credentials

[default]
aws_access_key_id=YOUR_ACCESS_KEY_ID
aws_secret_access_key=YOUR_SECRET_ACCESS_KEY

Environment variables

$ export AWS_DEFAULT_REGION=us-west-2
$ export AWS_ATHENA_S3_STAGING_DIR=s3://YOUR_S3_BUCKET/path/to/

Run tests

$ pip install pytest awscli
$ scripts/upload_test_data.sh
$ py.test
$ scripts/delete_test_data.sh

Run tests against multiple Python versions

$ pip install tox awscli
$ scripts/upload_test_data.sh
$ pyenv local 2.6.9 2.7.12 3.4.5 3.5.2
$ tox
$ scripts/delete_test_data.sh
