
A Python package to query data via Amazon Athena and bring it into a pandas DataFrame, using AWS Wrangler.

Project description

pydbtools

A package that is used to run SQL queries specifically configured for the Analytical Platform. This package uses AWS Wrangler's Athena module but adds additional functionality (such as Jinja templating and creating temporary tables) and alters some configuration to our specification.

Installation

Requires a pip version above 20.

## To install from pypi
pip install pydbtools

## Or install from git with a specific release
pip install "pydbtools @ git+https://github.com/moj-analytical-services/pydbtools@v4.0.1"

Quickstart guide

The examples directory contains more detailed notebooks demonstrating the use of this library, many of which are borrowed from the mojap-aws-tools-demo repo.

Read an Athena SQL query into a pandas DataFrame

import pydbtools as pydb
df = pydb.read_sql_query("SELECT * from a_database.table LIMIT 10")

Run a query in Athena

response = pydb.start_query_execution_and_wait("CREATE DATABASE IF NOT EXISTS my_test_database")

Create a temporary table so you can run further SQL queries against it later

pydb.create_temp_table("SELECT a_col, count(*) as n FROM a_database.table GROUP BY a_col", table_name="temp_table_1")
df = pydb.read_sql_query("SELECT * FROM __temp__.temp_table_1 WHERE n < 10")

Write a pandas DataFrame to a temporary table so it can be queried with SQL

pydb.dataframe_to_temp_table(my_dataframe, "my_table")
df = pydb.read_sql_query("select * from __temp__.my_table where year = 2022")

Notes

  • Amazon Athena uses a flavour of SQL called Trino; see the Trino documentation for details
  • To query a date column in Athena you need to specify that your value is a date, e.g. SELECT * FROM db.table WHERE date_col > date '2018-12-31'
  • To query a datetime or timestamp column in Athena you need to specify that your value is a timestamp, e.g. SELECT * FROM db.table WHERE datetime_col > timestamp '2018-12-31 23:59:59'
  • Note the date and datetime formatting used above; see the Trino documentation for more specifics on dates and datetimes
  • To specify a string in an SQL query, always use '' not "". Double quotes ("") mean you are referencing a database, table, or column name
  • If you are working in an environment where you cannot change the default AWS region environment variables, you can set AWS_ATHENA_QUERY_REGION, which will override them.
  • You can override the bucket that query results are written to with the ATHENA_QUERY_DUMP_BUCKET environment variable. This is mandatory if you set the region to something other than eu-west-1. A combined example follows this list.
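
As a worked example of the notes above, the sketch below sets the optional environment variables before querying and filters on date, timestamp, and string columns. The region, bucket, table, and column names are illustrative assumptions.

import os

# Illustrative values; set these before running any queries.
os.environ["AWS_ATHENA_QUERY_REGION"] = "eu-west-2"
# Mandatory when the query region is not eu-west-1; the bucket name is a placeholder.
os.environ["ATHENA_QUERY_DUMP_BUCKET"] = "my-athena-query-results"

import pydbtools as pydb

# Date and timestamp values must be tagged with their type, and string literals use single quotes.
df = pydb.read_sql_query(
    """
    SELECT *
    FROM a_database.table
    WHERE date_col > date '2018-12-31'
      AND datetime_col > timestamp '2018-12-31 23:59:59'
      AND a_string_col = 'some value'
    """
)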

See changelog for release changes.



Download files

Download the file for your platform.

Source Distribution

pydbtools-5.6.0.tar.gz (11.9 kB)


Built Distribution

pydbtools-5.6.0-py3-none-any.whl (12.4 kB)


File details

Details for the file pydbtools-5.6.0.tar.gz.

File metadata

  • Download URL: pydbtools-5.6.0.tar.gz
  • Upload date:
  • Size: 11.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.0 CPython/3.12.4

File hashes

Hashes for pydbtools-5.6.0.tar.gz
  • SHA256: 3fc69616d1394f411b6bdeda35a2dd3844199ebfe3e1787f4274dde927d36c01
  • MD5: 613ab9a0271928ccd596d8a61aec6834
  • BLAKE2b-256: 1e7cc7d756e04793f351f8c57fd8c4b60b4292a51d216b0c3aad53efc7291daa

See more details on using hashes here.

File details

Details for the file pydbtools-5.6.0-py3-none-any.whl.

File metadata

  • Download URL: pydbtools-5.6.0-py3-none-any.whl
  • Upload date:
  • Size: 12.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.0 CPython/3.12.4

File hashes

Hashes for pydbtools-5.6.0-py3-none-any.whl
  • SHA256: f4519225ac8c1cd6a6ca181c0b5da5e0100f61d48939f3111c9daa95401e9b26
  • MD5: ecb9acfdeacc3a169d71ce7012c1659b
  • BLAKE2b-256: c02d18f3450b61d2e67b664cbfcd1a3270ff2dfcef93ad4661983f20577a3fa6

See more details on using hashes here.
