
The Athena adapter plugin for dbt (data build tool)


dbt-athena

  • Supports dbt version 1.0.*
  • Supports Seeds
  • Correctly detects views and their columns
  • Supports incremental models
    • Supports two incremental update strategies: insert_overwrite and append
    • Does not support the use of unique_key
  • Only supports Athena engine 2

Installation

  • pip install dbt-athena-adapter
  • Or pip install git+https://github.com/Tomme/dbt-athena.git

Prerequisites

To start, you will need an S3 bucket (for instance, my-staging-bucket) and an Athena database:

CREATE DATABASE IF NOT EXISTS analytics_dev
COMMENT 'Analytics models generated by dbt (development)'
LOCATION 's3://my-staging-bucket/'
WITH DBPROPERTIES ('creator'='Foo Bar', 'email'='foo@bar.com');

Notes:

  • Take note of your AWS region code (e.g. us-west-2 or eu-west-2).
  • You can also use AWS Glue to create and manage Athena databases.

Credentials

This plugin does not accept any credentials directly. Instead, credentials are resolved automatically following the standard AWS CLI/boto3 conventions (environment variables, the shared credentials file, assumed roles, and so on). You can configure which AWS profile to use via aws_profile_name; see the profile configuration below for details.
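For example, with named profiles in an AWS shared credentials file (typically ~/.aws/credentials), the profile referenced by aws_profile_name might look like the following minimal sketch; the profile name and keys are placeholders, not real credentials:

[my-profile]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY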

Configuring your profile

A dbt profile can be configured to run against AWS Athena using the following configuration:

Option           | Description                                                                   | Required? | Example
---------------- | ----------------------------------------------------------------------------- | --------- | -------------------
s3_staging_dir   | S3 location to store Athena query results and metadata                        | Required  | s3://bucket/dbt/
region_name      | AWS region of your Athena instance                                            | Required  | eu-west-1
schema           | Specify the schema (Athena database) to build models into (lowercase only)    | Required  | dbt
database         | Specify the database (Data catalog) to build models into (lowercase only)     | Required  | awsdatacatalog
poll_interval    | Interval in seconds to use for polling the status of query results in Athena  | Optional  | 5
aws_profile_name | Profile to use from your AWS shared credentials file                          | Optional  | my-profile
work_group       | Identifier of Athena workgroup                                                | Optional  | my-custom-workgroup
num_retries      | Number of times to retry a failing query                                      | Optional  | 3

Example profiles.yml entry:

athena:
  target: dev
  outputs:
    dev:
      type: athena
      s3_staging_dir: s3://athena-query-results/dbt/
      region_name: eu-west-1
      schema: dbt
      database: awsdatacatalog
      aws_profile_name: my-profile
      work_group: my-workgroup
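
Once the profile is in place, you can verify that dbt can connect to Athena with dbt's built-in debug command:

dbt debug --target dev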

Additional information

  • threads is supported
  • database and catalog can be used interchangeably
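
For example, extending the dev output from the example above, threads lets dbt build several models concurrently; the value 4 here is illustrative:

    dev:
      type: athena
      threads: 4
      s3_staging_dir: s3://athena-query-results/dbt/
      region_name: eu-west-1
      schema: dbt
      database: awsdatacatalog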

Usage notes

Models

Table Configuration

  • external_location (default=none)
    • The location where Athena saves your table in Amazon S3
    • If none, the location defaults to {s3_staging_dir}/tables
    • If set to a static value, the underlying data will be cleaned up and overwritten by new data each time your table/partition is recreated
  • partitioned_by (default=none)
    • A list of columns by which the table will be partitioned
    • Currently limited to the creation of 100 partitions
  • bucketed_by (default=none)
    • A list of columns to bucket the data by
  • bucket_count (default=none)
    • The number of buckets for bucketing your data
  • format (default='parquet')
    • The data format for the table
    • Supports ORC, PARQUET, AVRO, JSON, or TEXTFILE
  • write_compression (default=none)
    • The compression type to use for any storage format that allows compression to be specified. To see which options are available, check out CREATE TABLE AS
  • field_delimiter (default=none)
    • Custom field delimiter, for when format is set to TEXTFILE

More information: CREATE TABLE AS
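
Putting several of these options together, a table model could be configured as in the following sketch. The query, column names, S3 path, and source are illustrative:

{{ config(
    materialized='table',
    format='parquet',
    partitioned_by=['dt'],
    external_location='s3://my-staging-bucket/tables/page_views/'
) }}

-- Athena CTAS requires partition columns to come last in the select list
select
    page_id,
    count(*) as views,
    date(event_time) as dt
from {{ source('my_source', 'events') }}
group by page_id, date(event_time)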

Supported functionality

Support for incremental models:

  • Supports two incremental update strategies with partitioned tables: insert_overwrite and append (see the sketch after this list)
  • Does not support the use of unique_key
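
As a sketch of what this looks like in practice, the following incremental model uses the insert_overwrite strategy on a partitioned table. The model, source, and column names are illustrative, and it assumes the strategy is selected via dbt's standard incremental_strategy config:

{{ config(
    materialized='incremental',
    incremental_strategy='insert_overwrite',
    partitioned_by=['dt']
) }}

select
    event_id,
    user_id,
    date(event_time) as dt
from {{ source('my_source', 'events') }}
{% if is_incremental() %}
  -- on incremental runs, only rebuild partitions newer than those already loaded
  where date(event_time) > (select max(dt) from {{ this }})
{% endif %}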

Due to the nature of AWS Athena, not all core dbt functionality is supported. The following features of dbt are not implemented on Athena:

  • Snapshots

Known issues

  • Quoting is not currently supported

    • If you need to quote your sources, escape the quote characters in your source definitions:
    version: 2
    
    sources:
      - name: my_source
        tables:
          - name: first_table
            identifier: "first table"       # Not like that
          - name: second_table
            identifier: "\"second table\""  # Like this
    
  • Table, schema, and database names should be lowercase only

  • Only supports Athena engine 2

Running tests

First, install the adapter and its dependencies using make (see Makefile):

make install_deps

Next, configure the environment variables in dev.env to match your Athena development environment. Finally, run the tests using make:

make run_tests



