dbt-coves


What is dbt-coves?

dbt-coves is a complementary CLI tool for dbt that allows users to quickly apply Analytics Engineering best practices.

dbt-coves helps generate scaffolding for dbt by analyzing your data warehouse schema in Redshift, Snowflake, or BigQuery and creating the necessary configuration files (sql and yml).

⚠️ dbt-coves is in alpha. Make sure to test it with your dbt project version and data warehouse before using it in production.

Here's the tool in action:

[demo animation]

Supported dbt versions

Version   Status
< 1.0     ❌ Not supported
>= 1.0    ✅ Tested

Supported adapters

Feature                              Snowflake   Redshift         BigQuery
dbt project setup                    ✅ Tested    🕥 In progress    ❌ Not tested
source model (sql) generation        ✅ Tested    🕥 In progress    ❌ Not tested
model properties (yml) generation    ✅ Tested    🕥 In progress    ❌ Not tested

Installation

pip install dbt-coves

We recommend using Python virtual environments and creating a separate environment per project.
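
For example, a minimal sketch using Python's built-in venv module in a Unix-like shell:

python -m venv .venv            # create a project-local virtual environment
source .venv/bin/activate       # activate it
pip install dbt-coves           # install dbt-coves into that environment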

Main Features

For complete usage details, run:

dbt-coves -h
dbt-coves <command> -h

Environment setup

You can set up your environment in two ways: all at once, or one component at a time.

dbt-coves setup all

Runs a set of checks in your local environment and helps you configure every project component properly: SSH keys, git, and dbt.

You can also configure individual components:

dbt-coves setup git

Sets up the git repository of the dbt-coves project.

dbt-coves setup dbt

Sets up dbt within the project (delegates to dbt init).

dbt-coves setup ssh

Sets up SSH keys for the dbt-coves project. Supports the --open_ssl_public_key argument, which generates an extra public key in OpenSSL format, useful for configuring certain providers (e.g. Snowflake authentication).
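
For example, to also generate the OpenSSL-format public key:

dbt-coves setup ssh --open_ssl_public_key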

Models generation

dbt-coves generate <resource>

Where <resource> could be sources or properties.

A code generation tool that makes it easy to generate models and model properties based on configuration and existing data.

Supports Jinja templates to adjust how the resources are generated.
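
For example, either of the resources mentioned above can be generated directly:

dbt-coves generate sources
dbt-coves generate properties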

Arguments

dbt-coves generate sources supports the following args:

--sources-destination
# Where sources yml files will be generated, default: 'models/staging/{{schema}}/sources.yml'
--models-destination
# Where models sql files will be generated, default: 'models/staging/{{schema}}/{{relation}}.sql'
--model-props-destination
# Where models yml files will be generated, default: 'models/staging/{{schema}}/{{relation}}.yml'
--update-strategy
# Action to perform when a property file already exists: 'update', 'recreate', 'fail', 'ask' (per file)
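
For example, an illustrative invocation that overrides the update strategy and the models destination (the path shown is just the documented default):

dbt-coves generate sources \
  --models-destination "models/staging/{{schema}}/{{relation}}.sql" \
  --update-strategy ask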

Metadata

Supports the --metadata argument, which lets you specify a CSV file containing field types and descriptions to be inserted into the model property files.

dbt-coves generate sources --metadata metadata.csv

Metadata format:

database   schema   relation   column   key         type      description
raw        master   person     name     (empty)     varchar   The full name
raw        master   person     name     groupName   varchar   The group name
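
As a sketch, assuming a plain comma-separated file whose header row matches the columns above (an empty key field is simply left blank), metadata.csv could look like this:

database,schema,relation,column,key,type,description
raw,master,person,name,,varchar,The full name
raw,master,person,name,groupName,varchar,The group name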

Extract configuration from Airbyte

dbt-coves extract airbyte

Extracts the configuration from your Airbyte sources, connections, and destinations (excluding credentials) and stores it in the specified folder. The main goal of this feature is to track configuration changes in your git repo and roll back to a specific version when needed.

Load configuration to Airbyte

dbt-coves load airbyte

Loads the Airbyte configuration generated with dbt-coves extract airbyte onto an Airbyte server. The secrets folder needs to be specified separately. You can use git-secret to encrypt the secrets and make them part of your git repo.

Settings

dbt-coves can optionally read settings from .dbt_coves.yml or .dbt_coves/config.yml. A standard settings file could look like this:

generate:
  sources:
    database: RAW
    schemas:
      - RAW
    sources_destination: "models/staging/{{schema}}/sources.yml"
    models_destination: "models/staging/{{schema}}/{{relation}}.sql"
    model_props_destination: "models/staging/{{schema}}/{{relation}}.yml"
    update_strategy: ask
    # override default templates by creating source_model_props.yml and source_model.sql under this folder
    templates_folder: ".dbt_coves/templates"

extract:
  airbyte:
    path: /config/workspace/load
    host: http://airbyte-server
    port: 8001
    dbt_list_args: --exclude source:dbt_artifacts

In this example, options are provided for both the generate and extract commands. The generate options are:

schemas: List of schema names in which to look for source tables

In the destination paths below, {{schema}} represents the lowercased schema name and {{relation}} the lowercased table name.

sources_destination: Where sources yml files will be generated

models_destination: Where models sql files will be generated

model_props_destination: Where models yml files will be generated

update_strategy: Action to perform when a property file already exists

templates_folder: Folder where source generation jinja templates are located.
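
As an illustration, with the default destinations above, a source table PERSON in schema MASTER of database RAW would generate:

models/staging/master/sources.yml
models/staging/master/person.sql
models/staging/master/person.yml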

Override source generation templates

To customize the generated models and model properties, place files like the following under the templates_folder directory:

source_model.sql

with raw_source as (

    select *
    from {% raw %}{{{% endraw %} source('{{ relation.schema.lower() }}', '{{ relation.name.lower() }}') {% raw %}}}{% endraw %}

),

final as (

    select
{%- if adapter_name == 'SnowflakeAdapter' %}
{%- for key, cols in nested.items() %}
  {%- for col in cols %}
        {{ key }}:{{ '"' + col + '"' }}::{{ cols[col]["type"] }} as {{ cols[col]["id"] }}{% if not loop.last or columns %},{% endif %}
  {%- endfor %}
{%- endfor %}
{%- elif adapter_name == 'BigQueryAdapter' %}
{%- for key, cols in nested.items() %}
  {%- for col in cols %}
        cast({{ key }}.{{ col }} as {{ cols[col]["type"].replace("varchar", "string") }}) as {{ cols[col]["id"] }}{% if not loop.last or columns %},{% endif %}
  {%- endfor %}
{%- endfor %}
{%- elif adapter_name == 'RedshiftAdapter' %}
{%- for key, cols in nested.items() %}
  {%- for col in cols %}
        {{ key }}.{{ col }}::{{ cols[col]["type"] }} as {{ cols[col]["id"] }}{% if not loop.last or columns %},{% endif %}
  {%- endfor %}
{%- endfor %}
{%- endif %}
{%- for col in columns %}
        {{ '"' + col['name'] + '"' }} as {{ col['id'] }}{% if not loop.last %},{% endif %}
{%- endfor %}

    from raw_source

)

select * from final

source_props.yml

version: 2

sources:
  - name: {{ relation.schema.lower() }}
{%- if source_database %}
    database: {{ source_database }}
{%- endif %}
    tables:
      - name: {{ relation.name.lower() }}

source_model_props.yml

version: 2

models:
  - name: {{ model.lower() }}
    columns:
{%- for cols in nested.values() %}
  {%- for col in cols %}
      - name: {{ cols[col]["id"] }}
      {%- if cols[col]["description"] %}
        description: "{{ cols[col]['description'] }}"
      {%- endif %}
  {%- endfor %}
{%- endfor %}
{%- for col in columns %}
      - name: {{ col['id'] }}
      {%- if col['description'] %}
        description: "{{ col['description'] }}"
      {%- endif %}
{%- endfor %}

model_props.yml

version: 2

models:
  - name: {{ model.lower() }}
    columns:
{%- for col in columns %}
      - name: {{ col['id'] }}
      {%- if col['description'] %}
        description: "{{ col['description'] }}"
      {%- endif %}
{%- endfor %}

Thanks

The project's main structure was inspired by dbt-sugar. Special thanks to Bastien Boutonnet for the great work done.

Authors

About

Learn more about Datacoves.
