The Apache Spark adapter plugin for dbt

Project description

dbt

dbt enables data analysts and engineers to transform their data using the same practices that software engineers use to build applications.

dbt is the T in ELT. Organize, cleanse, denormalize, filter, rename, and pre-aggregate the raw data in your warehouse so that it's ready for analysis.

dbt-spark

dbt-spark enables dbt to work with Apache Spark. For more information on using dbt with Spark, consult the docs.
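
dbt-spark is published on PyPI, so installation is typically done with pip. A minimal sketch, assuming the PyHive extra is the right one for the thrift connection method used below (confirm the extra name for your version):

pip install "dbt-spark[PyHive]"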

Getting started

Review the repository README.md, as most of the information there also pertains to dbt-spark.

Running locally

A docker-compose environment starts a Spark Thrift server and a Postgres database as a Hive Metastore backend. Note: dbt-spark now supports Spark 3.3.2.

The following command starts two Docker containers:

docker-compose up -d

It will take a bit of time for the instance to start; you can check the logs of the two containers. If the instance doesn't start correctly, run the complete reset commands listed below and then try starting again.
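
For example, to follow the startup logs of both containers (the service names come from the repository's docker-compose.yml, so the unfiltered form is the safest):

docker-compose logs -f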

Create a profile like this one:

spark_testing:
  target: local
  outputs:
    local:
      type: spark
      method: thrift
      host: 127.0.0.1
      port: 10000
      user: dbt
      schema: analytics
      connect_retries: 5
      connect_timeout: 60
      retry_all: true
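
Once this profile is saved (typically in ~/.dbt/profiles.yml), you can verify the connection before running any models, for example:

dbt debug --target local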

Connecting to the local Spark instance:

  • The Spark UI should be available at http://localhost:4040/sqlserver/
  • The endpoint for SQL-based testing is at http://localhost:10000 and can be reached with the Hive or Spark JDBC drivers using the connection string jdbc:hive2://localhost:10000 and the default credentials dbt:dbt (see the example below)
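
As a quick smoke test of the SQL endpoint outside of dbt, the same connection string and credentials work with Beeline, assuming you have it installed locally:

beeline -u jdbc:hive2://localhost:10000 -n dbt -p dbt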

Note that the Hive metastore data is persisted under ./.hive-metastore/, and the Spark-produced data under ./.spark-warehouse/. To completely reset your environment, run the following:

docker-compose down
rm -rf ./.hive-metastore/
rm -rf ./.spark-warehouse/

Additional configuration for macOS

If installing on macOS, use Homebrew to install the required dependencies.

brew install unixodbc
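
unixODBC is only needed for the odbc connection method (pyodbc links against it). If that is the method you plan to use, the adapter's ODBC extra pulls in the Python driver as well (extra name taken from the project's packaging; confirm it for your version):

pip install "dbt-spark[ODBC]"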

Contribute

Download files

Download the file for your platform.

Source Distribution

dbt_spark-1.10.1.tar.gz (77.2 kB)

Built Distribution

dbt_spark-1.10.1-py3-none-any.whl (51.4 kB)

File details

Details for the file dbt_spark-1.10.1.tar.gz.

File metadata

  • Download URL: dbt_spark-1.10.1.tar.gz
  • Upload date:
  • Size: 77.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for dbt_spark-1.10.1.tar.gz
Algorithm Hash digest
SHA256 fc39d57e6bc6afc8bc75472bfd25f6d6fbfbf178c9e455e927300748270d1a5e
MD5 52108364ce421eae1b7f085b5a69cf00
BLAKE2b-256 0116fa90c03bbc83dd44b6ce3a4a06dffee224b8763a08a7bc64fe745b49e2c3

See the pip documentation for more details on using hashes.
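
For example, these digests can be pinned in a requirements file so that pip verifies the download in hash-checking mode (a sketch; the second hash is the wheel's SHA256 listed further down this page, and in full hash-checking mode every dependency also needs a hash):

dbt-spark==1.10.1 \
    --hash=sha256:fc39d57e6bc6afc8bc75472bfd25f6d6fbfbf178c9e455e927300748270d1a5e \
    --hash=sha256:6904822cad26014ecdc93d52b8d288ef10c8354f58da55d33baf702b707bd9f9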

Provenance

The following attestation bundles were made for dbt_spark-1.10.1.tar.gz:

Publisher: publish-oss.yml on dbt-labs/dbt-adapters

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file dbt_spark-1.10.1-py3-none-any.whl.

File metadata

  • Download URL: dbt_spark-1.10.1-py3-none-any.whl
  • Upload date:
  • Size: 51.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for dbt_spark-1.10.1-py3-none-any.whl
Algorithm Hash digest
SHA256 6904822cad26014ecdc93d52b8d288ef10c8354f58da55d33baf702b707bd9f9
MD5 e97cb966351e9b5ecaeb4dbded9e65cc
BLAKE2b-256 109cfd9429f4a5bb211a6a5d61000f03639d6f748a939b5ba46e041cc36f49a6

See the pip documentation for more details on using hashes.

Provenance

The following attestation bundles were made for dbt_spark-1.10.1-py3-none-any.whl:

Publisher: publish-oss.yml on dbt-labs/dbt-adapters

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
