
spetlr

A python ETL libRary (SPETLR) for Databricks powered by Apache SPark.

Visit SPETLR official webpage: https://spetlr.com/

NEWS

Support for LTS9.1 is ending. See the issue for discussions.

TransformerNC class will be removed permanently. Follow the PR.

Description

SPETLR offers a broad set of tools for working with ETL in Databricks. To help you decide whether SPETLR fits your needs, here are its core features:

  • ETL framework: A common ETL framework that enables reusable transformations in an object-oriented manner. Standardized structures facilitate cooperation in large teams.

  • Integration testing: A framework for creating test databases and tables before deploying to production in order to ensure reliable and stable data platforms. An additional layer of data abstraction allows full integration testing.

  • Handlers: Standard connectors with commonly used options reduce boilerplate.
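To make the ETL-framework bullet concrete, here is a minimal, hypothetical sketch of the object-oriented pattern such a framework encourages. The class names below are illustrative and are not SPETLR's actual API:

```python
from abc import ABC, abstractmethod

class Transformer(ABC):
    """One reusable, independently testable transformation step."""
    @abstractmethod
    def process(self, data):
        ...

class Orchestrator:
    """Chains extract -> transform* -> load into a single pipeline."""
    def __init__(self, extract, transformers, load):
        self.extract = extract
        self.transformers = transformers
        self.load = load

    def execute(self):
        data = self.extract()
        for step in self.transformers:
            data = step.process(data)
        return self.load(data)

class Uppercase(Transformer):
    """A trivial transformation, standing in for real Spark logic."""
    def process(self, data):
        return [row.upper() for row in data]

pipeline = Orchestrator(
    extract=lambda: ["a", "b"],        # a real extractor would read a table
    transformers=[Uppercase()],
    load=lambda data: data,            # a real loader would write a table
)
print(pipeline.execute())  # ['A', 'B']
```

Because each transformer is a small class with one method, large teams can share, review, and unit-test steps in isolation before composing them into pipelines.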

For more information, visit SPETLR official webpage: https://spetlr.com/

Important Notes

This package cannot be run or tested without access to pyspark. However, including pyspark in our installer caused conflicts when other versions of pyspark were needed. Hence, we removed the pyspark dependency from our installer; install a suitable pyspark version yourself.

Installation

Install SPETLR from PyPI:

pip install spetlr

Development Notes

To prepare for development, please install these additional requirements:

  • Java 8
  • pip install -r test_requirements.txt

Then install the package locally:

python setup.py develop

Testing

Local tests

After installing the dev-requirements, execute tests by running:

pytest tests

These tests are located in the ./tests/local folder and only require a Python interpreter. Pull requests will not be accepted if these tests do not pass. If you add new features, please include corresponding tests.
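As an illustration of what a local test can look like, the file below needs only a Python interpreter: keep the logic under test free of Spark sessions so it runs anywhere. The file name and function are hypothetical, not taken from the repo:

```python
# tests/local/test_example.py -- hypothetical example of a local test.

def add_ingestion_prefix(columns):
    """Pure helper that a transformation might use to rename columns."""
    return [f"src_{c}" for c in columns]

def test_add_ingestion_prefix():
    assert add_ingestion_prefix(["id", "name"]) == ["src_id", "src_name"]

if __name__ == "__main__":
    test_add_ingestion_prefix()
    print("ok")
```

pytest discovers any `test_*.py` file under the tests folder automatically, so new tests only need to follow that naming convention.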

Cluster tests

Tests in the ./tests/cluster folder are designed to run on a Databricks cluster. The pre-integration tests rely on Azure resource deployment and can only be run by the spetlr-org admins.

To deploy the necessary Azure resources to your own Azure Tenant, run the following command:

.\.github\deploy\deploy.ps1 -uniqueRunId "yourUniqueId"

Be aware that uniqueRunId may only contain lowercase letters and numbers, and its length must not exceed 12 characters.
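The constraint on uniqueRunId can be checked before deploying; this small helper is illustrative only and is not part of the deployment scripts:

```python
import re

def valid_unique_run_id(run_id: str) -> bool:
    """Check the documented constraint: only lowercase letters and
    digits, between 1 and 12 characters (illustrative helper)."""
    return bool(re.fullmatch(r"[a-z0-9]{1,12}", run_id))

print(valid_unique_run_id("myrun01"))         # True
print(valid_unique_run_id("MyRun01"))         # False: uppercase letter
print(valid_unique_run_id("averylongrunid"))  # False: 14 characters
```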

Afterward, execute the following commands:

.\.github\submit\build.ps1
.\.github\submit\submit_test_job.ps1

General Project Info


Contributing

Feel free to contribute to SPETLR. All contributions are appreciated, not only new features but any improvement you find for SPETLR.

If you have a suggestion that can enhance SPETLR, please fork the repository and create a pull request. Alternatively, you can open an issue with the "enhancement" tag.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/NewSPETLRFeature)
  3. Commit your Changes (git commit -m 'Add some SPETLRFeature')
  4. Push to the Branch (git push origin feature/NewSPETLRFeature)
  5. Open a Pull Request


Releases

Releases to PyPI are handled by a GitHub Action that must be triggered manually.


Requirements and dependencies

The library has three txt-files at the root of the repo. These files define three levels of requirements:

  • requirements_install.txt - the libraries required to install spetlr.
  • requirements_test.txt - the libraries required to run unit and integration tests.
  • requirements_dev.txt - the libraries required to develop and contribute to the repo.

All libraries and their dependencies are pinned to a fixed version in the configuration file setup.cfg, based on the requirements defined in requirements_install.txt.
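For illustration, the pinned result in setup.cfg could look like the fragment below; the package names and version numbers are placeholders, not SPETLR's real dependencies:

```ini
# Illustrative only: names and pins below are placeholders.
[options]
install_requires =
    some-connector==1.2.3
    some-azure-client==4.5.6
```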

To upgrade the dependencies in the setup.cfg file, do the following:

  1. Create a new branch
  2. Run upgrade_requirements.ps1 in your terminal
  3. Commit the changes the script has made to the cfg file. If there are no changes, everything is up to date.
  4. Create a PR; its pipeline runs all tests and ensures that the library is compliant with the updates

Note that if you want to upgrade a dependency, but not to its newest version, you can set the desired version in requirements_install.txt; the upgrade script will respect it.

Contact

For any inquiries, please use the SPETLR Discord Server.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

spetlr-5.1.16.tar.gz (183.2 kB)


Built Distribution

spetlr-5.1.16-py3-none-any.whl (165.0 kB)


File details

Details for the file spetlr-5.1.16.tar.gz.

File metadata

  • Download URL: spetlr-5.1.16.tar.gz
  • Upload date:
  • Size: 183.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.0 CPython/3.12.5

File hashes

Hashes for spetlr-5.1.16.tar.gz

  • SHA256: 89cd5dbe4f853aae9e8fe46d335e317551a20a67f9bd15032d00dc5a40a2ce72
  • MD5: d2858f114e003f64bc37fc8ace7388e1
  • BLAKE2b-256: a12fc7894c94e153e2034887448b1a6516ea31fc8d663a940835530efd2dc93d


File details

Details for the file spetlr-5.1.16-py3-none-any.whl.

File metadata

  • Download URL: spetlr-5.1.16-py3-none-any.whl
  • Upload date:
  • Size: 165.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.0 CPython/3.12.5

File hashes

Hashes for spetlr-5.1.16-py3-none-any.whl

  • SHA256: 99db20d9403561e345f7ae19d83cc4c9e6390cfe740de2bf99cd2fb11a1e7cd5
  • MD5: 932b191509ff8cdfa4d8966ee2a844e9
  • BLAKE2b-256: a6c045ff98c62444efa36d8f63750ce5a5764528cea1914ecf750c1ef499bff3

