Facilitate data engineering on the Ingenii Data Platform

Ingenii Data Engineering Package

Details

  • Current Version: 0.3.2

Overview

This package provides utilities for data engineering on Ingenii's Azure Data Platform. It can be used for local development, and is also used in the Ingenii Databricks Runtime.

Usage

Import the package to use the functions within.

import ingenii_data_engineering

dbt

Part of this package validates dbt schemas to ensure they are compatible with Databricks and the wider Ingenii Data Platform. Validation runs when a data pipeline ingests a file, to confirm that the file will be ingested correctly. Full details of how to set up the dbt schema files in your Data Engineering repository can be found in the Ingenii Data Engineering Example repository.
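As an illustration, a standard dbt sources schema file looks like the following. The source and table names are placeholders, and any Ingenii-specific fields beyond standard dbt are documented in the example repository:

```yaml
# schema.yml - a minimal dbt sources file (names are placeholders)
version: 2

sources:
  - name: example_source        # one entry per data provider
    schema: example_source
    tables:
      - name: example_table     # one entry per table to ingest
        columns:
          - name: date
            data_type: date
          - name: value
            data_type: double
```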

Pre-processing

This package contains code to facilitate the pre-processing of files before they are ingested by the data platform. This allows users to transform any data into a form that is compatible. See details of working with pre-processing functions in the Ingenii Data Engineering Example repository.
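The exact interface the platform expects for pre-processing scripts is described in the example repository; as a generic sketch, a pre-processing function that normalises a raw CSV file before ingestion might look like this (the function name and signature are assumptions, not the package's actual API):

```python
import csv

def pre_process(input_path: str, output_path: str) -> None:
    """Illustrative transformation: lower-case the column headers and
    strip surrounding whitespace from every value, so the output is in
    a form the platform can ingest."""
    with open(input_path, newline="") as src, \
         open(output_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        # Normalise the header row
        header = next(reader)
        writer.writerow(h.strip().lower() for h in header)
        # Clean every data row
        for row in reader:
            writer.writerow(cell.strip() for cell in row)
```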

This package also contains the code to turn the pre-processing scripts into a package, ready to be uploaded and used by the Data Platform. Once this package is installed, the command

python -m <package name> <command> <folder with pre-processing code>
python -m ingenii_data_engineering pre_processing_package pre_process

will generate a .whl file in a folder called dist/. For more details, see the Ingenii Data Engineering Example repository.

Development

Prerequisites

  1. A working knowledge of git SCM
  2. Installation of Python 3.7.3

Set up

  1. Complete the 'Getting Started > Prerequisites' section
  2. For Windows only:
  3. Run make setup to copy the .env file into place (.env-dist > .env)

Getting started

  1. Complete the 'Getting Started > Set up' section

  2. From the root of the repository, in a terminal (preferably in your IDE) run the following commands to set up a virtual environment:

    python -m venv venv
    . venv/bin/activate
    pip install -r requirements-dev.txt
    pre-commit install
    

    or for Windows:

    python -m venv venv
    . venv/Scripts/activate
    pip install -r requirements-dev.txt
    pre-commit install
    
  3. Note: if you get a permission denied error when running pre-commit install, run chmod -R 775 venv/bin/ to recursively update permissions in the venv/bin/ directory

  4. The following checks are run as part of the pre-commit hooks: flake8 (note: unit tests are not run as a hook)
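A pre-commit configuration for the flake8 hook mentioned above typically looks like this (the rev is illustrative; pin it to the version your repository uses):

```yaml
# .pre-commit-config.yaml - illustrative flake8 hook configuration
repos:
  - repo: https://github.com/pycqa/flake8
    rev: 3.9.2          # pin to the version in your repository
    hooks:
      - id: flake8
```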

Building

  1. Complete the 'Getting Started > Set up' section
  2. Run make build to create the package in ./dist
  3. Run make clean to remove dist files

Testing

  1. Complete the 'Getting Started > Set up' and 'Development' sections
  2. Run make test to run the unit tests using pytest
  3. Run flake8 to run lint checks using flake8
  4. Run make qa to run the unit tests and linting in a single command
  5. Run make clean to remove pytest cache files
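A unit test picked up by make test would follow the usual pytest conventions; the sketch below is illustrative only, and the helper it exercises is a stand-in rather than the package's actual API:

```python
# tests/test_pre_process.py - illustrative pytest-style unit test;
# `normalise_header` is a hypothetical helper, not part of the package.

def normalise_header(header):
    """Toy stand-in: lower-case and strip a CSV header row."""
    return [h.strip().lower() for h in header]

def test_header_is_normalised():
    assert normalise_header([" Name ", " Value "]) == ["name", "value"]
```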

Version History

  • 0.3.2: Further bugfix for JSON UTF-8 BOM
  • 0.3.1: Remove unnecessary functions specific to Databricks
  • 0.3.0: Create pre-processing package using the module
  • 0.2.1: Handle JSON read UTF-8 BOM
  • 0.2.0: Pre-processing happens all in the 'archive' container
  • 0.1.5: Better functionality for column names in .csv files
  • 0.1.4: Handle JSON files
  • 0.1.3: Adding pre-processing utilities
  • 0.1.2: Rearrangement and better split of work with the Databricks Runtime. Better validation
  • 0.1.1: Minor bug fixes
  • 0.1.0: dbt schema validation, pre-processing class

