
Data Linter

A Python package that allows automatic validation of data as part of a data engineering pipeline. It is designed to automate the process of moving data from Land to Raw-History as described in the ETL pipeline guide.

The validation is based on the goodtables package, from the fine folk at Frictionless Data. More information can be found at their website.

Installation

pip install data_linter

Usage

This package takes a yaml based config file written by the user (see example below), and validates data in the specified Land bucket against specified metadata. If the data conforms to the metadata, it is moved to the specified Raw bucket for the next step in the pipeline. Any failed checks are passed to a separate bucket for testing. The package also generates logs to allow you to explore issues in more detail.

To run the validation, at its simplest you can use the following:

from data_linter import run_validation

# Path to the YAML config file described below
config_path = "config.yaml"

run_validation(config_path)
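
In a pipeline step you might wrap the same call with standard logging. A minimal sketch, assuming nothing beyond the run_validation(config_path) call shown above (the caller-side logging setup here is illustrative and not part of the package):

import logging

from data_linter import run_validation

logging.basicConfig(level=logging.INFO)
log = logging.getLogger(__name__)

if __name__ == "__main__":
    config_path = "config.yaml"  # the YAML config described below
    log.info("Validating data defined in %s", config_path)
    run_validation(config_path)
    log.info("Validation step finished")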

Example config file

land-base-path: s3://land-bucket/my-folder/  # Where to get the data from
fail-base-path: s3://fail-bucket/my-folder/  # Where to write the data if failed
pass-base-path: s3://pass-bucket/my-folder/  # Where to write the data if passed
log-base-path: s3://log-bucket/my-folder/  # Where to write logs
compress-data: true  # Compress data when moving elsewhere
remove-tables-on-pass: true  # Delete the tables in land if validation passes
all-must-pass: true  # Only move data if all tables have passed
fail-unknown-files:
    exceptions:
        - additional_file.txt
        - another_additional_file.txt

# Tables to validate
tables:
    table1:
        required: true  # Whether the table must exist
        pattern: null  # Assumes file is called table1
        metadata: meta_data/table1.json
        linter: goodtables

    table2:
        required: true
        pattern: ^table2
        metadata: meta_data/table2.json
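
The pattern values above appear to be regular expressions matched against incoming file names (when pattern is null, the file is assumed to be named after the table, as noted in the comment). As a rough illustration using Python's re module, with hypothetical file names; the exact matching semantics are defined by the package:

import re

pattern = re.compile(r"^table2")  # the pattern given for table2 above

# Hypothetical file names arriving in the land bucket
for name in ["table2_2020-08-01.csv", "table2.jsonl", "not_table2.csv"]:
    print(name, "matches" if pattern.match(name) else "does not match")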

How to update

We have tests that run on the current state of the poetry.lock file (i.e. the current dependencies). We also run tests based on the most up-to-date dependencies allowed in pyproject.toml. This allows us to see whether there will be any issues when updating dependencies. These can be run locally from the tests folder; see the sketch below.
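
For example, assuming the suite uses pytest (an assumption; substitute the project's actual test runner if it differs):

poetry install            # install the locked dependencies
poetry run pytest tests/  # assumes pytest as the test runner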

When updating this package, make sure to change the version number in pyproject.toml and describe the change in CHANGELOG.md.

If you have changed any dependencies in pyproject.toml, run poetry update to update poetry.lock.

Once you have created a release in GitHub, to publish the latest version to PyPI, run:

poetry build
poetry publish -u <username>

Here, substitute <username> with your own PyPI username. In order to publish to PyPI, you must be an owner of the project.

Process Diagram

[Diagram: how the validation logic works]
