simple data validation

Project description

data_check

data_check is a simple data validation tool. In its most basic form, it executes SQL queries and compares the results against CSV or Excel files. But there are more advanced features:

Features

Database support

data_check is tested with these databases:

  • PostgreSQL
  • MySQL
  • SQLite
  • Oracle
  • Microsoft SQL Server

Partially supported:

  • DuckDB
  • Databricks

Other databases supported by SQLAlchemy might also work.

Quickstart

You need Python 3.9 or above to run data_check. The easiest way to install data_check is via pipx:

pipx install data-check
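
Alternatively, if you manage virtual environments yourself, a plain pip install should work just as well, since data-check is a regular PyPI package:

python3 -m venv .venv
source .venv/bin/activate
pip install data-check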

The data_check Git repository is also a sample data_check project. Clone the repository, switch to the example folder and run data_check:

git clone git@github.com:andrjas/data_check.git
cd data_check/example
data_check

This will run the tests in the checks folder using the default connection as set in data_check.yml.
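
For reference, a minimal data_check.yml might look like the following sketch. The two keys follow the data_check documentation; the connection name and the SQLite URL are just placeholders:

default_connection: test
connections:
    test: sqlite+pysqlite:///example.db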

See the documentation for how to install data_check in different environments with additional database drivers, and for other ways to use data_check.

Project layout

data_check has a simple layout for projects: a single configuration file and a folder with the test files. You can also organize the test files in subfolders.

data_check.yml    # The configuration file
checks/           # Default folder for data tests
    some_test.sql # SQL file with the query to run against the database
    some_test.csv # CSV file with the expected result
    subfolder/    # Tests can be nested in subfolders

CSV checks

This is the default mode when running data_check. data_check expects a SQL file and a CSV file with the same base name. The SQL file is executed against the database and the result is compared with the CSV file. If they match, the test passes; otherwise it fails.
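
For example, a minimal check could consist of the following pair of files (the query and the expected data are purely illustrative):

some_test.sql:

select 1 as id, 'foo' as name

some_test.csv:

id,name
1,foo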

Pipelines

If data_check finds a file named data_check_pipeline.yml in a folder, it will treat this folder as a pipeline check. Instead of running CSV checks, it will execute the steps in the YAML file.

Example project with a pipeline:

data_check.yml
checks/
    some_test.sql                # this test will run in parallel to the pipeline test
    some_test.csv
    sample_pipeline/
        data_check_pipeline.yml  # configuration for the pipeline
        data/
            my_schema.some_table.csv       # data for a table
        data2/
            some_data.csv        # other data
        some_checks/             # folder with CSV checks
            check1.sql
            check1.csv
            ...
        run_this.sql             # a SQL file that will be executed
        cleanup.sql
    other_pipeline/              # you can have multiple pipelines that will run in parallel
        data_check_pipeline.yml
        ...

The file sample_pipeline/data_check_pipeline.yml can look like this:

steps:
    # this will truncate the table my_schema.some_table and load it with the data from data/my_schema.some_table.csv
    - load: data
    # this will execute the SQL statement in run_this.sql
    - sql: run_this.sql
    # this will append the data from data2/some_data.csv to my_schema.other_table
    - load:
        file: data2/some_data.csv
        table: my_schema.other_table
        mode: append
    # this will run a python script and pass the connection name
    - cmd: "python3 /path/to/my_pipeline.py --connection {{CONNECTION}}"
    # this will run the CSV checks in the some_checks folder
    - check: some_checks

Pipeline checks and simple CSV checks can coexist in a project.
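
To run only part of a project, you can pass individual files or folders to data_check on the command line. This is a sketch based on the example layout above; see data_check --help for the exact usage:

data_check checks/some_test.sql
data_check checks/sample_pipeline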

Documentation

See the documentation for how to set up data_check, how to create a new project, and more options.

License

MIT

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

data_check-0.18.0.tar.gz (42.6 kB)

Built Distribution

data_check-0.18.0-py3-none-any.whl (66.8 kB)

File details

Details for the file data_check-0.18.0.tar.gz.

File metadata

  • Download URL: data_check-0.18.0.tar.gz
  • Size: 42.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.2

File hashes

Hashes for data_check-0.18.0.tar.gz:

  • SHA256: 27be7080072124a183622d04718ad4eee01c4177409e1cd050ead02392916f36
  • MD5: da06644a8936c36457bc6bdcc7662693
  • BLAKE2b-256: 658e16ccc8deaf5bda3a19dc76b7c157296edfc39d5aea474e0d16c79b4167ca

File details

Details for the file data_check-0.18.0-py3-none-any.whl.

File metadata

  • Download URL: data_check-0.18.0-py3-none-any.whl
  • Size: 66.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.2

File hashes

Hashes for data_check-0.18.0-py3-none-any.whl:

  • SHA256: bb1d23659efcf9f85e47efe97191241f1c6088696cad161e291109fb3db16740
  • MD5: cbd40030e2970a2b8c33c010d5f0a4fa
  • BLAKE2b-256: 83a6a3f033b38c80a4ef8f4542f3152be256130c8b6c4e62c1b1ea61f57afc44
