A library for data quality validation using PyDeequ.


Data Quality Validation

This package is designed for performing data quality validation using PyDeequ.
It enables users to validate the quality of their data, identifying any potential issues that may affect its suitability for processing or analysis.

Author: Ketan Kirange

Contributors: Ketan Kirange, Ajay Rahul Raja, Ruth Mifsud

This package contains tools and utilities for performing data quality checks on data files in

  • Pandas,
  • Dask, and
  • PySpark formats, leveraging libraries such as PyDeequ and SODA utilities.

These checks help ensure the integrity, accuracy, and completeness of the data, essential for robust data-driven decision-making processes.

Importance of Data Quality

Data quality plays a pivotal role in any engineering project, especially in data science, reporting, and analysis.

Here's why ensuring high data quality is crucial:

1. Reliable Insights

High-quality data leads to reliable and trustworthy insights.
When the data is accurate, complete, and consistent, data scientists and analysts can make informed decisions confidently.

2. Trustworthy Models

Data quality directly impacts the performance and reliability of machine learning models.
Models trained on low-quality data may produce biased or inaccurate predictions, leading to unreliable outcomes.

3. Effective Reporting

Quality data is fundamental for generating accurate reports and visualizations.
Analysts and stakeholders rely on these reports for understanding trends, identifying patterns, and making strategic decisions.
Poor data quality can lead to misleading reports and flawed interpretations.

4. Regulatory Compliance

In many industries, compliance with regulations such as GDPR, HIPAA, or industry-specific standards is mandatory.
Ensuring data quality is essential for meeting these regulatory requirements and avoiding potential legal consequences.

Data Quality Validation Tools

This repository provides a set of tools and utilities to perform comprehensive data quality validation on various data formats:

  • Pandas: Data quality checks for data stored in Pandas DataFrames, including checks for missing values, data types, and statistical summaries.
  • Dask: Scalable data quality checks for large-scale datasets using Dask, ensuring consistency and accuracy across distributed computing environments.
  • PySpark with PyDeequ: Integration with PyDeequ, enabling data quality validation on data processed using PySpark, including checks for schema validation, data distribution, and anomaly detection.
  • SODA Utilities: Utilities for validating data quality with the SODA framework, allowing for automated quality checks and anomaly detection.
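The completeness and type-conformance checks described above can be illustrated with a minimal, library-free sketch. The function names here are illustrative only, not this package's API; rows are modeled as plain dicts standing in for DataFrame records:

```python
def completeness(rows, column):
    """Fraction of rows with a non-null value in `column` (1.0 = fully complete)."""
    if not rows:
        return 0.0
    non_null = sum(1 for row in rows if row.get(column) is not None)
    return non_null / len(rows)

def conforms_to_type(rows, column, expected_type):
    """True if every non-null value in `column` has the expected Python type."""
    return all(
        isinstance(row[column], expected_type)
        for row in rows
        if row.get(column) is not None
    )

records = [
    {"id": 1, "amount": 10.5},
    {"id": 2, "amount": None},
    {"id": 3, "amount": 7.0},
]

print(completeness(records, "amount"))         # 2 of 3 rows populated
print(conforms_to_type(records, "id", int))    # every id is an int
```

Real implementations (Pandas, Dask, PyDeequ) apply the same ideas at scale, vectorized or distributed rather than row by row.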

Getting Started

To get started with data quality validation using this repository, follow the instructions in the documentation for each tool.

Contributing

We welcome contributions from the community to enhance and expand the capabilities of this data quality validation repository.
Please refer to the contribution guidelines for more information on how to contribute.



Prerequisites:

  • Step 1: Download Java, Python, and Apache Spark.
    Having the appropriate versions is essential to run the code on a local system.

Java: Java 1.8 Archive Downloads

Python: Python 3.9.18 Release

Apache Spark: Apache Spark 3.3.0 Release

  • Step 2: Install PyDeequ from the terminal if you encounter an error such as
    "PyDeequ module is not installed on the machine."

    To install PyDeequ, run:
    pip install pydeequ
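After installing, PyDeequ needs to know which Spark version it is targeting; recent releases read this from the SPARK_VERSION environment variable, which must be set before `import pydeequ`. A minimal sketch, assuming the Spark 3.3.0 release from Step 1:

```python
import os

# Tell PyDeequ which Spark version to target.
# Must be set before `import pydeequ`; "3.3" matches the
# Apache Spark 3.3.0 release downloaded in Step 1.
os.environ["SPARK_VERSION"] = "3.3"
```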

  • Step 3: Install our ‘Data Quality Validation’ Python library from the terminal:
    pip install data-quality-validation-pydeequ

  • Step 4: To run the Data Quality Validation function, import the library as below:
    from dqv.dqv_pydeequ import DqvPydeequ

  • Step 5: Create a config file in a folder with the columns that need to be validated.
    Name the file as you wish, but remember to use the name in the DqvPydeequ function.

  • Step 6: Upload your data to S3, or save it in a local directory if you are running locally.

  • Step 7: Pass your source and target file paths in the DqvPydeequ function.

    DqvPydeequ(
         "", #config_file
         "", #source_data_path
         "") #target_data_path
    
  • Step 8: Run the file to validate.
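For Step 5, the config file lists the columns to validate. The shape below is purely hypothetical, including the column and check names; this package does not document its format here, so follow the repo linked below for the actual structure:

```yaml
# Hypothetical config shape -- see the linked repo for the real format.
columns:
  - name: id
    checks: [not_null, unique]
  - name: amount
    checks: [not_null]
```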



Refer to this repo for the structure of the config file:

Git: https://github.com/dataruk/data-quality-validation
