A data quality check module for Spark

Project description

dq_check

Overview

dq_check is a Python package that provides a data quality check function encapsulated in the DQCheck class. It allows you to perform data quality checks on tables using SQL queries and save the results into a Delta table for auditing purposes.

Features

  • Perform data quality checks on specified tables using SQL queries.
  • Save audit logs of data quality checks into a Delta table.
  • Handle aggregation checks and basic data quality metrics.
  • Integrate with PySpark and Pandas.

Installation

You can install dq_check from PyPI using pip:

```bash
pip install dq_check
```
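
If you're working in a Databricks or Jupyter notebook, the notebook-scoped pip magic works as well:

```python
# Install into the current notebook session (Databricks/IPython pip magic)
%pip install dq_check
```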

Usage

Here's an example of how to use the DQCheck class from the dq_check package:

```python
from pyspark.sql import SparkSession

from dq_check import DQCheck

# Initialize Spark session
spark = SparkSession.builder.appName("DQCheckExample").getOrCreate()

# Create an instance of DQCheck; the audit table name should include
# catalog and schema
audit_table_name = "your_catalog.your_schema.audit_log"
dq_checker = DQCheck(spark, audit_table_name)

# Define the data quality check parameters
table_type = "delta"                 # Type of the table ('delta' or 'asql')
table_name = "your_table_name"       # Include catalog and schema for delta, schema only for asql
primary_keys = ["your_primary_key"]  # List of primary key columns
sql_query = "SELECT * FROM your_table WHERE condition"  # DQ check query; reference the table with catalog and schema

# Perform the data quality check
dq_checker.perform_dq_check(
    table_type,
    table_name,
    primary_keys,
    sql_query,
    where_clause=None,               # Optional WHERE clause for sample data
    quality_threshold_percentage=5,  # Optional quality threshold percentage
    chunk_size=200,                  # Optional chunk size for the primary-key list
)
```

The DQCheck constructor accepts the following arguments:

  • spark (SparkSession): The Spark session.
  • audit_table_name (str): Name of the Delta table that stores the audit logs; include catalog and schema. Defaults to audit_log.
  • azure_sql_client: Required for asql tables; create one with AzureSQLClient by providing a scope and secret. Defaults to None.
  • run_id: Run ID of the ADF pipeline. Defaults to -999.
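
Once a check has run, its results are written to the audit table, which you can query like any other Delta table. A minimal sketch, assuming the placeholder audit table name used above:

```python
# Inspect the audit log written by perform_dq_check; the table name is the
# one passed to the DQCheck constructor (placeholder name assumed here)
audit_df = spark.table("your_catalog.your_schema.audit_log")
audit_df.show(truncate=False)
```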

Configuration

Adjust the parameters passed to the perform_dq_check method based on your requirements.
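
For Azure SQL tables, pass an azure_sql_client to the constructor and set the table type to "asql". The sketch below is illustrative only: it assumes AzureSQLClient is importable from dq_check and takes scope and secret arguments, as described above; check the package source for the exact import path and signature.

```python
from pyspark.sql import SparkSession

from dq_check import DQCheck
# Assumption: AzureSQLClient ships with dq_check; the import path and
# constructor signature below are illustrative, not confirmed
from dq_check import AzureSQLClient

spark = SparkSession.builder.appName("DQCheckASQLExample").getOrCreate()

# Build the Azure SQL client from a secret scope and secret (hypothetical
# argument names, based on the parameter description above)
azure_sql_client = AzureSQLClient(scope="your_scope", secret="your_secret")

dq_checker = DQCheck(
    spark,
    "your_catalog.your_schema.audit_log",  # audit table with catalog and schema
    azure_sql_client=azure_sql_client,     # required for 'asql' tables
    run_id=12345,                          # ADF pipeline run ID (defaults to -999)
)

# For asql tables, the table name carries the schema only
dq_checker.perform_dq_check(
    "asql",
    "your_schema.your_table_name",
    ["your_primary_key"],
    "SELECT * FROM your_schema.your_table_name WHERE condition",
)
```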

Dependencies

  • PySpark
  • Pandas

Contributing

Contributions are welcome! Please feel free to submit issues and pull requests on the GitHub repository.

License

No license is specified for this project.

Contact

For questions or feedback, please open a GitHub issue.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dq_check-0.3.2.tar.gz (7.3 kB)

Uploaded Source

Built Distribution

dq_check-0.3.2-py3-none-any.whl (6.9 kB)

Uploaded Python 3

File details

Details for the file dq_check-0.3.2.tar.gz.

File metadata

  • Download URL: dq_check-0.3.2.tar.gz
  • Size: 7.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.9.12

File hashes

Hashes for dq_check-0.3.2.tar.gz:

  • SHA256: c50629a1d364a25eca9e573f1c71934ed0dc1be974f1995da2e405663ed5e4a0
  • MD5: 998991bffb599097f20d0d3f8f0dc265
  • BLAKE2b-256: 54ce64d5f41e6598f2b76bd2fa5532da6ed7ef284b8a086184d1196a685c80ba


File details

Details for the file dq_check-0.3.2-py3-none-any.whl.

File metadata

  • Download URL: dq_check-0.3.2-py3-none-any.whl
  • Size: 6.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.9.12

File hashes

Hashes for dq_check-0.3.2-py3-none-any.whl:

  • SHA256: 5d555c80c0262ff8406caa736bde827bb7c3f8cc48d7c4ed85293ebe64a931ab
  • MD5: ce2768dff196bc50bd678d64d271db1b
  • BLAKE2b-256: 7394c96745495602cce4844ee93a637eabfc7e0484df4199dc9ecf7f303f1016

