
A data quality check module for Spark

Project description

dq_check

Overview

dq_check is a Python package that provides a data quality check function encapsulated in the DQCheck class. It allows you to perform data quality checks on tables using SQL queries and save the results into a Delta table for auditing purposes.

Features

  • Perform data quality checks on specified tables using SQL queries.
  • Save audit logs of data quality checks into a Delta table.
  • Handle aggregation checks and basic data quality metrics.
  • Support PySpark and Pandas integration.

Installation

You can install dq_check from PyPI using pip:

```bash
pip install dq_check
```

Usage

Here's an example of how to use the DQCheck class from the dq_check package:

```python
from pyspark.sql import SparkSession

from dq_check import DQCheck

# Initialize the Spark session
spark = SparkSession.builder.appName("DQCheckExample").getOrCreate()

# Create an instance of DQCheck
# The audit table name should include catalog and schema.
audit_table_name = "your_catalog.your_schema.audit_log"
dq_checker = DQCheck(spark, audit_table_name)
```

Constructor parameters:

  • spark (SparkSession): The Spark session.
  • audit_table_name (str): The name of the Delta table that stores the audit logs; it should include catalog and schema. Defaults to audit_log.
  • azure_sql_client: Defaults to None. Required for asql tables; create an azure_sql_client by providing a scope and secret to AzureSQLClient.
  • run_id: Defaults to -999. The run ID of the ADF pipeline.
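
For Azure SQL (asql) tables, the constructor also needs an azure_sql_client. The following is a minimal sketch, assuming AzureSQLClient can be imported from dq_check and accepts scope and secret arguments; check your installed version for the exact import path and signature:

```python
# Sketch only: the AzureSQLClient import path and its (scope, secret) arguments
# are assumptions based on the description above, not a confirmed API.
from dq_check import DQCheck, AzureSQLClient

azure_sql_client = AzureSQLClient(
    scope="your-secret-scope",   # hypothetical secret scope holding the SQL credentials
    secret="your-sql-secret",    # hypothetical secret key
)

dq_checker = DQCheck(
    spark,
    "your_catalog.your_schema.audit_log",  # audit table with catalog and schema
    azure_sql_client=azure_sql_client,     # required only for asql tables
    run_id=12345,                          # ADF pipeline run id; defaults to -999
)
```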

```python
# Define the data quality check parameters
table_type = "delta"                 # Type of the table ('delta' or 'asql')
table_name = "your_table_name"       # Should include catalog and schema for delta, and schema for asql
primary_keys = ["your_primary_key"]  # List of primary key columns
sql_query = "SELECT * FROM your_table WHERE condition"  # DQ check query; the table name should include catalog and schema

# Perform the data quality check
dq_checker.perform_dq_check(
    table_type,
    table_name,
    primary_keys,
    sql_query,
    where_clause=None,               # Optional where clause for sample data
    quality_threshold_percentage=5,  # Optional quality threshold percentage
    chunk_size=200,                  # Optional chunk size for the primary-key list
)
```
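
Each check writes an audit record to the Delta table passed to DQCheck, so the results can be reviewed afterwards. A minimal sketch of inspecting that table, assuming the audit table name used above:

```python
# Read back the audit log written by the checks (use the name you passed to DQCheck).
audit_df = spark.table("your_catalog.your_schema.audit_log")
audit_df.show(truncate=False)
```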

Configuration

Adjust the parameters passed to the perform_dq_check method based on your requirements.
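
For example, a check against an Azure SQL table might use a stricter threshold and a larger chunk size. A sketch, assuming the same perform_dq_check signature as above and a DQCheck instance constructed with an azure_sql_client:

```python
dq_checker.perform_dq_check(
    "asql",                          # Azure SQL table; needs azure_sql_client on the DQCheck instance
    "dbo.your_table_name",           # schema-qualified name for asql
    ["your_primary_key"],
    "SELECT * FROM dbo.your_table_name WHERE your_condition",
    quality_threshold_percentage=1,  # tighter optional threshold
    chunk_size=500,                  # larger optional chunk size for the primary-key list
)
```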

Dependencies

  • PySpark
  • Pandas

Contributing

Contributions are welcome! Please feel free to submit issues and pull requests on the GitHub repository.

License

None.

Contact

For any questions or feedback, please open a GitHub issue.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dq_check-0.3.1.tar.gz (7.3 kB)

Uploaded Source

Built Distribution

dq_check-0.3.1-py3-none-any.whl (6.8 kB)

Uploaded Python 3

File details

Details for the file dq_check-0.3.1.tar.gz.

File metadata

  • Download URL: dq_check-0.3.1.tar.gz
  • Upload date:
  • Size: 7.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.9.12

File hashes

Hashes for dq_check-0.3.1.tar.gz

  • SHA256: 29c9fc2e30888557efab1d5321a767f496e19292f6c541d908377b8693902f2e
  • MD5: 93d3355f92433a6277ae3e4c5d35a449
  • BLAKE2b-256: 7d6d7201791a003f11f4207dde9ba13a74cbcc98d03009dc95d7396b2854275b

See more details on using hashes here.

File details

Details for the file dq_check-0.3.1-py3-none-any.whl.

File metadata

  • Download URL: dq_check-0.3.1-py3-none-any.whl
  • Upload date:
  • Size: 6.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.9.12

File hashes

Hashes for dq_check-0.3.1-py3-none-any.whl

  • SHA256: 10576ea7522223d9d47b19baf2077a8704b0eed74665ce37ce2e50b5f617fe58
  • MD5: cce7065059790c517b6015c32b8890f8
  • BLAKE2b-256: 245cee510988a16c45e42bcc1dd3d660708cdc6a71243c5d01411d9377403130

See more details on using hashes here.
