qualink
A blazing-fast data quality framework for Python, built on Apache DataFusion.
Features
- High Performance: Leverages Apache DataFusion for fast data processing and validation.
- Flexible Constraints: Supports various data quality constraints including completeness, uniqueness, and custom assertions.
- YAML Configuration: Define validation suites declaratively using YAML files.
- CLI (qualinkctl): Run YAML-driven validations from the terminal, no Python script required.
- Cloud Object Stores: Read data directly from Amazon S3 (and S3-compatible services).
- Multiple Output Formats: Results can be formatted as human-readable text, JSON, or Markdown.
- Async Support: Built with asyncio for non-blocking operations.
- Easy Integration: Simple API for defining and running validation suites.
Installation
Install qualink using uv:
```shell
uv add qualink
```
Or using pip:
```shell
pip install qualink
```
Quick Start
Here's a basic example of using qualink to validate a CSV file:
```python
import asyncio

from datafusion import SessionContext

from qualink.checks import Check, Level
from qualink.constraints import Assertion
from qualink.core import ValidationSuite
from qualink.formatters import MarkdownFormatter


async def main() -> None:
    ctx = SessionContext()
    ctx.register_csv("users", "examples/users.csv")

    result = await (
        ValidationSuite()
        .on_data(ctx, "users")
        .with_name("User Data Quality")
        .add_check(
            Check.builder("Critical Checks")
            .with_level(Level.ERROR)
            .is_complete("user_id")
            .build()
        )
        .add_check(
            Check.builder("Data Quality")
            .with_level(Level.WARNING)
            .has_completeness("name", Assertion.greater_than_or_equal(0.95))
            .build()
        )
        .run()
    )

    print(MarkdownFormatter().format(result))


if __name__ == "__main__":
    asyncio.run(main())
```
YAML Configuration
You can also define validation suites using YAML files for a declarative approach:
```yaml
suite:
  name: "User Data Quality"
  data_source:
    type: csv
    path: "examples/users.csv"
    table_name: users
  checks:
    - name: "Critical Checks"
      level: error
      rules:
        - is_complete: user_id
        - is_unique: email
        - has_size:
            gt: 0
    - name: "Data Quality"
      level: warning
      rules:
        - has_completeness:
            column: name
            gte: 0.95
```
Run the YAML configuration:
```python
import asyncio

from qualink.config import run_yaml
from qualink.formatters import HumanFormatter


async def main() -> None:
    result = await run_yaml("path/to/your/config.yaml")
    print(HumanFormatter().format(result))


if __name__ == "__main__":
    asyncio.run(main())
```
CLI – qualinkctl
The simplest way to run a YAML validation is with qualinkctl:
```shell
# Human-readable output (default)
uv run qualinkctl checks.yaml

# JSON output
uv run qualinkctl checks.yaml -f json

# Markdown report saved to file
uv run qualinkctl checks.yaml -f markdown -o report.md

# Show all constraints (including passed) with debug logging
uv run qualinkctl checks.yaml --show-passed -v
```
qualinkctl exits with code 0 on success and 1 on failure, making it easy to use in CI/CD pipelines:
```shell
uv run qualinkctl checks.yaml -f json -o results.json || echo "Validation failed!"
```
Run `uv run qualinkctl --help` for a full list of options.
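Because qualinkctl signals pass/fail through its exit code, it slots directly into CI. The following GitHub Actions fragment is a hypothetical sketch (the step names, `checks.yaml` path, and artifact upload are assumptions, not part of qualink):

```yaml
# Hypothetical CI steps: the job fails automatically when any check fails,
# because qualinkctl exits with a non-zero code on validation failure.
- name: Run data quality checks
  run: uv run qualinkctl checks.yaml -f json -o results.json

- name: Upload validation results
  if: always()  # keep the report even when the checks fail
  uses: actions/upload-artifact@v4
  with:
    name: qualink-results
    path: results.json
```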
S3 Object Store Sources
qualink can read data directly from Amazon S3 using DataFusion's built-in `AmazonS3` object store:
```yaml
suite:
  name: "Cloud Data Quality"
  data_sources:
    - store: s3
      bucket: my-data-lake
      region: us-east-1
      format: parquet
      path: data/users.parquet
      table_name: users
  checks:
    - name: "Completeness"
      level: error
      rules:
        - is_complete: user_id
        - is_unique: email
```
Credentials are read from the YAML or fall back to standard environment variables (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, etc.).
Constraints
qualink supports the following constraint types:
- Completeness: Ensures a column has no null values or meets a minimum completeness ratio.
- Uniqueness: Checks for duplicate values in a column.
- Assertion: Custom assertions using SQL expressions.
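To make the semantics concrete, here is a plain-Python sketch of the metrics these constraints evaluate. This is illustrative only: qualink computes these with DataFusion SQL, and the helper names below are not part of qualink's API.

```python
from collections import Counter


def completeness(rows: list[dict], column: str) -> float:
    """Fraction of rows where `column` is non-null (what is_complete / has_completeness check)."""
    if not rows:
        return 1.0
    non_null = sum(1 for r in rows if r.get(column) is not None)
    return non_null / len(rows)


def uniqueness(rows: list[dict], column: str) -> float:
    """Fraction of non-null values that occur exactly once (what is_unique checks)."""
    counts = Counter(r[column] for r in rows if r.get(column) is not None)
    total = sum(counts.values())
    if total == 0:
        return 1.0
    return sum(c for v, c in counts.items() if c == 1) / total


rows = [
    {"user_id": 1, "email": "a@x.com"},
    {"user_id": 2, "email": "a@x.com"},  # duplicate email
    {"user_id": 3, "email": None},       # missing email
]

print(completeness(rows, "user_id"))  # 1.0 -> an is_complete("user_id") check passes
print(completeness(rows, "email"))    # 0.666... -> fails a gte: 0.95 threshold
print(uniqueness(rows, "email"))      # 0.0 -> both non-null emails are duplicated
```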
Formatters
Results can be formatted using:
- HumanFormatter: Human-readable text output.
- JsonFormatter: JSON format for programmatic processing.
- MarkdownFormatter: Markdown tables for documentation.
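As a rough sketch of what a Markdown formatter produces, the snippet below renders a list of constraint results as a Markdown table. The result shape and column layout are assumptions for illustration; qualink's actual `MarkdownFormatter` output may differ.

```python
# Illustrative only: the `results` shape and table layout are hypothetical,
# not qualink's real result model or MarkdownFormatter output.
def to_markdown(results: list[dict]) -> str:
    lines = ["| Status | Check | Message |", "|---|---|---|"]
    for r in results:
        status = "PASS" if r["passed"] else "FAIL"
        lines.append(f"| {status} | {r['check']} | {r['message']} |")
    return "\n".join(lines)


results = [
    {"passed": True, "check": "Critical Checks", "message": "user_id is complete"},
    {"passed": False, "check": "Uniqueness", "message": "email has duplicates"},
]
print(to_markdown(results))
```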
Benchmarks
qualink ships with a real-world benchmark suite that validates ~42 million NYC Yellow Taxi trip records (654 MB of Parquet data) through 12 check groups and 92 constraints — in under 1.5 seconds.
```
========================================================================
 qualink Benchmark — NYC Taxi Trips
========================================================================
 Parquet files : 3
 Total size    : 654.3 MB
 Data dir      : benchmarks/data
 YAML config   : benchmarks/nyc_taxi_validation.yaml

   • data-200901.parquet (211.9 MB)
   • data-201206.parquet (231.1 MB)
   • data-201501.parquet (211.3 MB)
========================================================================

⏱  Running benchmark with 'human' formatter …

Verification PASSED: NYC Taxi Trips – qualink Benchmark Suite
  Checks          12
  Constraints     92
  Passed          91
  Failed          1
  Skipped         0
  Pass rate       98.9%
  Execution time  1440 ms

  Status    Check       Message
  --------  ----------  ---------------------------------------------
  [FAIL]    Uniqueness  Uniqueness of (id) is 0.0000, expected >= 1.0

========================================================================
 Status        : ✅ PASSED
 Total records : 41.94M
 Wall-clock    : 1.455s
 Checks        : 12
 Constraints   : 92
 Passed        : 91
 Failed        : 1
 Pass rate     : 98.9%
 Engine time   : 0.02m
========================================================================
```
Run it yourself
```shell
# 1. Download data (parquet files from public S3)
./benchmarks/download_data.sh 3

# 2. Run the benchmark
uv run python benchmarks/run_benchmark.py

# Other output formats
uv run python benchmarks/run_benchmark.py --format markdown
uv run python benchmarks/run_benchmark.py --format json
```
See benchmarks/README.md for full dataset details and configuration.
Development
To set up the development environment:
```shell
git clone https://github.com/gopidesupavan/qualink.git
cd qualink
uv sync
```
Run tests:
```shell
uv run pytest
```
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Acknowledgments
- Apache DataFusion for the query engine
- AWS Deequ for the inspiration
- Term Guard