Data profiling monitoring platform
Package

Available on PyPI. Install with:

```shell
pip install pythoth
```
Introduction
As data- and AI-driven culture emerges across organizations, there are still many challenges in building an efficient data operation. One of the main barriers is achieving high-quality data. While more data brings more opportunities for analytics and machine learning products, covering this growing range of assets with quality checks becomes a real scalability issue. So the big question is: how do you create an efficient data quality service that covers as many datasets as possible, does not require a lot of manual tuning, is computationally scalable, and produces results that are easy to interpret?
This project proposes an automated, end-to-end, profiling-based data quality architecture. It implements profiling metrics computation, model optimization, anomaly detection, and the generation of highly explainable reports.
By combining recent tools for data processing and AutoML with modern data platform patterns, it delivers an easy-to-use framework that empowers developers and data users to build this solution.
The Metrics Repository
The figure shows an overview of the entire flow, from raw data to the final decision on data quality.
First, in flow A, the raw dataset is transformed into aggregated profiling metrics by the profiler module, which are then saved to the Metrics Repository.
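For intuition, here is a minimal sketch of the kind of aggregated profiling metric stored in the Metrics Repository — completeness, the metric that appears later in the example logs. The function below is illustrative only, not thoth's internal API:

```python
def completeness(values: list) -> float:
    """Fraction of non-null values in a column for one batch of data."""
    if not values:
        return 0.0
    non_null = sum(1 for v in values if v is not None)
    return non_null / len(values)

# one batch of the "value" column with some missing entries
batch = [21.5, 22.1, None, 23.0, None, 24.2, 22.8, 21.9]
print(completeness(batch))  # 0.75 — 6 of 8 values are present
```

Computed per batch, such a metric becomes a point in a time series, which is what the later flows forecast and score.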
In flow B, all historical profiling metrics for a given dataset are pulled and used to optimize (train, evaluate, and select the best forecast model for each metric) and score all metrics. The anomaly scoring module implements this flow. The forecasts, scores (errors), and optimizations for each metric are saved back to the Metrics Repository.
Lastly, flow C, implemented by the quality assessment module, pulls the anomaly scores for the latest data point and triggers a warning whenever a score exceeds the tolerance threshold found during optimization, alerting the dataset owner about possible quality issues in the latest batch of data.
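As a rough sketch of flows B and C (not thoth's actual models or API), an anomaly score can be thought of as the forecast error for the latest metric value, compared against the tolerance threshold found during optimization. The naive mean forecast below is a placeholder for the optimized per-metric model:

```python
def naive_forecast(history: list[float]) -> float:
    """Stand-in for the optimized per-metric model: predict the historical mean."""
    return sum(history) / len(history)

def score_latest(history: list[float], latest: float, threshold: float) -> tuple[float, bool]:
    """Return the anomaly score (absolute forecast error) and whether it
    exceeds the tolerance threshold, which would trigger a warning."""
    score = abs(latest - naive_forecast(history))
    return score, score > threshold

# past completeness values for a column, plus a suspicious new batch
history = [0.99, 1.0, 0.98, 1.0, 0.99]
score, is_anomaly = score_latest(history, latest=0.75, threshold=0.08)
print(score, is_anomaly)  # the score is well above the threshold, so a warning fires
```

In the real system, the forecast model, its error distribution, and the threshold are all selected per metric by the optimization step.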
Monitor data profiling with simple commands! 🧐
```python
import datetime

import thoth as th

# init the Metrics Repository database
th.init_db(clear=True)

# profile the historical data, register the dataset in the Metrics Repository, and
# optimize ML models for all profiling time series
th.profile_create_optimize(
    df=history_df,               # all your historical data
    dataset_uri="temperatures",  # identification for the dataset
    ts_column="ts",              # timestamp partition column
    session=session,             # sql session
    spark=spark,                 # spark session
)

# assess data quality for a new batch of data
th.assess_new_ts(
    df=new_batch_df,
    ts=datetime.datetime(1981, 12, 30),
    dataset_uri="temperatures",
    session=session,
)
```
If an anomaly is detected in a new batch of data, this is the log you will receive:
```
2022-10-20 14:44:20.959 | INFO  | thoth.quality:assess_quality:90 - 🔍️ Assessing quality for ts=1981-12-30 00:00:00 ...
2022-10-20 14:44:20.971 | ERROR | thoth.quality:assess_quality:103 - 🚨 ️Anomaly detected, notifying handlers...
2022-10-20 14:44:20.972 | ERROR | thoth.quality:_notify:75 - Anomaly detected for ts=1981-12-30 00:00:00 on dataset_uri=temperatures!
The following metrics have scores above the defined threshold by the optimization: [AnomalousScore(metric=Metric(entity='Column', instance='value', name='Completeness'), score=0.2275986301072123, threshold=0.08)].
Please check the dataset dashboard for more information: http://localhost:8501/?dataset_uri=temperatures&view=%F0%9F%92%AF+Scoring&instances=value
2022-10-20 14:44:20.973 | INFO  | thoth.quality:assess_quality:110 - 🔍️ Quality assessment finished, handlers notified!
2022-10-20 14:44:20.973 | INFO  | thoth.service_layer:assess_new_ts:493 - Pipeline finished!
```
Accessing the link in the logs (http://localhost:8501/?dataset_uri=temperatures&view=%F0%9F%92%AF+Scoring&instances=value) will take you to the dashboard, which explains the system's decision.
💡 While this example only shows a warning log, it is possible to configure any custom notification logic (e.g., email, Slack, etc.).
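Conceptually, a custom handler just receives the anomaly details and forwards them somewhere. The helper below is hypothetical (not part of thoth's public API) and only sketches how you might format such an alert for a chat channel:

```python
def format_alert(dataset_uri: str, ts: str, anomalies: list[dict]) -> str:
    """Build a human-readable alert message from anomaly details.

    Hypothetical helper for illustration — thoth's actual handler
    interface may differ.
    """
    lines = [f"🚨 Anomaly detected for ts={ts} on dataset_uri={dataset_uri}!"]
    for a in anomalies:
        lines.append(
            f"- {a['metric']}: score={a['score']:.3f} exceeds threshold={a['threshold']}"
        )
    return "\n".join(lines)

msg = format_alert(
    "temperatures",
    "1981-12-30 00:00:00",
    [{"metric": "Column/value/Completeness", "score": 0.2276, "threshold": 0.08}],
)
print(msg)
```

The resulting string could then be passed to whatever transport you use (an SMTP client, a Slack webhook, a pager, and so on).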
Quick Start in 2 simple steps
1) Start the dashboard and database (docker compose):

```shell
make app
```
Now the database for the Metrics Repository should be up and running, and you can access the dashboard at http://localhost:8501. But wait ✋ You don't have any data to monitor there yet. Let's profile and analyze some sample datasets to get started!
2) Test the framework with the example notebooks (docker compose):

This command will spin up another container running a Jupyter Notebook server with all the dependencies installed, so you can test the framework easily.

```shell
make notebook-examples
```
You can open the notebooks at http://localhost:8888. You should see the examples folder; start with the first example notebook.
After running the thoth commands there, you should be able to visualize the dataset and metrics in the UI:
Development
After creating your virtual environment:
Install dependencies:

```shell
make requirements
```
Code Style and Quality
Apply code style (black and isort):

```shell
make apply-style
```

Run all checks (flake8 and mypy):

```shell
make checks
```
Testing and Coverage
```shell
make tests
```