
Project description

MLQA


A package to perform QA on data flows for machine learning.

Introduction

MLQA is a Python package created to help data scientists, analysts and developers perform quality assurance (i.e. QA) on pandas DataFrames and 1d arrays, especially in machine learning modeling data flows. It is designed to work with the standard logging library to log and report QA steps in a descriptive way. It includes standalone functions (i.e. checkers) for different QA activities and the DiffChecker class for integrated QA capabilities on data.
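
As a minimal sketch of the logging integration, the snippet below wires DiffChecker to a standard logging.Logger. It assumes DiffChecker accepts logger and log_info arguments as described in the package documentation; treat those argument names as assumptions and consult the docs for the exact signature.

>>> import logging
>>> from mlqa.identifiers import DiffChecker
>>> logging.basicConfig(filename='mlqa.log', level=logging.INFO)  # write QA logs to a file
>>> # assumption: a logging.Logger can be passed via logger, and log_info=True also logs passing checks
>>> dc = DiffChecker(logger=logging.getLogger(__name__), log_info=True)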

Installation

You can install MLQA with pip.

pip install mlqa

MLQA depends on pandas and NumPy, and works on Python 3.6+.

Quickstart

DiffChecker is designed to perform QA on data flows for ML. You can easily save statistics from the origin data, such as missing value rate, mean, min/max, percentiles and outliers, and then compare new data against them. This is especially important if you want to keep the prediction data under the same assumptions as the training data.

Below is a quick example of how it works: just initialize DiffChecker and fit it on the input data to save its statistics.

>>> from mlqa.identifiers import DiffChecker
>>> import pandas as pd
>>> dc = DiffChecker()
>>> dc.fit(pd.DataFrame({'mean_col':[1, 2]*50, 'na_col':[None]*50+[1]*50}))

Then, you can check whether new data is okay for the given criteria. Below, the new data is very similar in column mean_col but has an increased NA count in column na_col. The default threshold is 0.5, which means the check passes as long as the NA rate is no more than 50% higher than in the origin data. The NA rate is 50% in the origin data, so anything up to 75% (i.e. 50*(1+0.5)) is acceptable. The NA rate is 70% in the new data and, as expected, the QA passes.

>>> dc.check(pd.DataFrame({'mean_col':[.99, 2.1]*50, 'na_col':[None]*70+[1]*30}))
True
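
If you need a tighter tolerance, the threshold can be lowered before fitting. The sketch below assumes DiffChecker exposes a set_threshold method as described in the package documentation (treat that call as an assumption); with a threshold of 0.2, the same new data should now fail because its 70% NA rate exceeds the allowed 60% (i.e. 50*(1+0.2)).

>>> dc = DiffChecker()
>>> dc.set_threshold(0.2)  # assumption: tighten the default tolerance from 0.5 to 0.2
>>> dc.fit(pd.DataFrame({'mean_col':[1, 2]*50, 'na_col':[None]*50+[1]*50}))
>>> dc.check(pd.DataFrame({'mean_col':[.99, 2.1]*50, 'na_col':[None]*70+[1]*30}))
False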

See more examples at Documentation/Quickstart, and refer to the full documentation for details.

Tests

Tests are written with unittest and are located in the tests folder. There are also doctests in the docstrings, which can be run with doctest.
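
As a minimal sketch (assuming you run it from a clone of the repository, so the tests folder is present), the suite can be discovered and run with the standard library alone:

>>> import unittest
>>> suite = unittest.defaultTestLoader.discover('tests')  # collect test modules from the tests folder
>>> result = unittest.TextTestRunner(verbosity=2).run(suite)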

License

MIT

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mlqa-0.1.1.tar.gz (17.8 kB)


Built Distribution

mlqa-0.1.1-py3-none-any.whl (18.7 kB)


File details

Details for the file mlqa-0.1.1.tar.gz.

File metadata

  • Download URL: mlqa-0.1.1.tar.gz
  • Upload date:
  • Size: 17.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.9.0

File hashes

Hashes for mlqa-0.1.1.tar.gz

  • SHA256: 0aeb8a28c55ea3c94f35e6737c2493d78a693055b71d6b03c1a30c96490e1a8c
  • MD5: 2814ef91f87fc8d9e199f77f93289326
  • BLAKE2b-256: 7daa8d52ec5a2796df3f5dee05ed599f883c528dcea2b3a6ac23463721431122

See more details on using hashes here.
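
For a quick sanity check, the downloaded archive can be verified against the SHA256 digest above with the standard hashlib module; the local file path below is an assumption, so adjust it to wherever the file was saved.

>>> import hashlib
>>> expected = '0aeb8a28c55ea3c94f35e6737c2493d78a693055b71d6b03c1a30c96490e1a8c'
>>> with open('mlqa-0.1.1.tar.gz', 'rb') as f:  # path assumed; use your download location
...     hashlib.sha256(f.read()).hexdigest() == expected
True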

File details

Details for the file mlqa-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: mlqa-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 18.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.9.0

File hashes

Hashes for mlqa-0.1.1-py3-none-any.whl

  • SHA256: c58e1e16110333df5c0d3a94a0439960c7cc84b5d9cc790f4f0ae94181be93e0
  • MD5: 2b48bb3e9fd42200826684bacf6489f8
  • BLAKE2b-256: 7a8013a6149fda2de9628797e6e31f60c35cd64a7a94b996e50c8262ed6bb5c7

See more details on using hashes here.
