
Project description

Algorithm - Fairness Metrics Toolbox for Regression

Description

  • The Fairness Metrics Toolbox (FMT) for Regression contains a set of fairness metrics that measure how resources (e.g. opportunities, food, loans, medical help) are allocated among demographic groups (e.g. married male, married female) defined by one or more sensitive features (e.g. gender, marital status). This plugin is developed for regression models.

License

  • Licensed under Apache Software License 2.0

Developers

  • AI Verify

Installation

Each test algorithm can now be installed via pip and run individually.

pip install aiverify-fairness-metrics-toolbox-for-regression==2.0.0a1
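
A quick way to confirm the installation is to ask the CLI for its help text (this assumes the module exposes the usual argparse -h/--help flags, which is not stated on this page):

python -m aiverify_fairness_metrics_toolbox_for_regression --help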

Example Usage

Run the following bash script to execute the plugin:

#!/bin/bash

root_path="<PATH_TO_FOLDER>/aiverify/stock-plugins/user_defined_files"
python -m aiverify_fairness_metrics_toolbox_for_regression \
  --data_path $root_path/data/sample_reg_pipeline_data.sav \
  --model_path $root_path/pipeline/regression_tabular_donation \
  --ground_truth_path $root_path/data/sample_reg_pipeline_ytest_data.sav \
  --ground_truth donation \
  --model_type REGRESSION \
  --run_pipeline \
  --sensitive_features_list gender

If the algorithm runs successfully, the results of the test will be saved in an output folder.
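
If you want a quick look at those results, a JSON file in the output folder can be pretty-printed with the Python standard library. Both the output/ location and the file name (results.json below) are placeholders, not documented output names:

python -m json.tool output/results.json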

Develop plugin locally

Execute the bash script below in the project root:

#!/bin/bash

# setup virtual environment
python3 -m venv .venv
source .venv/bin/activate

# go to the plugin directory
cd aiverify/stock-plugins/aiverify.stock.fairness-metrics-toolbox-for-regression/algorithms/fairness_metrics_toolbox_for_regression/
# install the plugin in editable mode with its dev dependencies (pulls in aiverify-test-engine)
pip install -e .'[dev]'

# execute the plugin
python -m aiverify_fairness_metrics_toolbox_for_regression \
  --data_path <data_path> \
  --model_path <model_path> \
  --ground_truth_path <ground_truth_path> \
  --ground_truth <str> \
  --model_type REGRESSION \
  --run_pipeline \
  --sensitive_features_list <list[str]>
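
If more than one sensitive feature should be tested, the --sensitive_features_list flag takes a list. A hypothetical invocation with space-separated values is shown below; whether the parser expects space- or comma-separated values is an assumption not confirmed on this page, and gender and marital_status are placeholder column names:

python -m aiverify_fairness_metrics_toolbox_for_regression \
  --data_path <data_path> \
  --model_path <model_path> \
  --ground_truth_path <ground_truth_path> \
  --ground_truth <str> \
  --model_type REGRESSION \
  --run_pipeline \
  --sensitive_features_list gender marital_status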

Build Plugin

cd aiverify/stock-plugins/aiverify.stock.fairness-metrics-toolbox-for-regression/algorithms/fairness_metrics_toolbox_for_regression/
hatch build
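
By default hatch writes the source distribution and wheel to a dist/ folder, so the freshly built wheel can be installed directly (the file name below assumes the 2.0.0a1 version shown on this page):

pip install dist/aiverify_fairness_metrics_toolbox_for_regression-2.0.0a1-py3-none-any.whl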

Tests

Pytest is used as the testing framework.

Run the following commands to execute the unit and integration tests inside the tests/ folder:

cd aiverify/stock-plugins/aiverify.stock.fairness-metrics-toolbox-for-regression/algorithms/fairness_metrics_toolbox_for_regression/
pytest .

Run using Docker

In the aiverify root directory, run the command below to build the Docker image:

docker build -t aiverify-fairness-metrics-toolbox-for-regression:v2.0.0a1 -f stock-plugins/aiverify.stock.fairness-metrics-toolbox-for-regression/algorithms/fairness_metrics_toolbox_for_regression/Dockerfile .
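
To confirm the image was built and tagged as expected, list it by repository name:

docker images aiverify-fairness-metrics-toolbox-for-regression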

Run the bash script below to run the algorithm:

#!/bin/bash
docker run \
  -v $(pwd)/stock-plugins/user_defined_files:/input \
  -v $(pwd)/output:/app/aiverify/output \
  aiverify-fairness-metrics-toolbox-for-regression:v2.0.0a1 \
  --data_path /input/data/sample_reg_pipeline_data.sav \
  --model_path /input/pipeline/regression_tabular_donation \
  --ground_truth_path /input/data/sample_reg_pipeline_ytest_data.sav \
  --ground_truth donation \
  --model_type REGRESSION \
  --run_pipeline \
  --sensitive_features_list gender

If the algorithm runs successfully, the results of the test will be saved in an output folder in the working directory.
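
Because the script mounts $(pwd)/output into the container, the results end up in ./output on the host; a recursive listing shows what the run produced (the file names inside are plugin-specific):

ls -R output/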

Tests

Pytest is used as the testing framework.

Run the following command to execute the unit and integration tests inside the tests/ folder:

docker run --entrypoint python3 aiverify-fairness-metrics-toolbox-for-regression:v2.0.0a1 -m pytest .


File details

Details for the file aiverify_fairness_metrics_toolbox_for_regression-2.0.0a1.tar.gz (source distribution).

File hashes

  • SHA256: d2cd0eef2bfbf441574ab1da92544ecc3b1dcb73cadf8fefba07f7dd1be9e58a
  • MD5: 069848c86323c2634c8ce3c47f60881e
  • BLAKE2b-256: 97da6e38f9fca6838fd949ca99b31127008961ea7f3b0bd4832d6dbe01a87c95

File details

Details for the file aiverify_fairness_metrics_toolbox_for_regression-2.0.0a1-py3-none-any.whl (built distribution).

File hashes

  • SHA256: dfa4b9b1d25a300fe36d596e0fbb70e94a27c26af2b9aedf0e8459eb3de7b628
  • MD5: abd2d754daafaa50870bcedc1bf62016
  • BLAKE2b-256: 6f040d5b382138c6f90b1d36b4fe991fbfa24121d62211d616d267b96d8467cb
