
AI Verify Robustness Toolbox generates a perturbed dataset by running the boundary attack algorithm on the test dataset.

Project description

Algorithm - Robustness Toolbox

License

  • Licensed under Apache Software License 2.0

Developers:

  • AI Verify

Installation

Each test algorithm can now be installed via pip and run individually.

pip install aiverify-robustness-toolbox

Example Usage:

The robustness plugin supports testing of tabular and image data.
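The boundary attack is a decision-based method: starting from an input that is already misclassified, it moves toward the original input while staying on the adversarial side of the decision boundary, so only the model's predicted labels are needed. A minimal sketch of that idea on a toy 1-D threshold classifier (the classifier and all names here are illustrative, not the plugin's implementation):

```python
def toy_model(x):
    """Toy binary classifier: class 1 if x > 0.5, else class 0."""
    return 1 if x > 0.5 else 0

def boundary_attack_sketch(x_orig, x_adv, steps=50):
    """Bisect between the original input and an adversarial starting
    point, keeping the returned point misclassified (decision-based:
    only predicted labels are used, no gradients)."""
    orig_label = toy_model(x_orig)
    lo, hi = x_orig, x_adv            # lo: original side, hi: adversarial side
    for _ in range(steps):
        mid = (lo + hi) / 2.0
        if toy_model(mid) != orig_label:
            hi = mid                  # still adversarial: tighten toward original
        else:
            lo = mid                  # crossed back: keep last adversarial point
    return hi

x_orig, x_start = 0.1, 0.9           # original is class 0, start is class 1
x_perturbed = boundary_attack_sketch(x_orig, x_start)
print(x_perturbed)                   # settles just above the 0.5 decision boundary
```

The returned point is still misclassified but minimally far from the original input; the plugin applies the same principle to full tabular rows and images.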

Run the following bash script to execute the plugin on tabular data:

#!/bin/bash

root_path="<PATH_TO_FOLDER>/aiverify/stock-plugins/user_defined_files"
python -m aiverify_robustness_toolbox \
  --data_path $root_path/data/sample_bc_pipeline_credit_data.sav \
  --model_path $root_path/pipeline/bc_tabular_credit \
  --ground_truth_path $root_path/data/sample_bc_pipeline_credit_ytest_data.sav \
  --ground_truth default \
  --model_type CLASSIFICATION \
  --run_pipeline

Run the following bash script to execute the plugin on image data:

#!/bin/bash

root_path="<PATH_TO_FOLDER>/aiverify/stock-plugins/user_defined_files"
python -m aiverify_robustness_toolbox \
  --data_path $root_path/data/raw_fashion_image_10/0.png \
  --model_path $root_path/pipeline/mc_image_fashion \
  --ground_truth_path $root_path/data/pickle_pandas_fashion_mnist_annotated_labels_10.sav \
  --ground_truth label \
  --model_type CLASSIFICATION \
  --run_pipeline \
  --annotated_ground_truth_path $root_path/data/pickle_pandas_fashion_mnist_annotated_labels_10.sav \
  --file_name_label file_name
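The `.sav` files in these examples appear to be pickled pandas objects (an assumption based on sample names such as `pickle_pandas_fashion_mnist_annotated_labels_10.sav`); under that assumption, a ground-truth file can be inspected with pandas before a run. The file name and columns below are made up for illustration:

```python
import pandas as pd

# Create an illustrative annotated-labels file in the pickled-pandas
# format the sample data names suggest (columns are invented here).
labels = pd.DataFrame({
    "file_name": ["0.png", "1.png"],
    "label": [3, 7],
})
labels.to_pickle("annotated_labels.sav")

# Inspect the .sav file before passing it via --annotated_ground_truth_path.
loaded = pd.read_pickle("annotated_labels.sav")
print(loaded.columns.tolist())   # → ['file_name', 'label']
```

Checking that the column named by `--ground_truth` (and, for image data, `--file_name_label`) actually exists in the file avoids a common source of failed runs.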

If the algorithm runs successfully, the results of the test will be saved in an output folder.
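The results are written as JSON; one quick way to inspect them is to pretty-print every JSON file under the output folder. The folder layout and file name below are stand-ins created for demonstration, not the plugin's guaranteed schema:

```python
import json
from pathlib import Path

# Stand-in only: create a results file the way a run would leave one
# behind (the actual file name and schema depend on the plugin version).
out = Path("output")
out.mkdir(exist_ok=True)
(out / "results.json").write_text(json.dumps({"status": "success"}))

# Pretty-print every JSON file found under the output folder.
for path in sorted(out.glob("*.json")):
    print(path.name)
    print(json.dumps(json.loads(path.read_text()), indent=2))
```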

Develop plugin locally

Assuming aiverify-test-engine has already been installed in the virtual environment, run the following bash script to install the plugin and execute a test:

#!/bin/bash

# setup virtual environment
python3 -m venv .venv
source .venv/bin/activate

# install plugin
cd aiverify/stock-plugins/aiverify.stock.robustness-toolbox/algorithms/robustness_toolbox
pip install .

python -m aiverify_robustness_toolbox \
  --data_path <data_path> \
  --model_path <model_path> \
  --ground_truth_path <ground_truth_path> \
  --ground_truth <str> \
  --model_type CLASSIFICATION \
  --run_pipeline \
  --annotated_ground_truth_path <str> \
  --file_name_label <str>

Build Plugin

cd aiverify/stock-plugins/aiverify.stock.robustness-toolbox/algorithms/robustness_toolbox
hatch build

Tests

Pytest is used as the testing framework.

Run the following commands to execute the unit and integration tests inside the tests/ folder:

cd aiverify/stock-plugins/aiverify.stock.robustness-toolbox/algorithms/robustness_toolbox
pytest .

Run using Docker

In the aiverify root directory, run the command below to build the Docker image:

docker build -t aiverify-robustness-toolbox -f stock-plugins/aiverify.stock.robustness-toolbox/algorithms/robustness_toolbox/Dockerfile .

Run the bash script below to run the algorithm:

#!/bin/bash
docker run \
  -v $(pwd)/stock-plugins/user_defined_files:/input \
  -v $(pwd)/stock-plugins/aiverify.stock.robustness-toolbox/algorithms/robustness_toolbox/output:/app/aiverify/output \
  aiverify-robustness-toolbox \
  --data_path /input/data/raw_fashion_image_10 \
  --model_path /input/pipeline/mc_image_fashion \
  --ground_truth_path /input/data/pickle_pandas_fashion_mnist_annotated_labels_10.sav \
  --ground_truth label \
  --model_type CLASSIFICATION \
  --run_pipeline \
  --annotated_ground_truth_path /input/data/pickle_pandas_fashion_mnist_annotated_labels_10.sav \
  --file_name_label file_name

If the algorithm runs successfully, the results of the test will be saved in an output folder in the algorithm directory.

Tests

Run the following command to execute the unit and integration tests inside the tests/ folder within the container:

docker run \
  --entrypoint python3 \
  -w /app/aiverify/stock-plugins/aiverify.stock.robustness-toolbox/algorithms/robustness_toolbox \
  aiverify-robustness-toolbox \
  -m pytest .

Download files

Download the file for your platform.

Source Distribution

aiverify_robustness_toolbox-2.2.1.tar.gz (24.1 kB)

Built Distribution


aiverify_robustness_toolbox-2.2.1-py3-none-any.whl (27.6 kB)

File details

Details for the file aiverify_robustness_toolbox-2.2.1.tar.gz.

File metadata

  • Download URL: aiverify_robustness_toolbox-2.2.1.tar.gz
  • Upload date:
  • Size: 24.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Hatch/1.16.5 cpython/3.11.15 HTTPX/0.28.1

File hashes

Hashes for aiverify_robustness_toolbox-2.2.1.tar.gz:

  • SHA256: 9efabd59c4542e52551612c9ecf7d811a23c834467c7fe40477405b17f4f7df7
  • MD5: 0c0ac63815f20d5cfddb059a30243aae
  • BLAKE2b-256: 8f5a4cb9c7ce72e07e80353818bf20c31434b314ef87e1581ea3592dd8ed04b9

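A published hash can be checked against a downloaded file with Python's hashlib. The snippet below demonstrates the mechanics on a stand-in file (for a real check, point `sha256_of` at the downloaded aiverify_robustness_toolbox-2.2.1.tar.gz and compare against the SHA256 value above):

```python
import hashlib
from pathlib import Path

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Stand-in file so the example is self-contained.
Path("example.bin").write_bytes(b"hello")
print(sha256_of("example.bin"))
# → 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```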

File details

Details for the file aiverify_robustness_toolbox-2.2.1-py3-none-any.whl.

File hashes

Hashes for aiverify_robustness_toolbox-2.2.1-py3-none-any.whl:

  • SHA256: 9735b6da5e41ddbdd61f9ed5c2c0cd311401082318870a5e11c8ce8f3a5b4f62
  • MD5: 061acbc476662775647affced3fa334d
  • BLAKE2b-256: df6b61940108b2e609349ec7bdc4cd83334350af9dbc77fb5066f3c35a88caf9

