Fool AI Framework

About

A command-line tool for fooling AI/machine learning models. Concretely, it generates adversarial examples that degrade the predictions of ML models.

Easy to use and lightweight.

Supported Methods

This project is under development, so it supports only a few methods so far.

Task                    Attack Methods
Image Classification    Adversarial Examples (FGSM)
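
For reference, FGSM (Fast Gradient Sign Method) perturbs the input in the direction of the sign of the loss gradient: x_adv = x + epsilon * sign(grad_x L(x, y)). Below is a minimal PyTorch sketch of the idea; the model, labels, and epsilon are placeholders for illustration, not part of the foolai API.

import torch
import torch.nn.functional as F

def fgsm(model, x, y, epsilon=0.03):
    # x: input image batch, y: true (or predicted) labels
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    # Take one step in the direction that increases the loss
    x_adv = x + epsilon * x.grad.sign()
    return x_adv.clamp(0, 1).detach()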

Usage

foolai --help

Fool Image Classification Models on Hugging Face

# --model/-m: Target model. Specify a repository ID on Hugging Face.
# --img/-i: Original image used to generate adversarial examples.
foolai fool -m microsoft/resnet-50 -i dog.jpg
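
Conceptually, the attack amounts to something like the sketch below, written here with the transformers and torch libraries against microsoft/resnet-50. This is an illustration of the FGSM step against a Hugging Face image classifier, not foolai's actual implementation, and the epsilon value 0.05 is an arbitrary choice.

import torch
import torch.nn.functional as F
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("microsoft/resnet-50")
model = AutoModelForImageClassification.from_pretrained("microsoft/resnet-50")
model.eval()

# Preprocess the original image
pixels = processor(images=Image.open("dog.jpg"), return_tensors="pt")["pixel_values"]
pixels.requires_grad_(True)

# Use the model's own prediction as the label to move away from
logits = model(pixel_values=pixels).logits
label = logits.argmax(dim=-1)
loss = F.cross_entropy(logits, label)
loss.backward()

# FGSM step in normalized pixel space
adv = pixels + 0.05 * pixels.grad.sign()
adv_label = model(pixel_values=adv).logits.argmax(dim=-1)
print(model.config.id2label[label.item()], "->", model.config.id2label[adv_label.item()])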

Installation

From Pip

pip install foolai
foolai --help

From Git Repo

git clone https://github.com/hideckies/foolai.git
cd foolai
poetry shell
poetry install
foolai --help

Disclaimer

  • This is an experimental project, and ML models are constantly evolving, so it may not always work well.
  • Currently, only Hugging Face models are supported.
