Fool AI Framework
About
A command-line tool for fooling AI/machine learning models. Concretely, it generates adversarial examples that degrade the predictions of ML models.
Easy to use and lightweight.
Supported Methods
This project is under development, so only a few methods are supported yet.
Tasks | Attack Methods |
---|---|
Image Classification | Adversarial Examples (FGSM) |
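The README does not show how foolai implements FGSM, but the attack itself is simple: perturb the input in the direction of the sign of the loss gradient, x_adv = x + ε · sign(∇ₓ L(x, y)). Below is a minimal, self-contained NumPy sketch of that idea using a toy linear softmax "model" (so the input gradient has a closed form); all names here are illustrative and are not the foolai API.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def fgsm(x, y, W, b, epsilon=0.1):
    """FGSM sketch for a linear softmax model (illustrative only).

    For cross-entropy loss, the gradient w.r.t. the input is
    W.T @ (p - onehot(y)), so no autodiff is needed here.
    """
    p = softmax(W @ x + b)
    onehot = np.zeros_like(p)
    onehot[y] = 1.0
    grad_x = W.T @ (p - onehot)
    # Step by epsilon in the gradient-sign direction, keep pixels in [0, 1].
    return np.clip(x + epsilon * np.sign(grad_x), 0.0, 1.0)

# Toy demo: a 2-class model on a 4-pixel "image".
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 4))
b = np.zeros(2)
x = rng.uniform(size=4)
y = int(np.argmax(softmax(W @ x + b)))  # the model's current prediction

x_adv = fgsm(x, y, W, b, epsilon=0.3)
```

Against a real image classifier, the same recipe applies, except the gradient comes from backpropagation through the network rather than a closed-form expression.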
Usage
foolai --help
Fool Image Classification Models on Hugging Face
# --model/-m: Target model. Set repository ID on Hugging Face.
# --img/-i: Original image used to generate the adversarial example.
foolai fool -m microsoft/resnet-50 -i dog.jpg
Installation
From Pip
pip install foolai
foolai --help
From Git Repo
git clone https://github.com/hideckies/foolai.git
cd foolai
poetry shell
poetry install
foolai --help
Disclaimer
- This is an experimental project and ML models are constantly evolving, so it may not always work well.
- Currently, only Hugging Face models are supported.