
Personal toolbox for experimenting with Feature Visualization

Project description

Horama: A Compact Library for Feature Visualization Experiments

Horama provides the implementation code for the research paper:

  • Unlocking Feature Visualization for Deeper Networks with MAgnitude Constrained Optimization by Thomas Fel*, Thibaut Boissin*, Victor Boutin*, Agustin Picard*, Paul Novello*, Julien Colin, Drew Linsley, Tom Rousseau, Rémi Cadène, Laurent Gardes, Thomas Serre. Read the paper on arXiv.

In addition, this repository introduces various feature visualization methods, including a reimagined approach to the incredible work of the Clarity team and an implementation of Feature Accentuation from Hamblin et al. For an official reproduction of Distill's work, complete with comprehensive notebooks, we highly recommend Lucent. Horama, by contrast, focuses on experimentation within PyTorch, offering a compact and easily modifiable codebase.

🚀 Getting Started with Horama

Horama requires Python 3.6 or newer and several dependencies, including NumPy. It is built on PyTorch. Installation is straightforward via PyPI:

pip install horama

With Horama installed, you can dive into feature visualization. The API is designed to be intuitive, requiring only a few hyperparameters to get started.

Example usage:

import torch
import timm
from horama import maco, fourier, plot_maco

# optional: render sharper figures in Jupyter notebooks
%config InlineBackend.figure_format = 'retina'

# load a pretrained ResNet-18 on the GPU in evaluation mode
model = timm.create_model('resnet18', pretrained=True).cuda().eval()

# objective: maximize the mean logit of class index 1
objective = lambda images: torch.mean(model(images)[:, 1])

# MaCo visualization
image1, alpha1 = maco(objective)
plot_maco(image1, alpha1)

# Fourier-parameterized visualization
image2, alpha2 = fourier(objective)
plot_maco(image2, alpha2)
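
The objective is simply a callable that maps a batch of images to a scalar to maximize, so you are not limited to output logits. Below is a minimal sketch, continuing from the setup above, that uses a forward hook to maximize the mean activation of one channel in an intermediate layer; the layer (layer3) and channel index (42) are arbitrary choices for illustration, not part of Horama's documented examples.

# capture the activations of an intermediate block with a forward hook
activations = {}
def hook(module, inputs, output):
    activations['feats'] = output

model.layer3.register_forward_hook(hook)  # layer chosen arbitrarily

def channel_objective(images, channel=42):
    model(images)                       # forward pass fills `activations`
    feats = activations['feats']        # shape: (batch, channels, h, w)
    return feats[:, channel].mean()     # mean activation of one channel

image, alpha = maco(channel_objective)
plot_maco(image, alpha)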

Complete API

Horama's API includes the following primary functions:

maco(objective_function,
     total_steps=1000,
     learning_rate=1.0,
     image_size=1000,
     model_input_size=224,
     noise=0.1,
     values_range=(-2.5, 2.5),
     crops_per_iteration=6,
     box_size=(0.20, 0.25),
     device='cuda')

fourier(objective_function,
        decay_power=1.5,
        total_steps=1000,
        learning_rate=1.0,
        image_size=1000,
        model_input_size=224,
        noise=0.1,
        values_range=(-2.5, 2.5),
        crops_per_iteration=6,
        box_size=(0.20, 0.25),
        device='cuda')
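
For instance, a quick, lower-cost preview can be obtained by reducing the number of steps and the optimized image size; the values below are purely illustrative, not recommended defaults, and reuse the objective defined earlier.

# illustrative settings for a fast preview (not recommended defaults)
image, alpha = maco(objective,
                    total_steps=200,
                    image_size=512,
                    crops_per_iteration=4)
plot_maco(image, alpha)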

When optimizing, it's crucial to fine-tune the hyperparameters. Parameters like decay_power in the fourier method significantly impact the visual output, since they control how energy is distributed across frequencies. Additionally, adjust values_range to match your model's preprocessing, and make sure model_input_size matches the input resolution your model expects. Typically, setting noise to about 10% of the input range yields satisfactory results.
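
As an example of matching values_range to preprocessing: if your model normalizes inputs with ImageNet statistics, an approximate range can be derived from the normalized extremes of [0, 1] pixels. The snippet below is a sketch under that assumption (and reuses the earlier objective); it is not an official Horama helper.

# sketch: approximate values_range for a model normalized with
# ImageNet statistics, assuming inputs are first scaled to [0, 1]
mean = torch.tensor([0.485, 0.456, 0.406])
std = torch.tensor([0.229, 0.224, 0.225])

low = float(((0.0 - mean) / std).min())   # most negative normalized value
high = float(((1.0 - mean) / std).max())  # most positive normalized value

image, alpha = maco(objective, values_range=(low, high))
plot_maco(image, alpha)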

Citation

@article{fel2023maco,
      title={Unlocking Feature Visualization for Deeper Networks with MAgnitude Constrained Optimization},
      author={Fel, Thomas and Boissin, Thibaut and Boutin, Victor and Picard, Agustin and Novello, Paul and Colin, Julien and Linsley, Drew and Rousseau, Tom and Cadène, Rémi and Gardes, Laurent and Serre, Thomas},
      year={2023},
}

Additional Resources

For a simpler, maintenance-friendly implementation in TensorFlow, along with the other feature visualization methods used in the paper, check out the Xplique toolbox. We have also created a website, the LENS Project, which features the 1000 classes of ImageNet.

Authors of the code



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

horama-0.0.2.tar.gz (7.4 kB)

Uploaded Source

Built Distribution

Horama-0.0.2-py3-none-any.whl (9.1 kB)

Uploaded Python 3

File details

Details for the file horama-0.0.2.tar.gz.

File metadata

  • Download URL: horama-0.0.2.tar.gz
  • Upload date:
  • Size: 7.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for horama-0.0.2.tar.gz
  • SHA256: c724e01b5baf8725577426b24a032882250c92ac65c3227291568b42ded7d6ad
  • MD5: dcc5810c35960b747acdb64c187a2e89
  • BLAKE2b-256: 4d26e2469758d4c00400e67e8ae524dd8341f7328b249153deff24bae6852eba



File details

Details for the file Horama-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: Horama-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 9.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for Horama-0.0.2-py3-none-any.whl
  • SHA256: c4b35620b88fafd92af74a7f40f8bc7e1ebfd51369ef68ec724456c4a0b3d814
  • MD5: bcab7e789c928c33d836d90fd47d6194
  • BLAKE2b-256: f82deff1e050eb555f7d6ea10f669c088cf4275157e96f1b9f6391466704ea65


