Personal toolbox for experimenting with Feature Visualization

Project description

Horama: A Compact Library for Feature Visualization Experiments

Horama provides the implementation code for the research paper:

  • Unlocking Feature Visualization for Deeper Networks with MAgnitude Constrained Optimization by Thomas Fel*, Thibaut Boissin*, Victor Boutin*, Agustin Picard*, Paul Novello*, Julien Colin, Drew Linsley, Tom Rousseau, Rémi Cadène, Laurent Gardes, Thomas Serre. Read the paper on arXiv.

In addition, this repository introduces various feature visualization methods, including a reimagined approach to the incredible work of the Clarity team and an implementation of Feature Accentuation from Hamblin et al. For an official reproduction of Distill's work, complete with comprehensive notebooks, we highly recommend Lucent. However, Horama focuses on experimentation within PyTorch, offering a compact and modifiable codebase.

🚀 Getting Started with Horama

Horama requires Python 3.6 or newer and several dependencies, including Numpy and PyTorch. Installation is straightforward from PyPI:

pip install horama

With Horama installed, you can dive into feature visualization. The API is designed to be intuitive, requiring only a few hyperparameters to get started.

Example usage:

import torch
import timm
from horama import maco, fourier, plot_maco

# Jupyter-only magic for sharper inline figures; omit it in a plain script
%config InlineBackend.figure_format = 'retina'

# Load a pretrained classifier and freeze it in eval mode
model = timm.create_model('resnet18', pretrained=True).cuda().eval()

# Objective: maximize the mean logit of class 1
objective = lambda images: torch.mean(model(images)[:, 1])

# MACO visualization: returns the image and its transparency mask
image1, alpha1 = maco(objective)
plot_maco(image1, alpha1)

# Fourier (Clarity/Distill-style) visualization
image2, alpha2 = fourier(objective)
plot_maco(image2, alpha2)
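
Because the objective is just a callable that maps a batch of images to a scalar, it is not limited to output logits. The sketch below is our own illustration rather than part of Horama's documentation; the hook on layer3 and the channel index are arbitrary choices for demonstration.

import torch
import timm
from horama import maco, plot_maco

model = timm.create_model('resnet18', pretrained=True).cuda().eval()

# Capture the activations of an intermediate block with a forward hook
activations = {}
model.layer3.register_forward_hook(
    lambda module, inputs, output: activations.update(feat=output))

def channel_objective(images, channel=42):
    model(images)
    # Maximize the mean activation of one channel in layer3
    return activations['feat'][:, channel].mean()

image, alpha = maco(channel_objective)
plot_maco(image, alpha)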

Complete API

Horama's API includes the following primary functions:

maco(objective_function,
     total_steps=1000,
     learning_rate=1.0,
     image_size=1000,
     model_input_size=224,
     noise=0.1,
     values_range=(-2.5, 2.5),
     crops_per_iteration=6,
     box_size=(0.20, 0.25),
     device='cuda')

fourier(objective_function,
        decay_power=1.5,
        total_steps=1000,
        learning_rate=1.0,
        image_size=1000,
        model_input_size=224,
        noise=0.1,
        values_range=(-2.5, 2.5),
        crops_per_iteration=6,
        box_size=(0.20, 0.25),
        device='cuda')
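
As a quick illustration of these signatures, the Fourier method can be called with its hyperparameters spelled out explicitly. The values below are illustrative, not recommendations; only total_steps differs from the defaults listed above.

import torch
import timm
from horama import fourier, plot_maco

model = timm.create_model('resnet18', pretrained=True).cuda().eval()
objective = lambda images: torch.mean(model(images)[:, 1])

image, alpha = fourier(
    objective,
    decay_power=1.5,            # how fast energy decays across frequencies
    total_steps=500,            # fewer steps for a quicker preview
    learning_rate=1.0,
    image_size=1000,
    model_input_size=224,       # must match the model's expected input size
    noise=0.1,
    values_range=(-2.5, 2.5),   # should match the model's preprocessing
    crops_per_iteration=6,
    box_size=(0.20, 0.25),
    device='cuda')
plot_maco(image, alpha)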

When optimizing, it's crucial to fine-tune the hyperparameters. Parameters like the decay spectrum (decay_power) in the Fourier method significantly impact the visual output by controlling how energy is distributed across frequencies. Additionally, adjust values_range to match your model's preprocessing, and make sure model_input_size matches the input size your model expects. Typically, setting the noise parameter to about 10% of the input range yields satisfactory results.
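
For example, one way to match values_range and model_input_size to a timm model is to read them off its preprocessing config. This is a minimal sketch, assuming the usual (x - mean) / std normalization of pixels in [0, 1]; the derivation below is not part of Horama's API.

import torch
import timm
from timm.data import resolve_data_config

model = timm.create_model('resnet18', pretrained=True).cuda().eval()

# Resolve the preprocessing config the model was trained with
cfg = resolve_data_config({}, model=model)
mean, std = torch.tensor(cfg['mean']), torch.tensor(cfg['std'])

# Pixels in [0, 1] map to ((0 - mean) / std, (1 - mean) / std) after normalization
values_range = (((0.0 - mean) / std).min().item(),
                ((1.0 - mean) / std).max().item())
model_input_size = cfg['input_size'][-1]  # e.g. 224 for resnet18

# With ImageNet statistics this gives roughly (-2.1, 2.6), close to the
# default (-2.5, 2.5); pass both values to maco or fourier.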

Citation

@article{fel2023maco,
      title={Unlocking Feature Visualization for Deeper Networks with MAgnitude Constrained Optimization},
      author={Fel, Thomas and Boissin, Thibaut and Boutin, Victor and Picard, Agustin and Novello, Paul and Colin, Julien and Linsley, Drew and Rousseau, Tom and Cadène, Rémi and Gardes, Laurent and Serre, Thomas},
      journal={Advances in Neural Information Processing Systems (NeurIPS)},
      year={2023},
}

Additional Resources

For a simpler, better-maintained TensorFlow implementation and the other feature visualization methods used in the paper, check out the Xplique toolbox. Additionally, we have created a website, the LENS Project, which features visualizations for the 1000 classes of ImageNet.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

horama-0.0.3.tar.gz (7.5 kB)

Uploaded Source

Built Distribution

Horama-0.0.3-py3-none-any.whl (9.1 kB)

Uploaded Python 3

File details

Details for the file horama-0.0.3.tar.gz.

File metadata

  • Download URL: horama-0.0.3.tar.gz
  • Upload date:
  • Size: 7.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.3

File hashes

Hashes for horama-0.0.3.tar.gz

  • SHA256: 4ee2217d02b0f815d566d17c2e239dc95b0b9b6e0a634e33ae9b59f1b98ebd67
  • MD5: 566e396e7413f402256e9745d715bb37
  • BLAKE2b-256: 830dfae00559da5f5598bf3e980bf0a83aeae4abdf1aaa9be6b0673f897e8115


File details

Details for the file Horama-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: Horama-0.0.3-py3-none-any.whl
  • Upload date:
  • Size: 9.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.3

File hashes

Hashes for Horama-0.0.3-py3-none-any.whl

  • SHA256: 3f05718a4397de26197b3d35679f7acbbb499b07e7e8ad1b632fb6cdf6bf5bde
  • MD5: 3654ae9f06408622da1ddd9195d4606b
  • BLAKE2b-256: 7192fc4b0cd735eaf7fd2a8d46aa429c4564580e2bfd43a97ff12b3b3f3a119a

