
Reverse Predictivity

A research codebase accompanying the preprint:

Reverse Predictivity: Going Beyond One-Way Mapping to Compare Artificial Neural Network Models and Brains, Muzellec & Kar, bioRxiv (posted August 8, 2025)

This repository supports analyses comparing macaque inferior temporal (IT) cortex responses with artificial neural network (ANN) units, using a reverse predictivity metric that assesses how well neural responses predict ANN activations.

Compare brains and models in both directions.

This repository implements reverse predictivity: a complementary evaluation to forward neural predictivity that asks how well neural responses predict ANN activations. It provides utilities to map macaque IT population responses to model units, quantify bidirectional alignment, and reproduce the manuscript figures.

🧠 What is reverse predictivity?

Traditional forward neural predictivity evaluates how well a model’s features linearly predict neural responses. Reverse predictivity inverts that lens: it uses neural responses to predict model units. Agreement in both directions strengthens the claim that a model and a brain area share representations. Practically, this repo includes:

  • Regression utilities for both mapping directions: decoding IT neurons from ANN units (forward) and decoding ANN unit activations from IT population responses (reverse)
  • Image‑level metrics and correlation suites to compare human/ANN/neural behaviors
  • End‑to‑end notebooks to reproduce figures
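The two mapping directions can be sketched with the same regression machinery applied in opposite directions. Below is a minimal, self-contained illustration using closed-form ridge regression on synthetic data; it is a conceptual sketch, not the repository's actual implementation (which lives in model_to_monkey.py and monkey_to_model.py), and all names here are illustrative.

```python
import numpy as np

def ridge_fit_predict(X_train, Y_train, X_test, alpha=1.0):
    """Closed-form ridge regression mapping X -> Y."""
    X_mu, Y_mu = X_train.mean(0), Y_train.mean(0)
    Xc, Yc = X_train - X_mu, Y_train - Y_mu
    d = Xc.shape[1]
    W = np.linalg.solve(Xc.T @ Xc + alpha * np.eye(d), Xc.T @ Yc)
    return (X_test - X_mu) @ W + Y_mu

rng = np.random.default_rng(0)
n_images, n_neurons, n_units = 200, 30, 50
shared = rng.normal(size=(n_images, 10))  # latent representation shared by both systems
neural = shared @ rng.normal(size=(10, n_neurons)) + 0.1 * rng.normal(size=(n_images, n_neurons))
model  = shared @ rng.normal(size=(10, n_units))  + 0.1 * rng.normal(size=(n_images, n_units))

train, test = slice(0, 150), slice(150, 200)

# Forward predictivity: model units -> neurons
pred_neural = ridge_fit_predict(model[train], neural[train], model[test])
fwd = np.mean([np.corrcoef(pred_neural[:, i], neural[test][:, i])[0, 1]
               for i in range(n_neurons)])

# Reverse predictivity: neurons -> model units
pred_model = ridge_fit_predict(neural[train], model[train], neural[test])
rev = np.mean([np.corrcoef(pred_model[:, j], model[test][:, j])[0, 1]
               for j in range(n_units)])

print(f"forward r = {fwd:.2f}, reverse r = {rev:.2f}")
```

Because the synthetic neural and model responses share a common latent signal, both directions recover high held-out correlations; with real data the two numbers can diverge, which is the point of the metric.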

🗂️ Repository layout

  • demo_forward_predictivity.ipynb – quick demo of forward mapping model units -> neurons
  • demo_reverse_predictivity.ipynb – quick demo of reverse mapping model units <- neurons
  • demo_generate_neurons_i1.ipynb – compute image‑level neural metrics
  • demo_generate_model_i1.ipynb – compute image‑level model metrics
  • figure[1-6].ipynb – figure reproduction notebooks
  • model_to_monkey.py – utilities for model -> neural regression and evaluation
  • monkey_to_model.py – utilities for model <- neural regression and evaluation
  • correlation_metrics.py – Spearman/Pearson, reliability‑aware correlations, confidence intervals
  • regression_metrics.py – regression helpers
  • prediction_utils.py – shared helpers for prediction/decoding
  • decode_utils.py – train/test splits, cross‑validation, split‑half routines
  • figure_utils.py – journal‑style plotting helpers
  • h5_utils.py – helpers to read/write HDF5 feature and metadata files
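correlation_metrics.py is described as providing reliability-aware correlations. A standard ingredient for that is split-half reliability with Spearman-Brown correction; the sketch below is a minimal numpy version of that technique, offered as an illustration rather than the repository's exact implementation.

```python
import numpy as np

def split_half_reliability(responses, n_splits=100, seed=0):
    """Spearman-Brown-corrected split-half reliability per neuron.

    responses: (n_images, n_neurons, n_reps) array of repeated presentations.
    """
    rng = np.random.default_rng(seed)
    n_images, n_neurons, n_reps = responses.shape
    rels = np.empty((n_splits, n_neurons))
    for s in range(n_splits):
        perm = rng.permutation(n_reps)
        a = responses[:, :, perm[: n_reps // 2]].mean(axis=2)
        b = responses[:, :, perm[n_reps // 2:]].mean(axis=2)
        for n in range(n_neurons):
            r = np.corrcoef(a[:, n], b[:, n])[0, 1]
            rels[s, n] = 2 * r / (1 + r)  # Spearman-Brown correction
    return rels.mean(axis=0)

# Synthetic check: a reliable per-image signal plus per-repetition noise
rng = np.random.default_rng(1)
signal = rng.normal(size=(100, 5, 1))
noise = 0.5 * rng.normal(size=(100, 5, 10))
rel = split_half_reliability(signal + noise)
```

Reliability estimates like these are typically used as a ceiling to normalize predictivity scores, so that a model is not penalized for neural noise it could never explain.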

📦 Large data files (IT features, image sets) are not stored in the repo. They can be downloaded from: here

🛠️ Installation

We recommend Python ≥3.10 with a fresh environment (Conda or venv).

# Using conda
conda create -n reverse_pred python=3.10 -y
conda activate reverse_pred

# Install core dependencies
pip install numpy scipy scikit-learn matplotlib h5py
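The package is also published on PyPI under the name reverse_pred, so the utilities should be installable directly (the demo notebooks and figure notebooks still need to be obtained from the repository):

```shell
pip install reverse_pred
```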

📥 Data & preparation

This project assumes access to:

  1. Macaque IT responses: population responses for N images.
    • /neural_data shape (n_images, n_neurons, n_reps)
  2. Model features: precomputed ANN activations for the same images
    • /model_features shape (n_images, n_units)
  3. Human/primate behavior: image‑level accuracies
    • /behavior shape (n_images,)
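The expected array layouts can be sketched with synthetic numpy arrays. The repetition-averaging step shown at the end is a common preprocessing choice for this kind of data, not necessarily the repository's exact pipeline, and the sizes are hypothetical.

```python
import numpy as np

n_images, n_neurons, n_reps, n_units = 640, 100, 20, 512  # hypothetical sizes

# Arrays matching the expected dataset layouts
neural_data    = np.random.rand(n_images, n_neurons, n_reps)  # /neural_data
model_features = np.random.rand(n_images, n_units)            # /model_features
behavior       = np.random.rand(n_images)                     # /behavior

# Typical first step: average neural responses across repetitions
neural_mean = neural_data.mean(axis=2)
print(neural_mean.shape)  # (n_images, n_neurons)
```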

🚀 Quickstart

  • demo_forward_predictivity.ipynb – step‑by‑step guide to fitting a model‑to‑neuron regression and evaluating correlations.
  • demo_reverse_predictivity.ipynb – end‑to‑end demonstration of neuron‑to‑model regression, computing explained‑variance (EV) and correlation metrics.
  • demo_generate_neurons_i1.ipynb – generates image‑level accuracies from neural decoders.
  • demo_generate_model_i1.ipynb – extracts image‑level model metrics from ANN activations.
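At their core, the image-level (i1) notebooks produce per-image accuracy vectors that can then be compared across systems. A minimal sketch of such a comparison follows; the function and variable names are illustrative, not the repository's API.

```python
import numpy as np

def i1_consistency(model_i1, neural_i1):
    """Pearson correlation between two per-image accuracy vectors."""
    return np.corrcoef(model_i1, neural_i1)[0, 1]

# Synthetic example: two systems whose accuracies track a shared image difficulty
rng = np.random.default_rng(2)
difficulty = rng.uniform(size=500)
model_i1  = np.clip(1 - difficulty + 0.1 * rng.normal(size=500), 0, 1)
neural_i1 = np.clip(1 - difficulty + 0.1 * rng.normal(size=500), 0, 1)
consistency = i1_consistency(model_i1, neural_i1)
```

In practice, i1 consistency is usually noise-corrected with split-half reliability of each vector; the raw correlation above is the uncorrected core of the metric.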

🔁 Reproducing manuscript figures

Each figureX.ipynb notebook reproduces the corresponding figure from the preprint. Notebooks expect the data assets described above. If paths differ, change the config cell at the top of each notebook.
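A config cell of the kind described might look like the following; every path here is a hypothetical placeholder, not the repository's actual default, and should be pointed at your local copies of the data assets.

```python
from pathlib import Path

# Hypothetical config cell: edit these to match your local setup
DATA_DIR  = Path("data")
NEURAL_H5 = DATA_DIR / "neural_data.h5"     # /neural_data: (n_images, n_neurons, n_reps)
MODEL_H5  = DATA_DIR / "model_features.h5"  # /model_features: (n_images, n_units)
FIG_DIR   = Path("figures")                 # where reproduced figures are written
```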

  • Figure 1: Forward Predictivity
  • Figure 2: Reverse vs forward predictivity examples
  • Figure 3: Reverse vs forward predictivity across monkeys and models
  • Figure 4: Influencing factors
  • Figure 5: Analysis of unique units
  • Figure 6: Link with behavior

📌 Status & citation

This codebase accompanies the preprint:

Muzellec, S. & Kar, K. (2025). Reverse Predictivity: Going Beyond One‑Way Mapping to Compare Artificial Neural Network Models and Brains. bioRxiv.

If you use this repository or ideas from it, please cite the preprint and link to this repo.

@article{muzellec_kar_2025_reversepredictivity,
  title  = {Reverse Predictivity: Going Beyond One-Way Mapping to Compare Artificial Neural Network Models and Brains},
  author = {Muzellec, Sabine and Kar, Kohitij},
  year   = {2025},
  journal= {bioRxiv}
}

License: MIT (see LICENSE).
