
Sparse probing benchmark for Sparse Autoencoders derived from the paper "Are Sparse Autoencoders Useful? A Case Study in Sparse Probing"

Project description

SAE Probes Benchmark

PyPI License: MIT build

This repository contains the code for the paper Are Sparse Autoencoders Useful? A Case Study in Sparse Probing, but has been reformatted into a Python package that will work with any SAE that can be loaded in SAELens. This makes it easy to use the sparse probing tasks from the paper as a standalone SAE benchmark.

Installation

pip install sae-probes

Running evaluations

You can run benchmarks directly; any missing model activations are generated on demand. If you don't pass a model_cache_path, a temporary directory is used and cleaned up when the function completes. To persist activations across runs (recommended for repeated experiments), provide a model_cache_path.
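For example, a minimal baseline run with no persistent cache looks like the call below (the results_path here is a placeholder; the full examples in the following sections cover the remaining options):

from sae_probes import run_baseline_evals

# No model_cache_path: activations are generated into a temporary
# directory and deleted once the evaluation finishes.
run_baseline_evals(
  model_name="gemma-2-2b",
  hook_name="blocks.12.hook_resid_post",
  setting="normal",
  results_path="/results/output/path",
)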

Training Probes

Probes can be trained directly on the model activations (baselines) or on SAE activations. In both cases, three dataset settings are available: "normal", "scarcity", and "imbalance". For details about these settings, see the original paper. For the most standard sparse-probing benchmark, use the "normal" setting.

SAE Probes

The most standard use of this library is as a sparse probing benchmark for SAEs using the normal setting. This is demonstrated below:

from sae_probes import run_sae_evals
from sae_lens import SAE

# run the benchmark on a Gemma Scope SAE
release = "gemma-scope-2b-pt-res-canonical"
sae_id = "layer_12/width_16k/canonical"
sae = SAE.from_pretrained(release, sae_id)

run_sae_evals(
  sae=sae,
  model_name="gemma-2-2b",
  hook_name="blocks.12.hook_resid_post",
  reg_type="l1", # regularization used when fitting probes on the SAE latents
  setting="normal",
  results_path="/results/output/path",
  # model_cache_path is optional; if omitted, a temp dir is used and cleared after
  model_cache_path="/path/to/saved/activations",
  ks=[1, 16], # numbers of SAE latents (k) each sparse probe is allowed to use
)

The sparse probing results will be saved to results_path as one JSON file per dataset.

Baseline Probes

You can now run baseline probes using a unified API that matches the SAE evaluation interface:

from sae_probes import run_baseline_evals

# Run baseline probes with consistent API
run_baseline_evals(
  model_name="gemma-2-2b",
  hook_name="blocks.12.hook_resid_post",
  setting="normal",  # or "scarcity", "imbalance"
  results_path="/results/output/path",
  # model_cache_path is optional; if omitted, a temp dir is used and cleared after
  model_cache_path="/path/to/saved/activations",
)

Output Format

Both SAE and baseline probes now save results as JSON files with consistent structure:

  • SAE results: sae_probes_{model_name}/{setting}_setting/{dataset}_{hook_name}_{reg_type}.json
  • Baseline results: baseline_results_{model_name}/{setting}_setting/{dataset}_{hook_name}_{method}.json

Each JSON file contains a list of result records with metrics and metadata, making it easy to compare SAE and baseline approaches.
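As a rough sketch of working with this output, the snippet below collects the per-dataset SAE result files for the "normal" setting. It assumes the paths above are relative to the results_path passed to the eval functions, and it does not assume any particular metric names inside each record:

import json
from pathlib import Path

results_path = Path("/results/output/path")
sae_dir = results_path / "sae_probes_gemma-2-2b" / "normal_setting"

# Each file holds a list of result records for one dataset.
for result_file in sorted(sae_dir.glob("*.json")):
  records = json.loads(result_file.read_text())
  print(result_file.stem, len(records), "records")
  if records:
    # Inspect the available keys rather than hard-coding metric names.
    print("  keys:", sorted(records[0].keys()))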

Optional: Pre-generating model activations

Pre-generating can speed up repeated runs and lets you inspect the saved activation tensors. It's optional: the benchmarks auto-generate any missing activations on their first run.

from sae_probes import generate_dataset_activations

generate_dataset_activations(
  model_name="gemma-2-2b", # the TransformerLens name of the model
  hook_names=["blocks.12.hook_resid_post"], # Any TLens hook names
  batch_size=64,
  device="cuda",
  model_cache_path="/path/to/save/activations",
)

If you skip pre-generation, the benchmarks will create any missing activations automatically. Passing a model_cache_path persists them; if omitted, activations will be written to a temporary directory that is deleted after the run.
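If you want to check what the cache contains before pointing a benchmark at it, a simple directory listing is enough; the file layout inside the cache is an internal detail of the package, so this only reports names and sizes:

from pathlib import Path

cache = Path("/path/to/save/activations")
for f in sorted(cache.rglob("*")):
  if f.is_file():
    print(f.relative_to(cache), f"{f.stat().st_size / 1e6:.1f} MB")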

Citation

If you use this code in your research, please cite:

@inproceedings{kantamnenisparse,
  title={Are Sparse Autoencoders Useful? A Case Study in Sparse Probing},
  author={Kantamneni, Subhash and Engels, Joshua and Rajamanoharan, Senthooran and Tegmark, Max and Nanda, Neel},
  booktitle={Forty-second International Conference on Machine Learning},
  year={2025}
}

Download files

Download the file for your platform.

Source Distribution

sae_probes-0.1.4.tar.gz (45.0 MB)

Uploaded Source

Built Distribution


sae_probes-0.1.4-py3-none-any.whl (45.0 MB)

Uploaded Python 3

File details

Details for the file sae_probes-0.1.4.tar.gz.

File metadata

  • Download URL: sae_probes-0.1.4.tar.gz
  • Upload date:
  • Size: 45.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for sae_probes-0.1.4.tar.gz:

  • SHA256: 189ae7e9dbc6d3f7f29493dd5a9cf097229377c0a6e3d1bee07da474ca0a701a
  • MD5: c212a175f2d0b76d2a72b9494b9cc293
  • BLAKE2b-256: 3a6e25161d658c262a6872c969c0302c41a8e5a8e4b914eb1edcc3c70a3104c2


Provenance

The following attestation bundles were made for sae_probes-0.1.4.tar.gz:

Publisher: ci.yaml on sae-probes/sae-probes

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file sae_probes-0.1.4-py3-none-any.whl.

File metadata

  • Download URL: sae_probes-0.1.4-py3-none-any.whl
  • Upload date:
  • Size: 45.0 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for sae_probes-0.1.4-py3-none-any.whl:

  • SHA256: 96a26a26b983449842e0682886d7aa382d28cace692b73da7d278e6b79e9ef13
  • MD5: 79879bef5b07b17a53dafb412fbe2e96
  • BLAKE2b-256: 9e7cda17d9a6e81bc31dc279cabe922015704a0abc7fe3a0ffce89979f41cc0f


Provenance

The following attestation bundles were made for sae_probes-0.1.4-py3-none-any.whl:

Publisher: ci.yaml on sae-probes/sae-probes

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
