
Sparse Autoencoder for Mechanistic Interpretability

Project description

Sparse Autoencoder


A sparse autoencoder for mechanistic interpretability research.

Read the docs here.

Train a Sparse Autoencoder in Colab, or install it for your project:

pip install sparse_autoencoder

Features

This library contains:

  1. A sparse autoencoder model, along with all the underlying PyTorch components you need to customise and/or build your own (see the conceptual sketch after this list):
    • Encoder, constrained unit norm decoder and tied bias PyTorch modules in autoencoder.
    • L1 and L2 loss modules in loss.
    • Adam module with a helper method to reset state in optimizer.
  2. An activations data generator using TransformerLens, with the underlying steps exposed in case you want to customise the approach:
    • Activation store options (in-memory or on disk) in activation_store.
    • A hook to get activations from TransformerLens efficiently, in source_model.
    • Source dataset (i.e. prompts to generate these activations) utils in source_data, which stream data from HuggingFace and pre-process it (tokenize & shuffle).
  3. An activation resampler to help reduce the number of dead neurons.
  4. Metrics that log at various stages (e.g. during training, resampling and validation) and integrate with wandb.
  5. A training pipeline that combines everything, allowing you to run hyperparameter sweeps and view progress on wandb.
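
To make these components concrete, here is a minimal conceptual sketch in plain PyTorch of the default architecture and loss (tied pre-encoder bias, ReLU encoder, unit-norm decoder, L2 reconstruction plus L1 sparsity). It is an illustration of the idea only, not the library's own classes or API; the class and argument names are invented for the sketch.

import torch
from torch import nn

class TinySparseAutoencoder(nn.Module):
    """Illustrative stand-in for the library's model (not its real API)."""

    def __init__(self, n_input_features: int, n_learned_features: int) -> None:
        super().__init__()
        # Tied bias: subtracted before encoding, added back after decoding.
        self.tied_bias = nn.Parameter(torch.zeros(n_input_features))
        self.encoder = nn.Linear(n_input_features, n_learned_features)
        self.decoder = nn.Linear(n_learned_features, n_input_features, bias=False)

    def forward(self, x: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        # Keep each dictionary direction (decoder column) at unit norm; the
        # library enforces this with a dedicated constrained-unit-norm module.
        with torch.no_grad():
            self.decoder.weight.div_(self.decoder.weight.norm(dim=0, keepdim=True))
        learned_activations = torch.relu(self.encoder(x - self.tied_bias))
        reconstruction = self.decoder(learned_activations) + self.tied_bias
        return learned_activations, reconstruction

def sparse_autoencoder_loss(x, learned_activations, reconstruction, l1_coefficient=1e-3):
    # L2 reconstruction error plus an L1 penalty on the learned activations.
    l2_reconstruction = (reconstruction - x).pow(2).sum(dim=-1).mean()
    l1_sparsity = learned_activations.abs().sum(dim=-1).mean()
    return l2_reconstruction + l1_coefficient * l1_sparsity

In the library itself these pieces live in the autoencoder, loss and optimizer modules listed above, and the training pipeline wires them together for you.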

Designed for Research

The library is designed to be modular. By default it takes the approach from Towards Monosemanticity: Decomposing Language Models With Dictionary Learning, so you can pip install the library and get started quickly. Then, when you need to customise something, you can extend the class for that component (e.g. extend SparseAutoencoder to customise the model, and then drop it back into the training pipeline; a sketch of this pattern follows). Every component is fully documented, so doing this is straightforward.
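
A minimal sketch of that extension pattern is below. The subclass name, the preprocessing it adds, and the assumption that forward takes the activation tensor directly are all illustrative; check the documentation for the base class's real signatures.

import torch
from sparse_autoencoder import SparseAutoencoder  # base class named above

class CenteredSparseAutoencoder(SparseAutoencoder):
    """Hypothetical variant that mean-centres activations before the usual forward pass."""

    def forward(self, x: torch.Tensor):
        # Assumption for illustration: the parent forward accepts the activation
        # tensor directly; adapt this to the documented signature if it differs.
        return super().forward(x - x.mean(dim=-1, keepdim=True))

Because the rest of the pipeline only depends on the component's interface, a subclass like this can be dropped in wherever the default model was used.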

Demo

Check out the demo notebook docs/content/demo.ipynb for a guide to using this library.

Contributing

This project uses Poetry for dependency management and PoeThePoet for scripts. After checking out the repo, we recommend setting Poetry's config to create the .venv in the root directory (note this is a global setting) and then installing with the dev and demos dependency groups.

poetry config virtualenvs.in-project true
poetry install --with dev,demos

Checks

For a full list of available commands (e.g. test or typecheck), run the following in your terminal (this assumes the venv is already active):

poe

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

sparse_autoencoder-1.10.0.tar.gz (92.6 kB)

Built Distribution

sparse_autoencoder-1.10.0-py3-none-any.whl (137.3 kB)

File details

Details for the file sparse_autoencoder-1.10.0.tar.gz.

File metadata

  • Download URL: sparse_autoencoder-1.10.0.tar.gz
  • Size: 92.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.10.12 Linux/6.2.0-1018-azure

File hashes

Hashes for sparse_autoencoder-1.10.0.tar.gz:

  • SHA256: 1204deea3c3f0cf03174d27d9c926c9d7e8f72a681506f9ed984097d9a4ac151
  • MD5: 4a1fd1be8787091772edfaa93e20148c
  • BLAKE2b-256: 65ad3eed1de8d60804e455165b270abf9fb11017b9fdeca57f0ac93cfda426c2

See more details on using hashes here.
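
If you want to verify a downloaded file against these digests yourself, here is a minimal sketch using only the Python standard library (the filename assumes the default name shown above):

import hashlib

EXPECTED_SHA256 = "1204deea3c3f0cf03174d27d9c926c9d7e8f72a681506f9ed984097d9a4ac151"

# Hash the downloaded source distribution and compare against the listed digest.
with open("sparse_autoencoder-1.10.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED_SHA256, "SHA256 mismatch - do not install this file"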

File details

Details for the file sparse_autoencoder-1.10.0-py3-none-any.whl.

File hashes

Hashes for sparse_autoencoder-1.10.0-py3-none-any.whl:

  • SHA256: 2013aebe12dcc94a101f9e2ab73789646c8c21870780746ba68e68ac6bf3f8f7
  • MD5: 1d0f0d98c4a72f9d783823099d025fa0
  • BLAKE2b-256: a7d481cc2465cc663bea3a5006bcd342908a423fffb3f14a2314a177a875310c

See more details on using hashes here.
