
A library for the OpenMOSS Mechanistic Interpretability Team's Sparse Autoencoder (SAE) research. Open-source and constantly updated.

Project description

Language-Model-SAEs

[!IMPORTANT] The examples are currently outdated and some parallelism strategies are not working due to limited bandwidth. We are working on better organizing recent updates and will get everything working again as soon as possible.

Language-Model-SAEs is a comprehensive, fully-distributed framework designed for training, analyzing and visualizing Sparse Autoencoders (SAEs), empowering scalable and systematic Mechanistic Interpretability research.

News

Features

  • Scalability: Our framework is fully distributed with arbitrary combinations of data, model, and head parallelism for both training and analysis. Enjoy training SAEs with millions of features!
  • Flexibility: We support a wide range of SAE variants, including vanilla SAEs, Lorsa (Low-rank Sparse Attention), CLT (Cross-layer Transcoder), MOLT (Mixture of Linear Transforms), Crosscoder, and more. Each variant can be combined with different activation functions (e.g., ReLU, JumpReLU, TopK, BatchTopK) and sparsity penalties (e.g., L1, Tanh); a minimal sketch of a vanilla TopK SAE follows this list.
  • Easy to Use: We provide high-level runner APIs to quickly launch experiments with simple configurations. Check our examples for verified hyperparameters.
  • Visualization: We provide a unified web interface to visualize learned SAE variants and their features.
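
To make the terminology above concrete, here is a minimal, illustrative sketch of a vanilla SAE with a TopK activation in plain PyTorch. It is not the library's implementation: the names are ours, and real training additionally involves the sparsity penalties, normalization, and parallelism that the framework handles for you.

import torch
import torch.nn as nn

class TopKSAE(nn.Module):
    # Illustrative vanilla SAE with a TopK activation; not the lm_saes implementation.
    def __init__(self, d_model: int, d_sae: int, k: int):
        super().__init__()
        self.W_enc = nn.Parameter(torch.randn(d_model, d_sae) * 0.01)
        self.W_dec = nn.Parameter(torch.randn(d_sae, d_model) * 0.01)
        self.b_enc = nn.Parameter(torch.zeros(d_sae))
        self.b_dec = nn.Parameter(torch.zeros(d_model))
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Encode: project the (centered) model activation into the overcomplete feature basis.
        pre_acts = (x - self.b_dec) @ self.W_enc + self.b_enc
        # TopK activation: keep only the k largest pre-activations per token, zero out the rest.
        top = torch.topk(pre_acts, self.k, dim=-1)
        feature_acts = torch.zeros_like(pre_acts).scatter_(-1, top.indices, top.values)
        # Decode: reconstruct the original activation from the sparse feature activations.
        return feature_acts @ self.W_dec + self.b_dec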

Installation

Use pip to install Language-Model-SAEs:

pip install lm-saes==2.0.0b25

We also highly recommend using uv to manage your own project dependencies. You can use

uv add lm-saes==2.0.0b25

to add Language-Model-SAEs as your project dependency.

Development

We use uv, an alternative to poetry or pdm, to manage dependencies. To install the required packages, install uv and run the following command:

uv sync

This will install all the required packages for the codebase into the .venv directory. For Ascend NPU support, run

uv sync --extra npu

If you want to use the visualization tools, you also need to install the required packages for the frontend, which uses bun for dependency management. Follow the instructions on the bun website to install it, and then run the following commands:

cd ui
bun install

Launch an Experiment

Explore the examples to see the basic usage of training and analyzing SAEs in different configurations. Note that a MongoDB instance is recommended for recording model/dataset/SAE configurations and required for storing analysis results. For more advanced usage, you can explore the src/lm_saes/runners folder for the interfaces for generating activations and for training and analyzing SAE variants, and write your own training or analysis scripts at the runner level.
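
To give a flavor of how a runner-level script is typically structured (a configuration object passed to a runner function), here is a purely hypothetical sketch. Every identifier below is a stand-in that we define ourselves; it is not the lm_saes API, so treat the examples folder and src/lm_saes/runners as the source of truth.

from dataclasses import dataclass

# Hypothetical stand-ins for illustration only; see src/lm_saes/runners for the real interface.
@dataclass
class TrainSAESettings:
    model_name: str     # model whose activations the SAE is trained on
    hook_point: str     # activation site, in TransformerLens naming
    d_sae: int          # number of dictionary features
    activation_fn: str  # e.g. "relu", "jumprelu", "topk", "batchtopk"
    mongo_uri: str      # MongoDB instance for recording configs and analyses

def run_training(settings: TrainSAESettings) -> None:
    # A real runner would generate activations, train the chosen SAE variant,
    # and record the model/dataset/SAE configurations and analyses to MongoDB.
    print(f"Training a {settings.d_sae}-feature SAE on {settings.hook_point} of {settings.model_name}")

run_training(
    TrainSAESettings(
        model_name="gpt2",
        hook_point="blocks.6.hook_resid_post",
        d_sae=32768,
        activation_fn="topk",
        mongo_uri="mongodb://localhost:27017",
    )
)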

Visualizing the Learned Dictionary

The analysis results are stored in MongoDB, and you can use the provided visualization tools to visualize the learned dictionary. First, start the FastAPI server by running the following command:

uvicorn server.app:app --port 24577 --env-file server/.env

Then, copy the ui/.env.example file to ui/.env and modify BACKEND_URL to match your server settings. Assuming the default port used above, ui/.env would contain a line like:
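
BACKEND_URL=http://localhost:24577

Finally, start the frontend by running the following commands: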

cd ui
bun dev --port 24576

That's it! You can now go to http://localhost:24576 to visualize the learned dictionary and its features.

Contributing

We highly welcome contributions to this project. If you have any questions or suggestions, feel free to open an issue or a pull request. We are looking forward to hearing from you!

TODO: Add development guidelines

Acknowledgement

The design of the pipeline (including the configuration and some training details) is highly inspired by the mats_sae_training project (now known as SAELens) and heavily relies on the TransformerLens library. We thank the authors for their great work.

Citation

Please cite this library as:

@misc{Ge2024OpenMossSAEs,
    title  = {OpenMoss Language Model Sparse Autoencoders},
    author = {Xuyang Ge and Wentao Shu and Junxuan Wang and Guancheng Zhou and Jiaxing Wu and Fukang Zhu and Lingjie Chen and Zhengfu He},
    url    = {https://github.com/OpenMOSS/Language-Model-SAEs},
    year   = {2024}
}



Download files

Download the file for your platform.

Source Distribution

lm_saes-2.0.0b25.tar.gz (224.3 kB, source)

Built Distribution


lm_saes-2.0.0b25-py3-none-any.whl (270.9 kB, Python 3 wheel)

File details

Details for the file lm_saes-2.0.0b25.tar.gz.

File metadata

  • Download URL: lm_saes-2.0.0b25.tar.gz
  • Upload date:
  • Size: 224.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for lm_saes-2.0.0b25.tar.gz

  • SHA256: 96e340641e43803fe1bb2f05bd86d9cb99d374dbe23d47a1d18d0698e1601283
  • MD5: 11b869229e553c8ca1801420651dbb35
  • BLAKE2b-256: 29bbaa94b19b34c858e33ba2e29454b973b95697fb970c1e36c1c2cc6938da8f


Provenance

The following attestation bundles were made for lm_saes-2.0.0b25.tar.gz:

Publisher: publish.yml on OpenMOSS/Language-Model-SAEs

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file lm_saes-2.0.0b25-py3-none-any.whl.

File metadata

  • Download URL: lm_saes-2.0.0b25-py3-none-any.whl
  • Upload date:
  • Size: 270.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for lm_saes-2.0.0b25-py3-none-any.whl

  • SHA256: 60129b8afc6de186c9fd9c65ad96626f07dd19f485155fbac39ad4899b39cf3c
  • MD5: fa5c3c2ec39cc34e0a756e0cf2a24014
  • BLAKE2b-256: 8555054715c97dd6ade403a8750d4645a56f17b4dea6bfef66de313d6bc284c4


Provenance

The following attestation bundles were made for lm_saes-2.0.0b25-py3-none-any.whl:

Publisher: publish.yml on OpenMOSS/Language-Model-SAEs

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
