# pattern-lens
visualization of LLM attention patterns and things computed about them
pattern-lens makes it easy to:
- Generate visualizations of attention patterns, or figures computed from attention patterns, from models supported by TransformerLens
- Compare generated figures across models, layers, and heads in an interactive web interface
## Installation

```sh
pip install pattern-lens
```
## Usage
The pipeline is as follows:

- Generate attention patterns using `pattern_lens.activations.activations_main()`, saving them in `.npz` files
- Generate visualizations using `pattern_lens.figures.figures_main()` -- reads the `.npz` files, passes each attention pattern to each visualization function, and saves the resulting figures
- Serve the web interface using `pattern_lens.server` -- the web interface reads metadata from json/jsonl files, then lets the user select figures to show
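Concretely, each visualization function receives a single head's attention pattern as a square numpy array. As a minimal sketch of what such an array looks like (plain numpy, independent of pattern-lens; the function name here is purely illustrative), a causal attention pattern is lower-triangular with rows summing to 1:

```python
import numpy as np

def random_causal_attention(n_ctx: int, seed: int = 0) -> np.ndarray:
    """Build a random causal attention pattern: lower-triangular, rows sum to 1."""
    rng = np.random.default_rng(seed)
    scores = rng.normal(size=(n_ctx, n_ctx))
    # causal mask: position i may only attend to positions j <= i
    causal = np.tril(np.ones((n_ctx, n_ctx), dtype=bool))
    scores = np.where(causal, scores, -np.inf)
    # softmax over each row
    exp = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)

pattern = random_causal_attention(8)
print(pattern.shape)         # (8, 8)
print(pattern.sum(axis=-1))  # each row sums to ~1.0
```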
### Basic CLI
Generate attention patterns and default visualizations:
```sh
# generate activations
python -m pattern_lens.activations --model gpt2 --prompts data/pile_1k.jsonl --save-path attn_data
# create visualizations
python -m pattern_lens.figures --model gpt2 --save-path attn_data
```
Serve the web UI:

```sh
python -m pattern_lens.server --path attn_data
```
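Since the web UI is static, the server's core job is just serving the save-path over HTTP. A conceptual sketch of that idea (NOT pattern_lens's actual implementation) using only the standard library:

```python
# conceptual sketch only -- not pattern_lens's actual implementation.
# the web UI is static, so serving the save-path directory is the core job.
import functools
from http.server import HTTPServer, SimpleHTTPRequestHandler

handler = functools.partial(SimpleHTTPRequestHandler, directory="attn_data")
# HTTPServer(("localhost", 8000), handler).serve_forever()
```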
### Web UI
View a demo of the web UI at miv.name/pattern-lens/demo.
### Custom Figures
Add custom visualization functions by decorating them with `@register_attn_figure_func`. You should still generate the activations first:

```sh
python -m pattern_lens.activations --model gpt2 --prompts data/pile_1k.jsonl --save-path attn_data
```
and then write+run a script/notebook that looks something like this:
```python
from pathlib import Path

import numpy as np
import matplotlib.pyplot as plt
from scipy.linalg import svd

# these functions simplify writing a function which saves a figure
from pattern_lens.figure_util import matplotlib_figure_saver, save_matrix_wrapper

# decorator to register your function, such that it will be run by `figures_main`
from pattern_lens.attn_figure_funcs import register_attn_figure_func

# runs the actual figure generation pipeline
from pattern_lens.figures import figures_main


# define your own functions
# this one uses `matplotlib_figure_saver` -- define a function that takes a matrix
# and a `plt.Axes`, and modifies the axes
@register_attn_figure_func
@matplotlib_figure_saver(fmt="svgz")
def svd_spectra(attn_matrix: np.ndarray, ax: plt.Axes) -> None:
    # perform SVD
    U, s, Vh = svd(attn_matrix)

    # plot singular values
    ax.plot(s, "o-")
    ax.set_yscale("log")
    ax.set_xlabel("Singular Value Index")
    ax.set_ylabel("Singular Value")
    ax.set_title("Singular Value Spectrum of Attention Matrix")


# run the figures pipeline
figures_main(
    model_name="pythia-14m",
    save_path=Path("docs/demo/"),
    n_samples=5,
    force=False,
)
```
See `demo.ipynb` for a full example.
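To illustrate the kind of quantity a custom figure function might plot, here is a self-contained sketch (plain numpy; the helper name `row_entropy` is hypothetical, not part of pattern-lens) computing per-row attention entropy, i.e. how spread out each query position's attention is:

```python
import numpy as np

def row_entropy(attn_matrix: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Shannon entropy (in nats) of each row of an attention pattern."""
    p = attn_matrix + eps  # avoid log(0) for fully-masked positions
    return -(p * np.log(p)).sum(axis=-1)

# uniform attention over 4 positions has entropy log(4) per row
uniform = np.full((4, 4), 0.25)
print(row_entropy(uniform))  # ≈ [1.386, 1.386, 1.386, 1.386]
```

A function like this could be wrapped with the same decorators as `svd_spectra` above to plot entropy per query position.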