
🤖🚩lm-checkpoints

Simple library for dealing with language model checkpoints to study training dynamics.

lm-checkpoints should make it easier to work with the intermediate training checkpoints that are provided for some language models (LMs), like MultiBERTs and Pythia. The library lets you iterate over training steps, select subsets of checkpoints, automatically clear the cache of previously seen checkpoints, and more. Nothing fancy, simply a wrapper for 🤗 models that should make it easier to study their training dynamics.

Install using pip install lm-checkpoints.

Checkpoints

Currently implemented for the following models on HuggingFace:

  • Pythia (EleutherAI), via PythiaCheckpoints
  • MultiBERTs (Google), via MultiBERTCheckpoints

Example

Say you want to compute some metrics for all model checkpoints of Pythia 160m, but only seed 0.

from lm_checkpoints import PythiaCheckpoints

for ckpt in PythiaCheckpoints(size=160, seed=[0]):
    # Do something with ckpt.model, ckpt.config or ckpt.tokenizer
    print(ckpt.config)
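
For instance, here is a minimal sketch of such a per-checkpoint metric: the mean cross-entropy loss of each checkpoint on a single test sentence. It assumes the standard 🤗 causal-LM interface (passing labels=input_ids returns the mean loss), which the Pythia models expose; the sentence is just a placeholder.

import torch
from lm_checkpoints import PythiaCheckpoints

sentence = "The quick brown fox jumps over the lazy dog."

for ckpt in PythiaCheckpoints(size=160, seed=[0]):
    inputs = ckpt.tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        # Standard HF causal-LM call: passing labels yields the mean
        # cross-entropy over the sequence.
        loss = ckpt.model(**inputs, labels=inputs["input_ids"]).loss
    print(ckpt.config, loss.item())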

Or if you only want to load steps 0, 1, 2, 4, 8, 16 for all available seeds:

from lm_checkpoints import PythiaCheckpoints

for ckpt in PythiaCheckpoints(size=160, step=[0, 1, 2, 4, 8, 16]):
    # Do something with ckpt.model, ckpt.config or ckpt.tokenizer
    print(ckpt.config)

Alternatively, you may want to load all final checkpoints of MultiBERTs:

from lm_checkpoints import MultiBERTCheckpoints

for ckpt in MultiBERTCheckpoints.final_checkpoints():
    # Do something with ckpt.model, ckpt.config or ckpt.tokenizer
    print(ckpt.config)

Loading "chunks" of checkpoints for parallel computations

It is possible to split the checkpoints into N "chunks", which is useful if you want to run computations in parallel:

from lm_checkpoints import PythiaCheckpoints

N = 4  # number of chunks

chunks = []
checkpoints = PythiaCheckpoints(size=160, seed=[0])
for chunk in checkpoints.split(N):
    chunks.append(chunk)
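
In a job array, for example, each worker might process one chunk. A minimal sketch, assuming the worker index is passed on the command line (the index handling is hypothetical, not part of the library):

import sys
from lm_checkpoints import PythiaCheckpoints

N = 4  # total number of parallel workers (hypothetical)
worker = int(sys.argv[1])  # e.g., a SLURM array index

checkpoints = PythiaCheckpoints(size=160, seed=[0])
chunks = list(checkpoints.split(N))
for ckpt in chunks[worker]:
    print(ckpt.config)  # process this worker's share of the checkpoints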

Dealing with limited disk space

If you don't want the checkpoints to fill up your disk space, use clean_cache=True to delete earlier checkpoints while iterating over the models (NB: you will have to re-download them if you run the iteration again!):

from lm_checkpoints import PythiaCheckpoints

for ckpt in PythiaCheckpoints(size=14, clean_cache=True):
    # Do something with ckpt.model or ckpt.tokenizer
    print(ckpt.config)

Evaluating checkpoints using lm-evaluation-harness

If you install lm-checkpoints with the eval option (pip install lm-checkpoints[eval]), you can use the evaluate function to run lm-evaluation-harness for all checkpoints:

from lm_checkpoints import evaluate, PythiaCheckpoints

ckpts = PythiaCheckpoints(size=14, step=[0, 1, 2, 4], seed=[0], device="cuda")

evaluate(
    ckpts,
    tasks=["triviaqa", "crows_pairs_english"],
    output_dir="test_results",
    log_samples=True,
    skip_if_exists=True,
    # limit=5,  # For testing purposes!
)

Or you can use the evaluate_checkpoints script:

evaluate_checkpoints pythia --output test_results --size 14 --seed 1 --step 0 1 2 --tasks blimp crows_pairs_english --device cuda --skip_if_exists

Both examples will create a subdirectory structure in test_results/ for each model and step. Each subdirectory contains a results JSON file (e.g., results_crows_pairs_english,triviaqa.json) and, if log_samples is enabled, a JSONL file with the LM responses to the individual test items for each task (e.g., samples_triviaqa.jsonl).
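
To aggregate results afterwards you can walk that directory structure. A minimal sketch, assuming only that each results file is valid JSON (the exact schema is determined by lm-evaluation-harness; here we only list the top-level keys):

import json
from pathlib import Path

output_dir = Path("test_results")

# One results_*.json per model/step subdirectory.
for results_file in sorted(output_dir.glob("**/results_*.json")):
    with results_file.open() as f:
        results = json.load(f)
    print(results_file, list(results.keys()))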
