
🤖🚩 lm-checkpoints

Simple library for dealing with language model checkpoints to study training dynamics.

lm-checkpoints makes it easier to work with the intermediate training checkpoints that are provided for some language models (LMs), such as MultiBERTs and Pythia. The library lets you iterate over training steps, define subsets of checkpoints, automatically clear the cache of previously seen checkpoints, and so on. Nothing fancy: it is simply a wrapper around 🤗 models that should make it easier to study their training dynamics.

Install using pip install lm-checkpoints.

Checkpoints

Currently implemented for the following models on HuggingFace:

  • Pythia (EleutherAI)
  • MultiBERTs (Google)

Example

Say you want to compute some metrics for all model checkpoints of Pythia 160m, but only for seed 0.

from lm_checkpoints import PythiaCheckpoints

for ckpt in PythiaCheckpoints(size=160, seed=[0]):
    # Do something with ckpt.model, ckpt.config or ckpt.tokenizer
    print(ckpt.config)
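
As a concrete example of such a metric, here is a minimal sketch that computes the global L2 norm of the parameters at each checkpoint. It assumes ckpt.model is a standard PyTorch 🤗 model (the metric itself is just an illustration, not part of the library):

import torch
from lm_checkpoints import PythiaCheckpoints

for ckpt in PythiaCheckpoints(size=160, seed=[0]):
    # Example metric: global L2 norm of all parameters at this checkpoint
    with torch.no_grad():
        sq_sum = sum(p.pow(2).sum() for p in ckpt.model.parameters())
    print(ckpt.config, sq_sum.sqrt().item())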

Or if you only want to load steps 0, 1, 2, 4, 8, 16 for all available seeds:

from lm_checkpoints import PythiaCheckpoints

for ckpt in PythiaCheckpoints(size=160, step=[0, 1, 2, 4, 8, 16]):
    # Do something with ckpt.model, ckpt.config or ckpt.tokenizer
    print(ckpt.config)

Alternatively, you may want to load all final checkpoints of MultiBERTs:

from lm_checkpoints import MultiBERTCheckpoints

for ckpt in MultiBERTCheckpoints.final_checkpoints():
    # Do something with ckpt.model, ckpt.config or ckpt.tokenizer
    print(ckpt.config)
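
For instance, here is a hypothetical probe over the final checkpoints of every seed. It assumes ckpt.model is loaded with a masked-LM head (which may not be how the library loads MultiBERTs; adapt as needed):

import torch
from lm_checkpoints import MultiBERTCheckpoints

for ckpt in MultiBERTCheckpoints.final_checkpoints():
    # Hypothetical probe: ask each seed's final model to fill in the mask
    inputs = ckpt.tokenizer("Paris is the [MASK] of France.", return_tensors="pt")
    with torch.no_grad():
        logits = ckpt.model(**inputs).logits
    mask_pos = (inputs.input_ids[0] == ckpt.tokenizer.mask_token_id).nonzero().item()
    print(ckpt.config, ckpt.tokenizer.decode(logits[0, mask_pos].argmax().item()))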

Loading "chunks" of checkpoints for parallel computations

It is possible to split the checkpoints into N "chunks", which is useful, for example, if you want to run computations in parallel:

from lm_checkpoints import PythiaCheckpoints

N = 4  # number of chunks

chunks = []
checkpoints = PythiaCheckpoints(size=160, seed=[0])
for chunk in checkpoints.split(N):
    chunks.append(chunk)
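
A minimal sketch of how this could be used for parallel jobs, assuming each chunk is itself iterable over checkpoints (launch N copies of this hypothetical worker script, each with a different chunk index):

import sys
from lm_checkpoints import PythiaCheckpoints

# Hypothetical worker: pass the chunk index (0..N-1) as the first
# command-line argument, e.g. one index per cluster job.
N = 4
chunk_idx = int(sys.argv[1])
chunk = list(PythiaCheckpoints(size=160, seed=[0]).split(N))[chunk_idx]

for ckpt in chunk:
    print(ckpt.config)  # replace with the actual computation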

Dealing with limited disk space

In case you don't want the checkpoints to fill up your disk space, use clean_cache=True to delete earlier checkpoints while iterating over these models (NB: you will have to re-download them if you run the iteration again!):

from lm_checkpoints import PythiaCheckpoints

for ckpt in PythiaCheckpoints(size=14, clean_cache=True):
    # Do something with ckpt.model or ckpt.tokenizer
    print(ckpt.config)

Evaluating checkpoints using lm-evaluation-harness

If you install lm-checkpoints with the eval option (pip install "lm-checkpoints[eval]"), you can use the evaluate function to run lm-evaluation-harness for all checkpoints:

from lm_checkpoints import evaluate, PythiaCheckpoints

ckpts = PythiaCheckpoints(size=14, step=[0, 1, 2, 4], seed=[0], device="cuda")

evaluate(
    ckpts,
    tasks=["triviaqa", "crows_pairs_english"],
    output_dir="test_results",
    log_samples=True,
    skip_if_exists=True,
#    limit=5, # For testing purposes!
)

Or you can use the evaluate_checkpoints script:

evaluate_checkpoints pythia --output test_results --size 14 --seed 1 --step 0 1 2 --tasks blimp crows_pairs_english --device cuda --skip_if_exists

Both examples will create a subdirectory structure in test_results/ for each model and step. Each subdirectory will contain a results JSON file (e.g., results_crows_pairs_english,triviaqa.json) and, if using the --log_samples option, a JSONL file per task containing the LM responses to the individual test items (e.g., samples_triviaqa.jsonl).
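
As a sketch of how one might collect these results afterwards, assuming the files follow the usual lm-evaluation-harness output schema (a top-level "results" key mapping task names to metric dicts):

import json
from pathlib import Path

# Walk the per-model/per-step subdirectories and print each task's scores.
for results_file in Path("test_results").glob("**/results_*.json"):
    with open(results_file) as f:
        data = json.load(f)
    print(results_file.parent, data.get("results"))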

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lm_checkpoints-0.1.15.tar.gz (18.0 kB)

Built Distribution

lm_checkpoints-0.1.15-py3-none-any.whl (34.5 kB)

File details

Details for the file lm_checkpoints-0.1.15.tar.gz.

File metadata

  • Download URL: lm_checkpoints-0.1.15.tar.gz
  • Upload date:
  • Size: 18.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.2 CPython/3.10.14 Darwin/23.2.0

File hashes

Hashes for lm_checkpoints-0.1.15.tar.gz

  • SHA256: 7a358436f764c865a99277803271c62f834ae85287f888459ef995324d19c94d
  • MD5: 09d90868b5984748c33c2727dae0f883
  • BLAKE2b-256: a8f46b4ef30c0ed16dd446423ddaf4c4dac74ffe4e35a8e559d5a287c3ddbf33

See more details on using hashes here.

File details

Details for the file lm_checkpoints-0.1.15-py3-none-any.whl.

File metadata

  • Download URL: lm_checkpoints-0.1.15-py3-none-any.whl
  • Upload date:
  • Size: 34.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.2 CPython/3.10.14 Darwin/23.2.0

File hashes

Hashes for lm_checkpoints-0.1.15-py3-none-any.whl

  • SHA256: 2056f8e5da826155437869dd31c5560be89dd4f0a0f7107cfcbe75c172c7a2bc
  • MD5: 853b6425b0d01e992600f30da1e28855
  • BLAKE2b-256: c84de33de821b3c414daba200ff8a181782114a85c33601ac93285c58db7ce99

See more details on using hashes here.
