
Core training module for the Open Language Model (OLMo)


OLMo-core

Building blocks for OLMo modeling and training

Examples || Docs || PyPI || Beaker Images || License || Changelog

Installation

First install PyTorch according to the instructions specific to your operating system. Then you can install from PyPI with:

pip install ai2-olmo-core
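
Once installed, you can sanity-check the environment with a short Python snippet. This assumes the package exposes its version as olmo_core.__version__, which is a common convention but not confirmed here; the import name olmo_core matches the source layout described under Development below.

import olmo_core  # top-level package installed by ai2-olmo-core
print(olmo_core.__version__)  # if exposed, prints the installed version, e.g. 1.3.2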

Official training scripts

Official training scripts for various model sizes can be found in src/scripts/train/. To see the exact usage for each script, run the script without any arguments.
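
For illustration only (each script defines its own usage, so run it without arguments for the authoritative help text), a multi-GPU launch of the 1B script with one of the overrides from the table below might look like:

torchrun --nproc-per-node=8 src/scripts/train/OLMo-1B.py --model.float8_config.enabled=true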

Throughput numbers from these scripts with various configuration settings are reported below, measured on a cluster of NVIDIA H100 GPUs.

| Model size | Context length | Precision | Throughput[^1] | Training script | Commandline overrides |
| :--------: | :------------: | :-------: | :------------: | :-------------- | :-------------------- |
| 1B | 4096 | BF16 | 44,000 TPS | OLMo-1B.py | |
| 1B | 256-8192[^2] | BF16 | 49,000 TPS | OLMo-1B.py | --dataset.name=vsl |
| 1B | 4096 | FP8 | 51,000 TPS | OLMo-1B.py | --model.float8_config.enabled=true |
| 7B | 4096 | BF16 | 10,000 TPS | OLMo-7B.py | |
| 7B | 4096 | FP8 | 13,000 TPS | OLMo-7B.py | --model.float8_config.enabled=true |
| 13B | 4096 | BF16 | 4,600 TPS | OLMo-13B.py | |

[^1]: Throughput reported in tokens per second per device.

[^2]: Denotes variable sequence length (VSL) with the Grow-P2 curriculum from "Dataset Decomposition: Faster LLM Training with Variable Sequence Length Curriculum".

Development

After cloning OLMo-core and setting up a Python virtual environment, install the codebase from source with:

pip install -e .[all]
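
Note that in some shells (e.g. zsh) the extras specifier must be quoted to avoid glob expansion:

pip install -e '.[all]'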

The Python library source code is located in src/olmo_core. The corresponding tests are located in src/test. The library docs are located in docs. You can build the docs locally with make docs.

Code checks:

  • We use pytest to run tests. You can run all tests with pytest -v src/test. You can also point pytest at a specific test file to run it individually.
  • We use isort and black for code formatting. Ideally you should integrate these into your editor, but you can also run them manually or configure them with a pre-commit hook. To validate that all files are formatted correctly, run make style-check.
  • We use ruff as our primary linter. You can run it with make lint-check.
  • We use mypy as our type checker. You can run it with make type-check. A combined example of all of these checks follows this list.
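
Taken together, a quick local pass over all of these checks (using only the make targets and pytest command named above) could be:

make style-check && make lint-check && make type-check && pytest -v src/test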

Citing

@article{OLMo,
  title={OLMo: Accelerating the Science of Language Models},
  author={Dirk Groeneveld and Iz Beltagy and Pete Walsh and Akshita Bhagia and Rodney Kinney and Oyvind Tafjord and A. Jha and Hamish Ivison and Ian Magnusson and Yizhong Wang and Shane Arora and David Atkinson and Russell Authur and Khyathi Raghavi Chandu and Arman Cohan and Jennifer Dumas and Yanai Elazar and Yuling Gu and Jack Hessel and Tushar Khot and William Merrill and Jacob Daniel Morrison and Niklas Muennighoff and Aakanksha Naik and Crystal Nam and Matthew E. Peters and Valentina Pyatkin and Abhilasha Ravichander and Dustin Schwenk and Saurabh Shah and Will Smith and Emma Strubell and Nishant Subramani and Mitchell Wortsman and Pradeep Dasigi and Nathan Lambert and Kyle Richardson and Luke Zettlemoyer and Jesse Dodge and Kyle Lo and Luca Soldaini and Noah A. Smith and Hanna Hajishirzi},
  year={2024},
  url={https://api.semanticscholar.org/CorpusID:267365485},
  journal={arXiv preprint},
}

