
Core training module for the Open Language Model (OLMo)

Project description

OLMo-core

Building blocks for OLMo modeling and training

Examples || Docs || PyPI || Docker Images || Beaker Images || License || Changelog

Installation

First install PyTorch according to the instructions specific to your operating system and hardware. Then you can install from PyPI with:

pip install ai2-olmo-core

A number of optional dependencies must be installed to use certain functionality, including:

  • flash-attn for flash attention and certain other fused operations.
  • torchao for float8 training.
  • megablocks for mixture-of-experts (MoE) models.
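As an illustrative sketch only (check each project's own install docs for the authoritative steps), the optional dependencies listed above can be installed one by one. Note that flash-attn builds against an existing PyTorch installation, which is why its install instructions recommend `--no-build-isolation`:

```shell
# Hypothetical one-by-one installation of the optional dependencies.
pip install flash-attn --no-build-isolation  # flash attention + fused ops
pip install torchao                          # float8 training
pip install megablocks                       # mixture-of-experts (MoE) models
```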

The published Docker images contain all core and optional dependencies, and are regularly tested on our in-house H100 clusters. But there are several things to keep in mind if you intend to use these images:

  • They do not come with the OLMo-core package installed, only its dependencies, to accommodate regular code changes.
  • They may not work on your own cluster if you have different hardware or driver/CUDA versions.

If the published images do not work for your use case for any of the above reasons, you can adapt our Dockerfile to build your own images.

API stability

Even though this library is under rapid development, we try hard to adhere to Semantic Versioning with every release, except for features explicitly marked as beta. Those features are tagged like this in the API docs:

[BETA tag image]

Official training scripts

Official training scripts for various model sizes can be found in src/scripts/train/. To see the exact usage for each script, run the script without any arguments.
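For example, to print the usage for the 1B script (script name taken from the table below; your Python invocation may differ depending on your environment):

```shell
# Running an official training script with no arguments prints its usage.
python src/scripts/train/OLMo2-1B.py
```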

Throughput numbers from these scripts with various configuration settings are reported below, measured on a cluster of NVIDIA H100 GPUs.

| Model size | Model arch. | Context length | Precision | Throughput[^1] | Training script | Commandline overrides |
|-----------|-------------|----------------|-----------|----------------|-----------------|-----------------------|
| 1B | OLMo-1124 | 4096 | BF16 | 55,000 TPS | OLMo2-1B.py | |
| 1B | OLMo-1124 | 4096 | BF16/FP8[^2] | 65,000 TPS | OLMo2-1B.py | --model.float8_config.enabled=true |
| 7B | OLMo-1124 | 4096 | BF16 | 10,000 TPS | OLMo2-7B.py | |
| 7B | OLMo-1124 | 4096 | BF16/FP8 | 13,000 TPS | OLMo2-7B.py | --model.float8_config.enabled=true |
| 8B | Llama | 4096 | BF16 | 9,500 TPS | Llama3-8B.py | |
| 8B | Llama | 4096 | BF16/FP8 | 12,500 TPS | Llama3-8B.py | --model.float8_config.enabled=true |
| 13B | OLMo-1124 | 4096 | BF16 | 4,600 TPS | OLMo2-13B.py | |
| 13B | OLMo-1124 | 4096 | BF16/FP8 | 5,500 TPS | OLMo2-13B.py | --model.float8_config.enabled=true |

[^1]: Throughput reported in tokens per second per device.

[^2]: In this setup most matrix multiplications are computed in float8; everything else is in bfloat16.
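As a back-of-the-envelope sanity check (not an official benchmark), the per-device speedup of BF16/FP8 over pure BF16 follows directly from the TPS numbers in the table:

```python
# Rough FP8-over-BF16 speedup per model size, computed from the
# tokens-per-second-per-device numbers reported in the table above.
bf16_tps = {"1B": 55_000, "7B": 10_000, "8B": 9_500, "13B": 4_600}
fp8_tps = {"1B": 65_000, "7B": 13_000, "8B": 12_500, "13B": 5_500}

for size in bf16_tps:
    speedup = fp8_tps[size] / bf16_tps[size]
    print(f"{size}: {speedup:.2f}x")  # e.g. 7B: 1.30x
```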

Development

After cloning OLMo-core and setting up a Python virtual environment, install the codebase from source with:

pip install -e .[all]

The Python library source code is located in src/olmo_core. The corresponding tests are located in src/test. The library docs are located in docs. You can build the docs locally with make docs.

Code checks:

  • We use pytest to run tests. You can run all tests with pytest -v src/test. You can also point pytest at a specific test file to run it individually.
  • We use isort and black for code formatting. Ideally you should integrate these into your editor, but you can also run them manually or configure them with a pre-commit hook. To validate that all files are formatted correctly, run make style-check.
  • We use ruff as our primary linter. You can run it with make lint-check.
  • We use mypy as our type checker. You can run it with make type-check.
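Put together, the checks above can be run from the repository root as follows (commands as named in the list above; they require the dev install from earlier in this section):

```shell
# Run the full set of local code checks.
pytest -v src/test  # unit tests
make style-check    # isort + black formatting
make lint-check     # ruff linting
make type-check     # mypy type checking
```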

Citing

@article{OLMo2,
  title={2 OLMo 2 Furious},
  author={Team OLMo and Pete Walsh and Luca Soldaini and Dirk Groeneveld and Kyle Lo and Shane Arora and Akshita Bhagia and Yuling Gu and Shengyi Huang and Matt Jordan and Nathan Lambert and Dustin Schwenk and Oyvind Tafjord and Taira Anderson and David Atkinson and Faeze Brahman and Christopher Clark and Pradeep Dasigi and Nouha Dziri and Michal Guerquin and Hamish Ivison and Pang Wei Koh and Jiacheng Liu and Saumya Malik and William Merrill and Lester James Validad Miranda and Jacob Daniel Morrison and Tyler C. Murray and Crystal Nam and Valentina Pyatkin and Aman Rangapur and Michael Schmitz and Sam Skjonsberg and David Wadden and Chris Wilhelm and Michael Wilson and Luke S. Zettlemoyer and Ali Farhadi and Noah A. Smith and Hanna Hajishirzi},
  year={2024},
  url={https://api.semanticscholar.org/CorpusID:275213098},
  journal={arXiv preprint},
}


Download files

Download the file for your platform.

Source Distribution

ai2_olmo_core-1.9.0.tar.gz (184.9 kB)


Built Distribution


ai2_olmo_core-1.9.0-py3-none-any.whl (218.8 kB)


File details

Details for the file ai2_olmo_core-1.9.0.tar.gz.

File metadata

  • Download URL: ai2_olmo_core-1.9.0.tar.gz
  • Size: 184.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.16

File hashes

Hashes for ai2_olmo_core-1.9.0.tar.gz

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | d6a26c4e2b8715b2a91b3d1e41f276bbeafde1453530ef5da7b938eb849bb51e |
| MD5 | a3e2452bb177a4f4086457ff181b9981 |
| BLAKE2b-256 | cb7dd731ff7634afe0117bb92a27e1fd639db2153c40d7978d62100314d1339d |

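To check a downloaded archive against the SHA256 digest above, a minimal stand-alone sketch (the helper name sha256_hex is illustrative and not part of ai2-olmo-core):

```python
import hashlib

def sha256_hex(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA256 so large archives need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Digest of the source distribution, copied from the table above:
EXPECTED = "d6a26c4e2b8715b2a91b3d1e41f276bbeafde1453530ef5da7b938eb849bb51e"
# After downloading the sdist, verify it with:
# assert sha256_hex("ai2_olmo_core-1.9.0.tar.gz") == EXPECTED
```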

File details

Details for the file ai2_olmo_core-1.9.0-py3-none-any.whl.

File metadata

  • Download URL: ai2_olmo_core-1.9.0-py3-none-any.whl
  • Size: 218.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.16

File hashes

Hashes for ai2_olmo_core-1.9.0-py3-none-any.whl

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | f35186df204ca4712559c63740f7d04fdbdbc3e1fe312f9b48e279830e5849ea |
| MD5 | a4993b56bb1505fc2cf7e7bf7b6d0ceb |
| BLAKE2b-256 | f120bcf2754ee84cce5c09ede8e0eca38298210638b2f02f5dd947e95df969ea |

