
GPT training in Lightning

Project description

lightning-GPT

lightning-GPT is a minimal wrapper around Andrej Karpathy's minGPT and nanoGPT in Lightning.

It aims to provide a thin Lightning layer on top of minGPT and nanoGPT while keeping the full breadth of Lightning (DeepSpeed, FSDP, multi-device training, and more) within reach. A minimal usage sketch follows the option list below.

There are currently a few options:

  • MinGPT: the vanilla GPT model from minGPT (set --implementation=mingpt)
  • NanoGPT: the vanilla GPT model from nanoGPT (set --implementation=nanogpt)
  • DeepSpeedMinGPT: the GPT model from minGPT made DeepSpeed-ready (set --strategy=deepspeed)
  • DeepSpeedNanoGPT: the GPT model from nanoGPT made DeepSpeed-ready (set --strategy=deepspeed)
  • FSDPMinGPT: the GPT model from minGPT made FSDP (native)-ready (set --strategy=fsdp_native)
  • FSDPNanoGPT: the GPT model from nanoGPT made FSDP (native)-ready (set --strategy=fsdp_native)

minGPT and nanoGPT are vendored with the repo in the mingpt and nanogpt directories, respectively. Their LICENSE files can be found there.
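
As a rough orientation, the wrapped classes are regular LightningModules and can be driven directly from Python. The sketch below is illustrative only: the lightning_gpt.models import path and the constructor arguments (vocab_size, block_size, model_type) are assumptions about how the vendored minGPT config is exposed, so check train.py for the actual signatures.

import torch
from torch.utils.data import DataLoader, TensorDataset
from lightning.pytorch import Trainer  # or: from pytorch_lightning import Trainer

from lightning_gpt import models  # assumed module layout

# Toy data: random (input, target) token sequences, purely for illustration.
vocab_size, block_size = 65, 128
x = torch.randint(0, vocab_size, (256, block_size))
y = torch.randint(0, vocab_size, (256, block_size))
loader = DataLoader(TensorDataset(x, y), batch_size=64)

# Assumed constructor arguments mirroring minGPT's config names.
model = models.MinGPT(vocab_size=vocab_size, block_size=block_size, model_type="gpt-mini")

trainer = Trainer(max_epochs=1, accelerator="auto", devices=1)
trainer.fit(model, loader)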

Thanks to:

  • @karpathy for the original minGPT and nanoGPT implementations
  • @williamFalcon for the first Lightning port
  • @SeanNaren for the DeepSpeed pieces

Installation

There are two main ways to install this package.

Installation from source is preferred if you need the latest version including not-yet-released changes, want to use the provided benchmarking or training suites, or need to modify the package.

Installation from PyPI is preferred if you just want to use a stable version of the package without any modifications.

Installation from PyPI

To install the package, simply run

pip install lightning-gpt
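
As a quick sanity check, the package should then be importable from Python (the module name lightning_gpt is an assumption matching the distribution name):

import lightning_gpt  # should import without errors once installed
print(lightning_gpt.__file__)  # shows where the package was installed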

Installation from source

Clone the repository together with its submodules:

git clone https://github.com/Lightning-AI/lightning-GPT && cd lightning-GPT
git submodule update --init --recursive

and install with

pip install -e .

After this you can proceed with the following steps.

MinGPT

First, install the dependencies:

pip install -r requirements.txt

then run

python train.py

See

python train.py --help

for the available flags.

NanoGPT

First, install the dependencies:

pip install -r requirements.txt
pip install -r requirements/nanogpt.txt

then run

python train.py

See

python train.py --help

for the available flags.
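
The --help output documents how train.py exposes the model and Trainer configuration. As a purely illustrative sketch of such a script, not the repository's actual train.py, the two flags used throughout this page could be wired up roughly like this (assuming Lightning >= 2.0, where "auto" is the default strategy):

import argparse
from lightning.pytorch import Trainer

parser = argparse.ArgumentParser()
parser.add_argument("--implementation", default="mingpt", choices=["mingpt", "nanogpt"])
parser.add_argument("--strategy", default="auto")  # e.g. "deepspeed", see the sections below
args = parser.parse_args()

# The real train.py builds the selected GPT LightningModule here and calls trainer.fit(...).
trainer = Trainer(strategy=args.strategy, accelerator="auto", devices="auto")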

DeepSpeed

Install the extra dependencies:

pip install -r requirements/deepspeed.txt

and pass the strategy flag to the script:

python train.py --implementation mingpt --strategy deepspeed

or

python train.py --implementation nanogpt --strategy deepspeed
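
The --strategy value is presumably handed to Lightning's standard Trainer strategy argument. Building the Trainer directly, the DeepSpeed case looks roughly like this sketch (the strategy aliases come from Lightning itself, not from this package):

from lightning.pytorch import Trainer

# "deepspeed" defaults to ZeRO stage 2; aliases such as "deepspeed_stage_3" pick other stages.
# precision="16-mixed" is the Lightning >= 2.0 spelling; older releases use precision=16.
trainer = Trainer(strategy="deepspeed", accelerator="gpu", devices=2, precision="16-mixed")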

FSDP native

Pass the strategy flag to the script:

python train.py --implementation mingpt --strategy fsdp_native

or

python train.py --implementation nanogpt --strategy fsdp_native
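
Likewise for FSDP, as a minimal sketch (the strategy names again come from Lightning, not from this package):

from lightning.pytorch import Trainer

# "fsdp_native" matches the flag above and the Lightning 1.9-era strategy registry;
# Lightning >= 2.0 shortens the name to "fsdp".
trainer = Trainer(strategy="fsdp_native", accelerator="gpu", devices=2)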

PyTorch 2.0

To run on dynamo/inductor from the PyTorch 2.0 compiler stack, run

python train.py --compile dynamo

Note that you will need a recent PyTorch nightly (versioned 1.14.x at the time of writing, later released as PyTorch 2.0) for torch.compile to be available.
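
The --compile flag presumably routes the model through torch.compile before training. Stand-alone, that call looks like the following (TorchDynamo captures the graph and TorchInductor, the default backend, generates the kernels):

import torch

model = torch.nn.Linear(8, 8)       # stands in for the GPT module
compiled = torch.compile(model)     # dynamo + inductor (default backend)
out = compiled(torch.randn(4, 8))   # first call triggers compilation; later calls reuse it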

Credits

License

Apache 2.0 license: https://opensource.org/licenses/Apache-2.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lightning_gpt-0.1.1.tar.gz (595.9 kB, Source)

Built Distribution

lightning_gpt-0.1.1-py3-none-any.whl (466.6 kB, Python 3)

File details

Details for the file lightning_gpt-0.1.1.tar.gz.

File metadata

  • Download URL: lightning_gpt-0.1.1.tar.gz
  • Upload date:
  • Size: 595.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.6

File hashes

Hashes for lightning_gpt-0.1.1.tar.gz:

  • SHA256: 514800fdce06a9a01fe29095fd2a60075010eda1ff5e6e4e8a563b8e6a2cd3a1
  • MD5: e7afb174645a5da45512d6badacfb2e5
  • BLAKE2b-256: 35fdeaf9c5d7b3934e4ee504f5cf778ada80c5e1c242b6092ffabf2ba90ce2cc

See more details on using hashes here.
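
For example, the SHA256 digest of a downloaded archive can be checked against the value above with the standard library (the filename is assumed to be in the current directory):

import hashlib

expected = "514800fdce06a9a01fe29095fd2a60075010eda1ff5e6e4e8a563b8e6a2cd3a1"
with open("lightning_gpt-0.1.1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "MISMATCH")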

File details

Details for the file lightning_gpt-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for lightning_gpt-0.1.1-py3-none-any.whl:

  • SHA256: aa450e87f4716145d8a97fbbb838dc5fc380647e728e3acb218d7b3042285487
  • MD5: 454c41460c0598e63513f6b1c87a40da
  • BLAKE2b-256: 5c2a7ea9993d758899cd33d4b5faa73eff933f7034dc2b1ce4b5a7e5e099ca86

See more details on using hashes here.
