
A PyTorch Lightning extension that enhances model experimentation with flexible fine-tuning schedules.



Docs | Setup | Examples | Community



[FinetuningScheduler explicit loss animation]

FinetuningScheduler is simple to use yet powerful, offering a number of features that facilitate model research and exploration:

  • easy specification of flexible fine-tuning schedules with explicit or regex-based parameter selection
    • implicit schedules for initial/naive model exploration
    • explicit schedules for performance tuning, fine-grained behavioral experimentation, and computational efficiency
  • automatic restoration of best per-phase checkpoints driven by iterative application of early-stopping criteria to each fine-tuning phase
  • composition of early-stopping and manually-set epoch-driven fine-tuning phase transitions
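
The regex-based parameter selection mentioned above can be sketched with plain Python. This is a toy illustration of the idea only, not FTS's actual implementation; the parameter names and the schedule layout are hypothetical:

```python
import re

# Hypothetical parameter names, as they might appear in a transformer model.
param_names = [
    "model.classifier.weight",
    "model.classifier.bias",
    "model.pooler.dense.weight",
    "model.pooler.dense.bias",
    "model.encoder.layer.11.output.dense.weight",
]

# A toy explicit schedule: each phase lists regex patterns for the
# parameters to unfreeze when that phase begins.
schedule = {
    0: [r"model\.classifier\..*"],
    1: [r"model\.pooler\..*"],
    2: [r"model\.encoder\.layer\.11\..*"],
}

def params_for_phase(phase: int) -> list[str]:
    """Return the parameter names newly thawed in the given phase."""
    patterns = [re.compile(p) for p in schedule[phase]]
    return [n for n in param_names if any(p.fullmatch(n) for p in patterns)]

print(params_for_phase(0))  # the classifier weight and bias are thawed first
```

Each later phase thaws deeper layers while earlier phases' parameters stay trainable, which is the progressive-unfreezing pattern FTS schedules express.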

Setup

Step 0: Install from PyPI

pip install finetuning-scheduler

Step 1: Import the FinetuningScheduler callback and start fine-tuning!

import lightning as L
from finetuning_scheduler import FinetuningScheduler

trainer = L.Trainer(callbacks=[FinetuningScheduler()])

Get started by following the Fine-Tuning Scheduler introduction which includes a CLI-based example or by following the notebook-based Fine-Tuning Scheduler tutorial.
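
A bare FinetuningScheduler() derives an implicit schedule; to experiment with an explicit one, you can point the callback's ft_schedule argument at a schedule file. A minimal sketch of writing such a file by hand, assuming the phase/params layout shown in the FTS docs (the layer names are hypothetical):

```python
from pathlib import Path

# Sketch: write an explicit schedule file by hand. Phase keys map to lists
# of parameter names or regex patterns to thaw when that phase begins.
schedule_text = """\
0:
  params:
  - model.classifier.bias
  - model.classifier.weight
1:
  params:
  - model.pooler.dense.*
"""
schedule_path = Path("explicit_schedule.yaml")
schedule_path.write_text(schedule_text)

# Then point the callback at the file:
# trainer = L.Trainer(callbacks=[FinetuningScheduler(ft_schedule=str(schedule_path))])
```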


Installation Using the Standalone pytorch-lightning Package

Applicable to FTS versions >= 2.0.0.

Now that the core Lightning package is lightning rather than pytorch-lightning, Fine-Tuning Scheduler (FTS) by default depends upon the lightning package rather than the standalone pytorch-lightning. If you would like to continue to use FTS with the standalone pytorch-lightning package instead, you can still do so as follows:

Install a given FTS release (for example v2.0.0) using standalone pytorch-lightning:

export FTS_VERSION=2.0.0
export PACKAGE_NAME=pytorch  # build FTS against standalone pytorch-lightning
wget https://github.com/speediedan/finetuning-scheduler/releases/download/v${FTS_VERSION}/finetuning-scheduler-${FTS_VERSION}.tar.gz
pip install finetuning-scheduler-${FTS_VERSION}.tar.gz

Examples

Scheduled Fine-Tuning For SuperGLUE


Continuous Integration

Fine-Tuning Scheduler is rigorously tested across multiple CPUs and GPUs and against major Python and PyTorch versions. Each Fine-Tuning Scheduler minor release (major.minor) is paired with a Lightning minor release (e.g. Fine-Tuning Scheduler 2.0 depends upon Lightning 2.0).

To maximize stability, the latest Lightning patch release fully tested with Fine-Tuning Scheduler is set as the maximum dependency in Fine-Tuning Scheduler's requirements.txt (e.g. <= 1.7.1). A newer Lightning patch release will likely work, but to test one you should install Fine-Tuning Scheduler from source and update requirements.txt as desired.

Current build statuses for Fine-Tuning Scheduler

System / (PyTorch/Python ver)   2.1.2/3.9      2.4.0/3.9, 2.4.0/3.12
Linux [GPUs**]                  -              Build Status
Linux (Ubuntu 22.04)            Test           Test
OSX (11)                        Test           Test
Windows (2022)                  Test           Test

  • ** GPU tests run on one RTX 4090 and one RTX 2070

Community

Fine-Tuning Scheduler is developed and maintained by the community in close communication with the Lightning team. Thanks to everyone in the community for their tireless effort building and improving the immensely useful core Lightning project.

PRs welcome! Please see the contributing guidelines (which are essentially the same as Lightning's).


Citing Fine-Tuning Scheduler

Please cite:

@misc{Dan_Dale_2022_6463952,
    author    = {Dan Dale},
    title     = {{Fine-Tuning Scheduler}},
    month     = feb,
    year      = 2022,
    doi       = {10.5281/zenodo.6463952},
    publisher = {Zenodo},
    url       = {https://zenodo.org/record/6463952},
}

Feel free to star the repo as well if you find it useful or interesting. Thanks 😊!
