
Project description

A PyTorch Lightning extension that enhances model experimentation with flexible finetuning schedules.


Docs | Setup | Examples | Community



[FinetuningScheduler explicit loss animation]

FinetuningScheduler is simple to use yet powerful, offering a number of features that facilitate model research and exploration:

  • easy specification of flexible finetuning schedules with explicit or regex-based parameter selection (see the schedule sketch following this list)
    • implicit schedules for initial/naive model exploration
    • explicit schedules for performance tuning, fine-grained behavioral experimentation and computational efficiency
  • automatic restoration of best per-phase checkpoints driven by iterative application of early-stopping criteria to each finetuning phase
  • composition of early-stopping and manually-set epoch-driven finetuning phase transitions
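
For example, an explicit schedule is an ordered mapping of phases to parameter-name patterns, typically stored as YAML and passed to the callback via its ft_schedule argument. A minimal sketch follows; the parameter paths (model.classifier.*, model.encoder.*) are hypothetical placeholders for your own model's parameter names:

import yaml
from pytorch_lightning import Trainer
from finetuning_scheduler import FinetuningScheduler

# Phase 0 unfreezes the classification head first; later phases progressively
# unfreeze deeper layers. Regex-style patterns select groups of parameters.
ft_schedule = {
    0: {"params": ["model.classifier.bias", "model.classifier.weight"]},
    1: {"params": ["model.encoder.layer.11.*"]},
    2: {"params": ["model.encoder.layer.10.*"]},
}
with open("explicit_schedule.yaml", "w") as f:
    yaml.dump(ft_schedule, f)

trainer = Trainer(callbacks=[FinetuningScheduler(ft_schedule="explicit_schedule.yaml")])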

Setup

Step 0: Install from PyPI

pip install finetuning-scheduler

Step 1: Import the FinetuningScheduler callback and start finetuning!

from pytorch_lightning import Trainer
from finetuning_scheduler import FinetuningScheduler

trainer = Trainer(callbacks=[FinetuningScheduler()])

Get started by following the Finetuning Scheduler introduction, which includes a CLI-based example, or by following the notebook-based Finetuning Scheduler tutorial.
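
The scheduler can also be composed with its early-stopping and checkpointing counterparts to control how phase transitions are triggered and which per-phase checkpoints are restored. A minimal sketch, assuming FTSEarlyStopping/FTSCheckpoint accept arguments analogous to their upstream Lightning equivalents; my_model and my_datamodule are placeholders for your own LightningModule/LightningDataModule:

from pytorch_lightning import Trainer
from finetuning_scheduler import FinetuningScheduler, FTSCheckpoint, FTSEarlyStopping

callbacks = [
    FinetuningScheduler(),                             # implicit schedule by default
    FTSEarlyStopping(monitor="val_loss", patience=2),  # early-stopping-driven phase transitions
    FTSCheckpoint(monitor="val_loss", save_top_k=1),   # best per-phase checkpoint tracking
]
trainer = Trainer(callbacks=callbacks)
trainer.fit(my_model, datamodule=my_datamodule)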


Examples

Scheduled Finetuning For SuperGLUE


Continuous Integration

Finetuning Scheduler is rigorously tested across multiple CPUs and GPUs and against major Python and PyTorch versions. Each Finetuning Scheduler minor release (the major.minor component of the major.minor.patch scheme) is paired with a PyTorch Lightning minor release (e.g. Finetuning Scheduler 0.1 depends upon PyTorch Lightning 1.6).

To ensure maximum stability, the latest PyTorch Lightning patch release fully tested with Finetuning Scheduler is set as the maximum dependency in Finetuning Scheduler's requirements.txt (e.g. <= 1.6.1). If you'd like to test a PyTorch Lightning patch version newer than the one currently pinned in requirements.txt, it will likely work, but you should install Finetuning Scheduler from source and update requirements.txt accordingly.

Current build statuses for Finetuning Scheduler

| System / PyTorch ver | 1.8 (LTS, min. req.) | 1.11 (latest) |
| -------------------- | -------------------- | ------------- |
| Linux py3.9 [GPUs**] | Build Status         | -             |
| Linux py3.{7,9}      | Test                 | Test          |
| OSX py3.{7,9}        | Test                 | Test          |
| Windows py3.{7,9}    | Test                 | Test          |

  • ** tests run on two RTX 2070s

Community

Finetuning Scheduler is developed and maintained by the community in close communication with the PyTorch Lightning team. Thanks to everyone in the community for their tireless effort building and improving the immensely useful core PyTorch Lightning project.

PRs welcome! Please see the contributing guidelines (which are essentially the same as PyTorch Lightning's).


Citing Finetuning Scheduler

Please cite:

@misc{Dan_Dale_2022_6463952,
    author    = {Dan Dale},
    title     = {{Finetuning Scheduler}},
    month     = feb,
    year      = 2022,
    doi       = {10.5281/zenodo.6463952},
    publisher = {Zenodo},
    url       = {https://zenodo.org/record/6463952}
}

Feel free to star the repo as well if you find it useful or interesting. Thanks 😊!

Download files


Source Distribution

finetuning-scheduler-0.1.4.tar.gz (49.0 kB)


Built Distribution

finetuning_scheduler-0.1.4-py3-none-any.whl (49.5 kB)

