
A PyTorch Lightning extension that enhances model experimentation with flexible fine-tuning schedules.

Project description






[Animation: FinetuningScheduler explicit loss]

FinetuningScheduler is simple to use yet powerful, offering a number of features that facilitate model research and exploration:

  • easy specification of flexible fine-tuning schedules with explicit or regex-based parameter selection
    • implicit schedules for initial/naive model exploration
    • explicit schedules for performance tuning, fine-grained behavioral experimentation and computational efficiency
  • automatic restoration of best per-phase checkpoints driven by iterative application of early-stopping criteria to each fine-tuning phase
  • composition of early-stopping and manually-set epoch-driven fine-tuning phase transitions
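
For a concrete sense of what a schedule looks like: explicit schedules are typically defined as YAML files mapping phase indices to lists of parameter-name patterns. The module names, epoch, and learning rate below are purely hypothetical placeholders; consult the FTS documentation for the authoritative schema:

```yaml
# Hypothetical three-phase schedule: phase 0 thaws the classifier head first,
# and later phases progressively unfreeze earlier layers.
0:
  params:
    - model.classifier.bias
    - model.classifier.weight
1:
  params:
    - model.pooler.dense.*   # regex-style pattern selecting a parameter group
  max_transition_epoch: 9    # optional epoch-driven phase transition
2:
  params:
    - model.encoder.*
  lr: 1.0e-05                # optional phase-specific learning rate
```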

Setup

Step 0: Install from PyPI

Starting with version 2.10, uv is the preferred installation approach for Fine-Tuning Scheduler.

# Install uv if you haven't already (one-time setup)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install Fine-Tuning Scheduler
uv pip install finetuning-scheduler

Step 1: Import the FinetuningScheduler callback and start fine-tuning!

import lightning as L
from finetuning_scheduler import FinetuningScheduler

trainer = L.Trainer(callbacks=[FinetuningScheduler()])

Get started by following the Fine-Tuning Scheduler introduction, which includes a CLI-based example, or by following the notebook-based Fine-Tuning Scheduler tutorial.


Installation Using the Standalone pytorch-lightning Package

applicable to versions >= 2.0.0

Now that the core Lightning package is lightning rather than pytorch-lightning, Fine-Tuning Scheduler (FTS) by default depends upon the lightning package rather than the standalone pytorch-lightning. If you would like to continue to use FTS with the standalone pytorch-lightning package instead, you can still do so as follows:

Install a given FTS release (for example v2.0.0) using standalone pytorch-lightning:

# PACKAGE_NAME=pytorch directs the sdist build to depend on the standalone
# pytorch-lightning package rather than the unified lightning package
export FTS_VERSION=2.0.0
export PACKAGE_NAME=pytorch
wget https://github.com/speediedan/finetuning-scheduler/releases/download/v${FTS_VERSION}/finetuning-scheduler-${FTS_VERSION}.tar.gz
uv pip install finetuning-scheduler-${FTS_VERSION}.tar.gz

Dynamic Versioning

As of version 2.6.0, FTS enables dynamic versioning both at installation time and via a CLI after installation. Initially, the dynamic versioning system allows toggling between Lightning unified and standalone imports. The two conversion operations are individually idempotent and mutually reversible.

Toggling Between Unified and Standalone Lightning Imports

FTS provides a simple CLI tool to easily toggle between unified and standalone import installation versions post-installation:

# Toggle from unified to standalone Lightning imports
toggle-lightning-mode --mode standalone

# Toggle from standalone to unified Lightning imports (default)
toggle-lightning-mode --mode unified

Note: If you have the standalone package (pytorch-lightning) installed but not the unified package (lightning), toggling to unified mode will be prevented. You must install the lightning package first before toggling.

This can be useful when:

  • You need to adapt existing code to work with a different Lightning package
  • You're switching between projects using different Lightning import styles
  • You want to test compatibility with both import styles
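
As a rough mental model only (this is not the actual toggle-lightning-mode implementation), the toggle amounts to a mechanical, reversible rewrite of import references between the two package styles:

```python
import re

# Toy sketch of the unified <-> standalone rewrite the CLI performs conceptually.
# Illustrative only; the real tool operates on the installed FTS package itself.

def to_standalone(source: str) -> str:
    """Rewrite unified imports (lightning.pytorch) to standalone (pytorch_lightning)."""
    return re.sub(r"\blightning\.pytorch\b", "pytorch_lightning", source)

def to_unified(source: str) -> str:
    """Rewrite standalone imports (pytorch_lightning) back to unified (lightning.pytorch)."""
    return re.sub(r"\bpytorch_lightning\b", "lightning.pytorch", source)

unified = "from lightning.pytorch import Trainer"
standalone = to_standalone(unified)
print(standalone)                         # from pytorch_lightning import Trainer
print(to_unified(standalone) == unified)  # True: the two rewrites are mutually reversible
```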

Examples

Scheduled Fine-Tuning For SuperGLUE


Continuous Integration

Fine-Tuning Scheduler is rigorously tested across multiple CPUs and GPUs, and against major Python and PyTorch versions.

Versioning Policy (Updated in 2.9): Starting with the 2.9 minor release, Fine-Tuning Scheduler is pivoting from tight Lightning version alignment to core PyTorch version alignment. This change:

  • Provides greater flexibility to integrate the latest PyTorch functionality, which is increasingly important in research
  • Reduces maintenance burden while continuing to support the stable Lightning API and robust integration
  • Officially supports at least the latest 4 PyTorch minor releases (e.g., when PyTorch 2.9 is released, FTS supports >= 2.6)
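
In other words, the support window is the four most recent PyTorch minor series. The arithmetic can be sketched as follows (an illustrative helper only, not part of the FTS API, and it ignores major-version boundaries for simplicity):

```python
# Illustrative only: given the latest PyTorch minor release, compute the oldest
# minor release in the officially supported 4-release window.
def min_supported_torch(latest: str, window: int = 4) -> str:
    major, minor = (int(x) for x in latest.split("."))
    return f"{major}.{minor - (window - 1)}"

print(min_supported_torch("2.9"))  # -> 2.6, matching the example above
```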

This versioning approach is motivated by Lightning's evolving release cadence (see Lightning Issue #21073 and PR #21107) and allows FTS to adopt new PyTorch capabilities more rapidly while maintaining clear deprecation policies.

See the versioning documentation for complete details on compatibility policies and migration guidance.

Prior Versioning (< 2.9): Each Fine-Tuning Scheduler minor release (major.minor.patch) was paired with a Lightning minor release (e.g., Fine-Tuning Scheduler 2.0 depends upon Lightning 2.0). To ensure maximum stability, the latest Lightning patch release fully tested with Fine-Tuning Scheduler was set as a maximum dependency in Fine-Tuning Scheduler's requirements.txt (e.g., <= 1.7.1).

Current build statuses for Fine-Tuning Scheduler

System / (PyTorch/Python ver)   2.6.0/3.10   2.10.0/3.10, 2.10.0/3.13
Linux [GPUs**]                  -            Build Status
Linux (Ubuntu 22.04)            Test         Test
OSX (14)                        Test         Test
Windows (2022)                  Test         Test

  • ** GPU tests run on one RTX 4090 and one RTX 2070

Community

Fine-Tuning Scheduler is developed and maintained by the community in close communication with the Lightning team. Thanks to everyone in the community for their tireless effort building and improving the immensely useful core Lightning project.

PRs welcome! Please see the contributing guidelines (which are essentially the same as Lightning's).


Citing Fine-Tuning Scheduler

Please cite:

@misc{Dan_Dale_2022_6463952,
    author       = {Dan Dale},
    title        = {{Fine-Tuning Scheduler}},
    month        = feb,
    year         = 2022,
    doi          = {10.5281/zenodo.6463952},
    publisher    = {Zenodo},
    url          = {https://zenodo.org/record/6463952}
}

Feel free to star the repo as well if you find it useful or interesting. Thanks 😊!

Project details


Download files


Source Distribution

finetuning_scheduler-2.10.0.tar.gz (128.2 kB)

Uploaded Source

Built Distribution


finetuning_scheduler-2.10.0-py3-none-any.whl (135.9 kB)

Uploaded Python 3

File details

Details for the file finetuning_scheduler-2.10.0.tar.gz.

File metadata

  • Download URL: finetuning_scheduler-2.10.0.tar.gz
  • Upload date:
  • Size: 128.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for finetuning_scheduler-2.10.0.tar.gz
Algorithm Hash digest
SHA256 24e64e064231a8a9bfc6c01fd1d0208ef4b9551f411d6ee8467a74986eea5264
MD5 446eaa90b77deea01a7458c960352ddf
BLAKE2b-256 9841676e419ef1dd45b36187855f8b09c747fdf3993499b144f2f6257af4f673

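To use the published digests, hash the downloaded artifact locally and compare it against the SHA256 value above. A minimal sketch using Python's standard hashlib (the expected digest is the sdist's SHA256 from the table above):

```python
import hashlib

# Compute the SHA256 digest of a file, reading in chunks to bound memory use.
def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

expected = "24e64e064231a8a9bfc6c01fd1d0208ef4b9551f411d6ee8467a74986eea5264"
# After downloading the sdist, verify it matches the published digest:
# assert sha256_of("finetuning_scheduler-2.10.0.tar.gz") == expected
```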

Provenance

The following attestation bundles were made for finetuning_scheduler-2.10.0.tar.gz:

Publisher: release-pypi.yml on speediedan/finetuning-scheduler

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file finetuning_scheduler-2.10.0-py3-none-any.whl.


File hashes

Hashes for finetuning_scheduler-2.10.0-py3-none-any.whl
Algorithm Hash digest
SHA256 83394dce968392491094cea8df84e4b92f0e0f02529ef56d63108144d9351cde
MD5 c55567529733b0f84033b226848bac83
BLAKE2b-256 a3976951f7abb2366673ddf5cbe17047704969915e8dafe0580a0b3cf1905c5a


Provenance

The following attestation bundles were made for finetuning_scheduler-2.10.0-py3-none-any.whl:

Publisher: release-pypi.yml on speediedan/finetuning-scheduler

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
