A PyTorch DataLoader-compatible batch size scheduler library.
Project description
bs-scheduler
A Batch Size Scheduler library compatible with PyTorch DataLoaders.
Documentation
Why use a Batch Size Scheduler?
- Using a large batch size has several advantages:
  - Better hardware utilization.
  - Enhanced parallelism.
  - Faster training.
- However, using a large batch size from the start may lead to a generalization gap.
- Therefore, the solution is to gradually increase the batch size during training, similar to a learning rate decay policy (see the sketch after this list).
- See "Don't Decay the Learning Rate, Increase the Batch Size".
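To make the analogy concrete, here is a small, library-free sketch of what trading learning rate decay for batch size growth looks like with a simple step schedule. The base values, step size, and factor are illustrative assumptions, not defaults of this library.

```python
# Instead of decaying the learning rate by `gamma` every `step_size` epochs,
# grow the batch size by the same factor (all numbers are illustrative).
base_lr, base_batch_size = 0.1, 32
step_size, gamma = 30, 2

for epoch in (0, 29, 30, 60, 90):
    factor = gamma ** (epoch // step_size)
    decayed_lr = base_lr / factor                 # classic step-wise LR decay
    grown_batch_size = base_batch_size * factor   # equivalent batch size growth
    print(f"epoch {epoch:3d}: lr={decayed_lr:.5f}  batch_size={grown_batch_size}")
```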
Available Schedulers
Batch Size Schedulers
- LambdaBS - sets the batch size to the base batch size times a given lambda.
- MultiplicativeBS - sets the batch size to the current batch size times a given lambda.
- StepBS - multiplies the batch size with a given factor at a given number of steps.
- MultiStepBS - multiplies the batch size with a given factor each time a milestone is reached.
- ConstantBS - multiplies the batch size by a given factor once and decreases it again to its base value after a given number of steps.
- LinearBS - increases the batch size by a linearly changing multiplicative factor for a given number of steps.
- ExponentialBS - increases the batch size by a given $\gamma$ each step.
- PolynomialBS - increases the batch size using a polynomial function in a given number of steps.
- CosineAnnealingBS - increases the batch size to a maximum batch size and decreases it again following a cyclic cosine curve.
- IncreaseBSOnPlateau - increases the batch size each time a given metric has stopped improving for a given number of steps.
- CyclicBS - cycles the batch size between two boundaries with a constant frequency, while also scaling the distance between boundaries.
- CosineAnnealingBSWithWarmRestarts - increases the batch size to a maximum batch size following a cosine curve, then restarts while also scaling the number of iterations until the next restart.
- OneCycleBS - decreases the batch size to a minimum batch size then increases it to a given maximum batch size, following a linear or cosine annealing strategy.
- SequentialBS - calls a list of schedulers sequentially given a list of milestone points which reflect which scheduler should be called when.
- ChainedBSScheduler - chains a list of batch size schedulers and calls them together each step.
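The schedulers listed above operate on a PyTorch DataLoader. Below is a minimal usage sketch, assuming the schedulers follow PyTorch's learning rate scheduler conventions (a constructor taking the DataLoader plus schedule parameters, and a `step()` call once per epoch). The `StepBS` arguments shown (`step_size`, `gamma`) are illustrative assumptions; check the documentation for the exact signatures.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

from bs_scheduler import StepBS  # assumed import path; see the documentation

# Toy dataset and a DataLoader with a small starting batch size.
dataset = TensorDataset(torch.randn(1024, 10), torch.randint(0, 2, (1024,)))
dataloader = DataLoader(dataset, batch_size=16)

# Assumed signature, mirroring torch.optim.lr_scheduler.StepLR:
# double the batch size every 30 epochs.
scheduler = StepBS(dataloader, step_size=30, gamma=2)

for epoch in range(100):
    for inputs, targets in dataloader:
        pass  # forward / backward / optimizer step would go here
    scheduler.step()  # assumed: advances the batch size schedule once per epoch
```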
Installation
Please install PyTorch before installing this library.
pip install bs-scheduler
Licensing
The library is licensed under the BSD-3-Clause license.
Citation
To be added...
Download files
Source Distribution
bs_scheduler-0.5.1.tar.gz (26.1 kB)
Built Distribution
bs_scheduler-0.5.1-py3-none-any.whl (17.9 kB)
File details
Details for the file bs_scheduler-0.5.1.tar.gz.
File metadata
- Download URL: bs_scheduler-0.5.1.tar.gz
- Upload date:
- Size: 26.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.9.20
File hashes
Algorithm | Hash digest
---|---
SHA256 | 42c7ac2413108f3e79375effefa740c078caf642ad1fd42cb2170ca89451618f
MD5 | b58261a30550179e77ada0122371b5e0
BLAKE2b-256 | 62387fceae7576ae6f651add3781e040e118ce569c6f3d9442872530d07fcda1
File details
Details for the file bs_scheduler-0.5.1-py3-none-any.whl.
File metadata
- Download URL: bs_scheduler-0.5.1-py3-none-any.whl
- Upload date:
- Size: 17.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.9.20
File hashes
Algorithm | Hash digest
---|---
SHA256 | 709cb74bf04ee7a2928951a5f241b40cd8ef24947e625fa0cfe2fd8d187464fb
MD5 | ea62efec83e4302004779fc49b97e823
BLAKE2b-256 | ae2bdb0056e9786caa9e92d190dd116308bec616cc59d0c2fd029c7dfc2665a1