
Grokfast

Project description

Grokfast - Pytorch (wip)

Explorations into "Grokfast: Accelerated Grokking by Amplifying Slow Gradients", out of Seoul National University in Korea. In particular, it will be compared with NAdam on modular addition as well as a few other tasks, since I am curious why those experiments were left out of the paper. If it holds up, it will be polished into a nice package for quick use.

The official repository can be found here
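
For context, the core mechanism of the paper is a low-pass filter over the gradients: each parameter keeps an exponential moving average of its past gradients (the "slow" component), which is scaled by an amplification factor and added back to the raw gradient before the optimizer step. The sketch below is a minimal standalone version of that filter for illustration only; it is not this package's internal code, and the function name gradfilter_ema and the hyperparameters alpha and lamb are assumptions that mirror the conventions of the official repository.

import torch
from torch import nn

def gradfilter_ema(model, grads = None, alpha = 0.98, lamb = 2.0):
    # initialize the per-parameter EMA with the current gradients
    if grads is None:
        grads = {n: p.grad.detach().clone() for n, p in model.named_parameters() if p.grad is not None}

    for n, p in model.named_parameters():
        if p.grad is None:
            continue

        # update the slow (low-frequency) component of the gradient
        grads[n] = grads[n] * alpha + p.grad.detach() * (1. - alpha)

        # amplify the slow component and add it back to the raw gradient in-place
        p.grad.add_(grads[n], alpha = lamb)

    return grads

Called between loss.backward() and the optimizer step, this biases every update toward the slowly varying direction of the gradient, which is what the paper credits with accelerating grokking.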

Install

$ pip install grokfast-pytorch

Usage

import torch
from torch import nn

# toy model

model = nn.Linear(10, 1)

# import GrokFastAdamW and instantiate with parameters

from grokfast_pytorch import GrokFastAdamW

opt = GrokFastAdamW(
    model.parameters(),
    lr = 1e-4,
    weight_decay = 1e-2
)

# forward and backward passes with a toy scalar loss

loss = model(torch.randn(10)).sum()
loss.backward()

# optimizer step

opt.step()
opt.zero_grad()
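
Putting the pieces together, a complete training loop might look like the following. The data, targets, and mean squared error loss are placeholders for illustration; only the GrokFastAdamW constructor arguments shown above are assumed.

import torch
from torch import nn
from grokfast_pytorch import GrokFastAdamW

model = nn.Linear(10, 1)

opt = GrokFastAdamW(
    model.parameters(),
    lr = 1e-4,
    weight_decay = 1e-2
)

# toy regression data

data = torch.randn(64, 10)
targets = torch.randn(64, 1)

for _ in range(100):
    # forward pass and loss
    loss = nn.functional.mse_loss(model(data), targets)

    # backward pass and optimizer step
    loss.backward()
    opt.step()
    opt.zero_grad()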

Todo

  • run all experiments on small transformer
    • modular addition
    • pathfinder-x
    • run against nadam and some other optimizers
    • see if exp_avg could be repurposed for amplifying slow grads
  • add the foreach version only if above experiments turn out well

Citations

@inproceedings{Lee2024GrokfastAG,
    title   = {Grokfast: Accelerated Grokking by Amplifying Slow Gradients},
    author  = {Jaerin Lee and Bong Gyun Kang and Kihoon Kim and Kyoung Mu Lee},
    year    = {2024},
    url     = {https://api.semanticscholar.org/CorpusID:270123846}
}

@misc{kumar2024maintaining,
    title   = {Maintaining Plasticity in Continual Learning via Regenerative Regularization},
    author  = {Saurabh Kumar and Henrik Marklund and Benjamin Van Roy},
    year    = {2024},
    url     = {https://openreview.net/forum?id=lyoOWX0e0O}
}

Download files

Download the file for your platform.

Source Distribution

grokfast_pytorch-0.0.7.tar.gz (146.0 kB)

Uploaded Source

Built Distribution

grokfast_pytorch-0.0.7-py3-none-any.whl (5.6 kB)

Uploaded Python 3

File details

Details for the file grokfast_pytorch-0.0.7.tar.gz.

File metadata

  • Download URL: grokfast_pytorch-0.0.7.tar.gz
  • Upload date:
  • Size: 146.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.19

File hashes

Hashes for grokfast_pytorch-0.0.7.tar.gz

  • SHA256: ce8715ba4373da6ac516144f625c9577d84ef72853db7cde28251ef047c937d2
  • MD5: a7860d7bd5196c539655a08d64deada3
  • BLAKE2b-256: d032f229bd18c5f6a19e6b8259aafe64ea5f3bceeee3c75af39e8cd482a056de


File details

Details for the file grokfast_pytorch-0.0.7-py3-none-any.whl.

File metadata

File hashes

Hashes for grokfast_pytorch-0.0.7-py3-none-any.whl

  • SHA256: f710f886f21dddbfa376d270ddf97fcb4ae37727b12c2feb8ab0d48c1def3bbe
  • MD5: aee5ec1a7aa643ba2a9c96c646060f5d
  • BLAKE2b-256: 4fcb9324f6c665138ee81ef593ab651acabbb40a68f291cb1b0fa8056f7bd28b

