
Grokfast - PyTorch (wip)

Explorations into "Grokfast: Accelerated Grokking by Amplifying Slow Gradients", out of Seoul National University. In particular, this repository will compare it against NAdam on modular addition as well as a few other tasks, since I am curious why those experiments were left out of the paper. If the technique holds up, it will be polished into a nice package for quick use.
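The core idea of the paper is to treat the gradient sequence as a signal, low-pass filter it with an exponential moving average, and add the amplified slow component back into each parameter's gradient before the optimizer step. Below is a minimal sketch of that EMA filter as a standalone function; the function name and hyperparameter names (`alpha`, `lamb`) follow the paper's convention but are not this package's API, which folds the filter into the optimizer itself.

```python
import torch

def gradfilter_ema(model, grads=None, alpha=0.98, lamb=2.0):
    # grads holds a per-parameter EMA of past gradients (the "slow" component).
    # On the first call it is seeded with the current gradients.
    if grads is None:
        grads = {
            n: p.grad.detach().clone()
            for n, p in model.named_parameters()
            if p.grad is not None
        }
    for n, p in model.named_parameters():
        if p.grad is None:
            continue
        # update the slow-gradient EMA
        grads[n] = grads[n] * alpha + p.grad.detach() * (1 - alpha)
        # amplify the slow component: g <- g + lamb * ema(g)
        p.grad = p.grad + grads[n] * lamb
    return grads
```

The returned dict is the filter state and would be passed back in on the next step; calling this between `loss.backward()` and `opt.step()` applies the Grokfast filtering to any base optimizer.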

Install

$ pip install grokfast-pytorch

Usage

import torch
from torch import nn

# toy model

model = nn.Linear(10, 1)

# import GrokFastAdamW and instantiate with parameters

from grokfast_pytorch import GrokFastAdamW

opt = GrokFastAdamW(
    model.parameters(),
    lr = 1e-4,
    weight_decay = 0.1
)

# forward and backward

loss = model(torch.randn(10))
loss.backward()

# optimizer step

opt.step()
opt.zero_grad()
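For a fuller picture, here is the same pattern inside a small training loop with an actual loss function. `torch.optim.AdamW` stands in for `GrokFastAdamW` so the sketch runs without this package installed; the constructor arguments shown above are the same.

```python
import torch
from torch import nn

# toy model and stand-in optimizer (swap in GrokFastAdamW once installed;
# it takes the same lr / weight_decay arguments shown above)
model = nn.Linear(10, 1)
opt = torch.optim.AdamW(model.parameters(), lr = 1e-4, weight_decay = 0.1)

# toy regression data
data = torch.randn(64, 10)
target = torch.randn(64, 1)

for _ in range(10):
    loss = nn.functional.mse_loss(model(data), target)
    loss.backward()
    opt.step()
    opt.zero_grad()
```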

Todo

  • run all experiments on small transformer
    • modular addition
    • pathfinder-x
• run against NAdam and some other optimizers
    • see if exp_avg could be repurposed for amplifying slow grads
  • add the foreach version only if above experiments turn out well

Citations

@inproceedings{Lee2024GrokfastAG,
    title   = {Grokfast: Accelerated Grokking by Amplifying Slow Gradients},
    author  = {Jaerin Lee and Bong Gyun Kang and Kihoon Kim and Kyoung Mu Lee},
    year    = {2024},
    url     = {https://api.semanticscholar.org/CorpusID:270123846}
}
