froog: Fast Real-time Optimization of Gradients
Project description
frog: fast real-time optimization of gradients
a beautifully compact machine-learning library
homepage | documentation | examples | pip
Installation
pip install froog
Overview of Features
- Tensors
- Automatic Differentiation
- Forward and backward passes
- Input/gradient shape-tracking
- MNIST example
- 2D Convolutions (im2col)
- Gradient checking
- The most common optimizers (SGD, Adam, RMSProp)
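Gradient checking, listed above, means comparing a backward pass against finite differences. A minimal sketch of the idea in plain NumPy (the function, tolerance, and helper name here are illustrative, not froog's actual API):

```python
import numpy as np

def num_grad(f, x, eps=1e-6):
    # central-difference numerical gradient of a scalar-valued f at x
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d.flat[i] = eps
        g.flat[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

# check a hand-derived gradient of f(x) = sum(relu(x)^2)
f = lambda x: np.sum(np.maximum(x, 0) ** 2)
x = np.array([1.0, -2.0, 3.0])
analytic = 2 * np.maximum(x, 0)  # df/dx_i = 2*relu(x_i) away from 0
assert np.allclose(num_grad(f, x), analytic, atol=1e-4)
```

A library's autograd output can be dropped in wherever `analytic` appears; if the two disagree beyond tolerance, the backward pass has a bug.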
Math Operations
- Scalar-Matrix Multiplication
- Dot Product
- Sum
- ReLU
- Log Softmax
- 2D Convolution
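The im2col trick behind the 2D convolution lowers every sliding window of the input into one row of a matrix, so the whole convolution becomes a single matrix product. A minimal single-channel NumPy sketch (function names are illustrative, not froog's API):

```python
import numpy as np

def im2col(x, kh, kw):
    # x: (H, W) image; one row per kh x kw patch, in row-major scan order
    H, W = x.shape
    oh, ow = H - kh + 1, W - kw + 1
    cols = np.empty((oh * ow, kh * kw))
    for i in range(oh):
        for j in range(ow):
            cols[i * ow + j] = x[i:i + kh, j:j + kw].ravel()
    return cols

def conv2d(x, k):
    # "valid" 2D convolution (cross-correlation) as one matrix-vector product
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    return (im2col(x, kh, kw) @ k.ravel()).reshape(oh, ow)

x = np.arange(16, dtype=float).reshape(4, 4)
k = np.ones((2, 2))          # 2x2 box filter: each output is a patch sum
out = conv2d(x, k)           # shape (3, 3); out[0, 0] = 0 + 1 + 4 + 5 = 10
```

The payoff is that the inner loop becomes a GEMM, which BLAS (or a GPU) executes far faster than an explicit nested loop over windows.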
Bounties
We want a useful model working right out of the box! Our top bounty is getting the EfficientNet v2 model working inside the examples folder.
- EfficientNet v2 (top priority)
Easy
- built-in MLP model
- binary cross entropy
- dropout layer
- flatten
Medium
- simplify how context and gradients are handled
Hard
- Transformers
- Stable Diffusion
- Winograd Convs
- MPS support
- CUDA support
Contributing
Here are some basic guidelines for contributing:
- Reduce code, currently at 585 lines
- Increase speed
- Add features
- In that order
Bug fixes are the best and always welcome. Conceptual cleanups are great. All features must include tests.
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
froog-0.1.4.tar.gz (10.5 kB)
Built Distribution
froog-0.1.4-py3-none-any.whl (10.5 kB)