FROOG: Fast Real-time Optimization Of Gradients
a beautifully compact machine-learning library
homepage | documentation | examples | pip
Installation
pip install froog
Overview of Features
- Tensors
- Automatic Differentiation
- Forward and backward passes
- Input/gradient shape-tracking
- MNIST example
- 2D Convolutions (im2col)
- Numerical gradient checking
- The most common optimizers (SGD, Adam, RMSProp)
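One of the features above, numerical gradient checking, is easy to show in isolation. The sketch below is illustrative only (it is not froog's actual code): it compares an analytic derivative against a central-difference estimate, which is the standard way to verify a hand-written backward pass.

```python
# Illustrative sketch of numerical gradient checking (not froog's internals).
# A central difference approximates df/dx; if the analytic gradient is right,
# the relative error should be near machine precision.

def numerical_grad(f, x, eps=1e-5):
    """Central-difference approximation of df/dx at x."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

def f(x):
    return x * x * x      # f(x) = x^3

def analytic_grad(x):
    return 3 * x * x      # f'(x) = 3x^2

x = 2.0
num = numerical_grad(f, x)
ana = analytic_grad(x)
rel_err = abs(num - ana) / max(abs(num), abs(ana))
print(rel_err < 1e-6)
```

the same check generalizes to tensors by perturbing one element at a time and comparing against the gradient the backward pass produced.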
Math Operations
- Scalar-Matrix Multiplication
- Dot Product
- Sum
- ReLU
- Log Softmax
- 2D Convolutions
- Avg & Max pooling
- More
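The im2col trick behind the 2D convolutions above is worth a sketch. This is a minimal standalone illustration, not froog's implementation: each sliding window of the input is unrolled into a column, so a convolution collapses into one matrix multiply.

```python
import numpy as np

# Illustrative im2col sketch (not froog's internals): unroll sliding windows
# into columns so a valid 2D convolution becomes a single matrix multiply.

def im2col(x, kh, kw):
    """x: (H, W) input; returns a (kh*kw, out_h*out_w) patch matrix."""
    H, W = x.shape
    out_h, out_w = H - kh + 1, W - kw + 1
    cols = np.empty((kh * kw, out_h * out_w))
    for i in range(out_h):
        for j in range(out_w):
            cols[:, i * out_w + j] = x[i:i + kh, j:j + kw].ravel()
    return cols

def conv2d(x, k):
    """Valid 2D convolution (cross-correlation) via im2col."""
    kh, kw = k.shape
    out_h, out_w = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    return (k.ravel() @ im2col(x, kh, kw)).reshape(out_h, out_w)

x = np.arange(16, dtype=float).reshape(4, 4)
k = np.ones((2, 2))
print(conv2d(x, k))  # each output element is the sum of a 2x2 window
```

the payoff is speed: the inner loop becomes one dense matmul, which BLAS executes far faster than a naive four-loop convolution.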
Bounties
Want to help but don't know where to start?
Our top bounty is getting the EfficientNet v2 model working in the examples folder.
Easy
- built-in MLP model
- binary cross entropy
- flatten
- batch_norm
- pad
- swish
- dropout
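For anyone picking up the swish bounty above, here is a hedged starting point. It is a plain-Python illustration of the math only, not froog's Tensor API: swish(x) = x * sigmoid(x), with the derivative you would need for the backward pass.

```python
import math

# Standalone sketch of the swish activation from the easy bounties
# (not froog code): swish(x) = x * sigmoid(x).

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def swish(x):
    return x * sigmoid(x)

def swish_grad(x):
    # d/dx [x * sigmoid(x)] = s + x * s * (1 - s), where s = sigmoid(x)
    s = sigmoid(x)
    return s + x * s * (1.0 - s)

print(swish(0.0))  # 0.0, since sigmoid(0) = 0.5 and 0 * 0.5 = 0
```

wiring it into the library means wrapping forward and backward in whatever function/context mechanism froog's other ops use.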
Medium
- simplify how context and gradients are handled
Hard
- EfficientNet
- Transformers
- Stable Diffusion
- Winograd convolutions
- MPS support
- CUDA support
Contributing
here are some basic guidelines for contributing:
- reduce complexity (currently at 585 lines of code)
- increase speed
- add features (must include tests)
- in that order
more info on contributing