FROG: Fast Real-time Optimization of Gradients
froog
a beautifully compact machine-learning library
modern ml development is unintuitive, time-consuming, and inaccessible. why not make it possible for anyone to build?
Overview of Features
- Tensors
- Automatic Differentiation
- Forward and backward passes
- Input/gradient shape-tracking
- MNIST example
- 2D Convolutions (im2col)
- Gradient checking
- The most common optimizers (SGD, Adam, RMSProp)
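The heart of the feature list is reverse-mode automatic differentiation: a forward pass builds a graph of operations, and a backward pass walks it in reverse to fill in gradients. A minimal self-contained sketch of that idea on scalars (this is an illustration of the concept, not froog's actual API):

```python
# Minimal reverse-mode autodiff on scalars. Each op records its parents
# and a closure that propagates out.grad back to them.
class Value:
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def relu(self):
        out = Value(max(0.0, self.data), (self,))
        def _backward():
            # gradient passes through only where the input was positive
            self.grad += (out.data > 0) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological order: a node's grad is complete before it is
        # propagated to its parents
        topo, seen = [], set()
        def build(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = Value(-2.0)
z = (x * y + x).relu()   # relu(3*-2 + 3) = relu(-3) = 0
z.backward()             # x.grad stays 0: relu is dead here
```

The same graph-plus-closures structure scales from scalars to tensors; froog's feature list (forward/backward passes, shape-tracking, gradient checking) layers on top of this core loop.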
Math Operations
- Scalar-Matrix Multiplication
- Dot Product
- Sum
- ReLU
- Log Softmax
- 2D Convolution
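The 2D convolution above is listed as im2col-based: unroll every receptive field of the input into a row of a matrix, so the convolution reduces to a single matrix multiply. A minimal NumPy sketch of the trick (function names here are illustrative, not froog's actual API):

```python
import numpy as np

def im2col(x, kh, kw):
    """x: (H, W) input; returns an (out_h*out_w, kh*kw) matrix whose
    rows are the flattened kh-by-kw patches of x (stride 1, no padding)."""
    H, W = x.shape
    out_h, out_w = H - kh + 1, W - kw + 1
    cols = np.empty((out_h * out_w, kh * kw), dtype=x.dtype)
    for i in range(out_h):
        for j in range(out_w):
            cols[i * out_w + j] = x[i:i + kh, j:j + kw].ravel()
    return cols

def conv2d(x, kernel):
    """Valid-mode 2D correlation: im2col followed by one matmul."""
    kh, kw = kernel.shape
    out_h, out_w = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    return (im2col(x, kh, kw) @ kernel.ravel()).reshape(out_h, out_w)

x = np.arange(16, dtype=np.float64).reshape(4, 4)
k = np.ones((2, 2))          # all-ones kernel: each output is a 2x2 window sum
y = conv2d(x, k)             # shape (3, 3)
```

Trading the nested loops of a naive convolution for one large matmul is what makes im2col fast in practice: the heavy lifting lands in a highly optimized GEMM routine.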
Bounties
We really want a useful model working right out of the box! Our top bounty is getting the EfficientNet v2 model working in the examples folder.
- EfficientNet v2 (top priority)
Easy
- built-in MLP model
- binary cross entropy
- dropout layer
- flatten
Medium
- publish to PyPI
- simplify how context and gradients are handled
Hard
- Transformers
- Stable Diffusion
- Winograd Convs
- MPS support
- CUDA support
Contributing
Here are some basic guidelines for contributing:
- Reduce code
- Increase speed
- Add features
- In that order
Bug fixes are the best and always welcome. Conceptual cleanups are great. All features must include tests.
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: froog-0.1.2.tar.gz (10.4 kB)
Built Distribution: froog-0.1.2-py3-none-any.whl (10.4 kB)