froog
froog: fast real-time optimization of gradients
a beautifully compact machine-learning library
homepage | documentation | pip
froog is a super-simple machine learning framework, built to make creating tools with AI easy and efficient.
froog covers everything from linear regression to convolutional neural networks, all in under 1000 lines.
Installation
```shell
pip install froog
```
Overview of Features
- Custom Tensors
- Backpropagation
- Automatic Differentiation (autograd)
- Forward and backward passes
- ML Operations
- 2D Convolutions (im2col)
- Numerical gradient checking
- Acceleration methods (Adam)
- Avg & Max pooling
- EfficientNet inference
- GPU Support
- and a bunch more
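Numerical gradient checking (listed above) verifies an autograd engine by comparing analytic gradients against finite differences. The sketch below is a generic, plain-NumPy illustration of the idea, not froog's implementation:

```python
import numpy as np

def numerical_gradient(f, x, eps=1e-5):
    """Central-difference estimate of df/dx for a scalar-valued f."""
    grad = np.zeros_like(x)
    it = np.nditer(x, flags=["multi_index"])
    while not it.finished:
        idx = it.multi_index
        old = x[idx]
        x[idx] = old + eps   # nudge one entry up
        f_plus = f(x)
        x[idx] = old - eps   # nudge it down
        f_minus = f(x)
        x[idx] = old         # restore
        grad[idx] = (f_plus - f_minus) / (2 * eps)
        it.iternext()
    return grad

# check against the analytic gradient of f(x) = sum(x**2), which is 2*x
x = np.array([1.0, -2.0, 3.0])
num = numerical_gradient(lambda v: np.sum(v ** 2), x)
analytic = 2 * x
assert np.allclose(num, analytic, atol=1e-4)
```

If the finite-difference estimate and the backward pass disagree beyond a small tolerance, the gradient implementation has a bug.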
Sneak Peek
```python
from froog.tensor import Tensor
from froog.nn import Linear
import froog.optim as optim

class mnistMLP:
  def __init__(self):
    # two fully-connected layers: 784 -> 128 -> 10
    self.l1 = Tensor(Linear(784, 128))
    self.l2 = Tensor(Linear(128, 10))

  def forward(self, x):
    return x.dot(self.l1).relu().dot(self.l2).logsoftmax()

model = mnistMLP()
optimizer = optim.SGD([model.l1, model.l2], lr=0.001)
```
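For intuition, the chain `x.dot(self.l1).relu().dot(self.l2).logsoftmax()` corresponds to the following plain-NumPy computation (a standalone sketch for illustration, not froog code):

```python
import numpy as np

def forward(x, l1, l2):
    """Plain-NumPy equivalent of x.dot(l1).relu().dot(l2).logsoftmax()."""
    h = np.maximum(x @ l1, 0)                        # first linear layer + ReLU
    logits = h @ l2                                  # second linear layer
    # log-softmax, stabilized by subtracting each row's max before exponentiating
    shifted = logits - logits.max(axis=1, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 784))            # a batch of 4 flattened 28x28 images
l1 = rng.normal(size=(784, 128)) * 0.01
l2 = rng.normal(size=(128, 10)) * 0.01
out = forward(x, l1, l2)
assert out.shape == (4, 10)
# exponentiating the log-softmax output gives rows that sum to 1
assert np.allclose(np.exp(out).sum(axis=1), 1.0)
```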
Bounties
THERE'S LOTS OF STUFF TO WORK ON! VISIT THE BOUNTY SHOP
Pull requests will be merged if they:
- increase simplicity
- increase functionality
- increase efficiency
more info on contributing