MicroTorch
MicroTorch is a minimal implementation of the core ideas behind PyTorch, NumPy, TensorFlow, and other popular machine learning libraries. It is intended as a learning tool for understanding how neural networks, autograd, and other machine learning concepts work.
Installation
pip install microtorch
Usage
import microtorch as mt
# create a 1-dimensional tensor
x = mt.Tensor([1, 2, 3, 4, 5])
# create a 2-dimensional tensor
x = mt.Tensor([[1, 2, 3], [4, 5, 6]])
# create a 3-dimensional tensor
x = mt.Tensor([[[1, 2, 3], [4, 5, 6]], [[7, 8, 9], [10, 11, 12]]])
# create a 4-dimensional tensor
x = mt.Tensor([[[[1, 2, 3], [4, 5, 6]], [[7, 8, 9], [10, 11, 12]]], [[[1, 2, 3], [4, 5, 6]], [[7, 8, 9], [10, 11, 12]]]])
# create a tensor with random values
x = mt.rand(2, 3)
# create a tensor with zeros
x = mt.zeros(2, 3)
# create a tensor with ones
x = mt.ones(2, 3)
# create a tensor with a range
x = mt.arange(0, 10)
# create a tensor with a range and step
x = mt.arange(0, 10, 2)
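Since MicroTorch is meant as a learning tool for autograd, it can help to see the idea it implements in miniature. The sketch below is a standalone scalar autograd in pure Python; the `Value` class and its names are illustrative and are not part of the microtorch API, whose internals may differ.

```python
# Minimal reverse-mode autograd over scalars (illustrative sketch,
# not microtorch's actual implementation).
class Value:
    """A scalar that records the operations producing it, so gradients
    can be computed by walking the graph backwards."""
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological order: parents before their outputs
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x = Value(2.0)
y = Value(3.0)
z = x * y + x   # z = x*y + x, so dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

This is the same mechanism that makes `loss.backward()` work in the training loop below: every tensor operation records how to propagate gradients to its inputs.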
# create a neural network
model = mt.Sequential(
    mt.Linear(2, 5),
    mt.ReLU(),
    mt.Linear(5, 1),
    mt.Sigmoid()
)
# create loss function
loss_fn = mt.MSELoss()
# create optimizer
optimizer = mt.SGD(model.parameters(), lr=0.01)
# train model (assumes x is an input tensor of shape (N, 2)
# and y a target tensor of shape (N, 1))
for epoch in range(100):
    # forward pass
    y_pred = model(x)
    # compute loss
    loss = loss_fn(y_pred, y)
    # zero gradients from the previous step
    optimizer.zero_grad()
    # backward pass (autograd)
    loss.backward()
    # update weights
    optimizer.step()
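To make the loop above concrete, here is the same 2→5→1 network trained in plain NumPy with the gradients written out by hand, which is exactly the bookkeeping autograd automates. The XOR-style toy data, learning rate, and epoch count are illustrative choices, not defaults from microtorch.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy data: 2 inputs, 1 target per sample
x = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Linear(2, 5) -> ReLU -> Linear(5, 1) -> Sigmoid
W1 = rng.normal(0, 1, (2, 5)); b1 = np.zeros(5)
W2 = rng.normal(0, 1, (5, 1)); b2 = np.zeros(1)
lr = 0.5

for epoch in range(2000):
    # forward pass
    h_pre = x @ W1 + b1
    h = np.maximum(h_pre, 0.0)                  # ReLU
    y_pred = 1 / (1 + np.exp(-(h @ W2 + b2)))   # Sigmoid
    loss = np.mean((y_pred - y) ** 2)           # MSELoss
    if epoch == 0:
        loss0 = loss  # remember the initial loss

    # backward pass: hand-derived gradients (autograd does this for you)
    d_pred = 2 * (y_pred - y) / y.size          # dL/dy_pred
    d_logit = d_pred * y_pred * (1 - y_pred)    # through the sigmoid
    dW2 = h.T @ d_logit; db2 = d_logit.sum(0)
    d_h = d_logit @ W2.T
    d_hpre = d_h * (h_pre > 0)                  # through the ReLU
    dW1 = x.T @ d_hpre; db1 = d_hpre.sum(0)

    # SGD step: parameter -= lr * gradient
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {loss0:.4f} -> {loss:.4f}")
```

Note how each `optimizer.step()` in the MicroTorch loop corresponds to the four in-place parameter updates at the bottom, and `optimizer.zero_grad()` corresponds to discarding the previous iteration's `dW*`/`db*` values.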
License
Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Authors
- Mustafa Bozkaya - Initial work