A simple autograd library
Project description
Simple Torch
Implemented with NumPy.
autograd
tensor
Implements basic operations between tensors, recording both the dependencies between tensors and their gradients.
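As a rough illustration of how such a tensor might work (a minimal sketch, not the library's actual API: the `Tensor` class, its `depends_on` list, and `backward` method here are assumptions), each operation can store its parent tensors together with a local gradient rule, and `backward` can walk those dependencies:

```python
import numpy as np

class Tensor:
    """Minimal autograd-tensor sketch: holds data, a grad buffer, and
    the dependencies (parent tensors plus local backward rules)."""
    def __init__(self, data, requires_grad=False, depends_on=None):
        self.data = np.asarray(data, dtype=float)
        self.requires_grad = requires_grad
        self.depends_on = depends_on or []   # list of (parent, grad_fn)
        self.grad = np.zeros_like(self.data) if requires_grad else None

    def __add__(self, other):
        deps = []
        if self.requires_grad:
            deps.append((self, lambda g: g))            # d(a+b)/da = 1
        if other.requires_grad:
            deps.append((other, lambda g: g))           # d(a+b)/db = 1
        return Tensor(self.data + other.data,
                      requires_grad=bool(deps), depends_on=deps)

    def __mul__(self, other):
        deps = []
        if self.requires_grad:
            deps.append((self, lambda g: g * other.data))   # d(a*b)/da = b
        if other.requires_grad:
            deps.append((other, lambda g: g * self.data))   # d(a*b)/db = a
        return Tensor(self.data * other.data,
                      requires_grad=bool(deps), depends_on=deps)

    def backward(self, grad=None):
        if grad is None:
            grad = np.ones_like(self.data)   # seed with dL/dL = 1
        if self.grad is not None:
            self.grad = self.grad + grad     # accumulate into this node
        for parent, grad_fn in self.depends_on:
            parent.backward(grad_fn(grad))   # propagate to dependencies
```

For example, with `c = a * b + a`, calling `c.backward()` accumulates `b + 1` into `a.grad` and `a` into `b.grad`.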
function
Implements activation functions.
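The document does not say which activations are included, so as an assumed example, two common ones (ReLU and sigmoid) can be written as paired forward/backward NumPy functions; the function names here are illustrative, not the library's:

```python
import numpy as np

def relu(x):
    """ReLU forward: max(0, x) elementwise."""
    return np.maximum(x, 0.0)

def relu_backward(x, upstream):
    """ReLU backward: the gradient passes only where the input was positive."""
    return upstream * (x > 0.0)

def sigmoid(x):
    """Sigmoid forward: 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_backward(x, upstream):
    """Sigmoid backward: s * (1 - s) scales the upstream gradient."""
    s = sigmoid(x)
    return upstream * s * (1.0 - s)
```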
parameter
Quickly creates random tensors with requires_grad=True.
module
Records the parameters of a module.
optim
Implements optimizers that update the model's parameters.
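How these three pieces fit together might look like the following sketch, where a module discovers its parameters by inspecting its attributes and an optimizer steps over them. All class names and method signatures here (`Parameter`, `Module.parameters`, `SGD`) are assumptions modeled on common autograd libraries, not this package's confirmed API:

```python
import numpy as np

class Parameter:
    """Sketch of a parameter: random data, trainable, with a grad buffer."""
    def __init__(self, *shape):
        self.data = np.random.randn(*shape) * 0.1
        self.grad = np.zeros(shape)

class Module:
    """Sketch of a module that records its parameters automatically."""
    def parameters(self):
        for value in vars(self).values():
            if isinstance(value, Parameter):
                yield value
            elif isinstance(value, Module):    # recurse into submodules
                yield from value.parameters()

class SGD:
    """Sketch optimizer: plain gradient descent over a module's parameters."""
    def __init__(self, params, lr=0.01):
        self.params = list(params)
        self.lr = lr

    def step(self):
        for p in self.params:
            p.data -= self.lr * p.grad       # move against the gradient

    def zero_grad(self):
        for p in self.params:
            p.grad = np.zeros_like(p.grad)   # reset accumulated gradients
```

A module subclass then only needs to assign `Parameter` attributes; `parameters()` finds them, and the optimizer updates them in place.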
tests
Tests for the autograd functions.
Small games for testing performance:
fizz_buzz
simple_learned_function
minimize_a_function
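To give a flavor of the simplest of these games, minimizing a function comes down to repeated gradient-descent steps. This is a generic sketch of the idea, not the package's actual `minimize_a_function` script; the `minimize` helper and the target function are assumptions for illustration:

```python
import numpy as np

def minimize(f, grad_f, x0, lr=0.1, steps=200):
    """Plain gradient descent: repeatedly step against the gradient of f."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad_f(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3);
# the minimum is at x = 3.
x_min = minimize(lambda x: (x - 3.0) ** 2,
                 lambda x: 2.0 * (x - 3.0),
                 x0=0.0)
```

With an autograd library, `grad_f` would come from `backward()` rather than being written by hand.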
Download files
Source Distribution
Torch-Yottaxx-0.1.2.tar.gz (4.8 kB)
Built Distribution
Hashes for Torch_Yottaxx-0.1.2-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 855f38b2bb8a99866f02237db75d48d4689274695210a0d7e9a51d86c4a782f2
MD5 | e3e65bac5d5f45a4dbae396ecb5e1d60
BLAKE2b-256 | fd96e8992009d4e20cad2221a0dc228a0ecd2102035dfcd350ca60d991c2bbea