nanotorch: Small-scale implementation of PyTorch from the ground up.
Etymology: nano (Small) and torch (PyTorch)
This project, a miniature implementation of the PyTorch library, is crafted with the primary goal of elucidating the intricate workings of neural network libraries. It serves as a pedagogical tool for those seeking to unravel the mathematical complexities and the underlying architecture that powers such sophisticated libraries.
NOTE: This project is based on the excellent work done by Andrej Karpathy in his micrograd project.
Installation
Install the latest version of NanoTorch using pip:
pip install -U git+https://github.com/xames3/nanotorch.git#egg=nanotorch
Objective
The cornerstone of this endeavor is to provide a hands-on learning experience by replicating key components of PyTorch, thereby granting insights into its functional mechanisms. This bespoke implementation focuses on the core aspects of neural network computation, including tensor operations, automatic differentiation, and basic neural network modules.
Features
1. Tensor Operations: At the heart of this implementation lie tensor operations, which are the building blocks of any neural network library. As of now, our tensors support basic arithmetic functionalities found in PyTorch.
>>> a = nanotorch.tensor(2.0)
>>> b = nanotorch.tensor(3.0)
>>> a + b
tensor(5.0)
>>> a - 6
tensor(-4.0)
>>> c = a + b
>>> c += 2 * a / b
>>> c = c ** 3
>>> nanotorch.arange(5)
[tensor(0), tensor(1), tensor(2), tensor(3), tensor(4)]
>>> nanotorch.arange(1, 4)
[tensor(1), tensor(2), tensor(3)]
>>> nanotorch.arange(1, 2.5, 0.5)
[tensor(1), tensor(1.5), tensor(2.0)]
2. Automatic Differentiation: A pivotal feature of this project is a simplistic version of automatic differentiation, akin to PyTorch’s autograd. It allows for the computation of gradients automatically, which is essential for training neural networks.
>>> c.backward()
>>> print(a.grad) # prints 200.55, the gradient of c with respect to a, i.e. dc/da
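To illustrate what happens under the hood, here is a minimal, self-contained sketch of reverse-mode automatic differentiation in the spirit of micrograd. This is not nanotorch's actual source; the `Value` class and its methods are simplified stand-ins for the real tensor type.

```python
class Value:
    """A scalar that records the operations producing it, so gradients
    can be propagated backwards through the computation graph."""

    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = 1 and d(a+b)/db = 1, scaled by the upstream gradient
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # product rule: d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Sort the graph topologically, then apply the chain rule
        # from the output node back to the leaves.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a, b = Value(2.0), Value(3.0)
c = (a + b) * a
c.backward()
print(a.grad)  # dc/da = 2a + b = 7.0
print(b.grad)  # dc/db = a = 2.0
```

Each operation stores a closure that knows its local derivative; `backward()` simply replays those closures in reverse topological order, accumulating gradients via the chain rule.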
3. Neural Network Modules: The implementation includes rudimentary neural network modules such as linear layers and activation functions. These modules can be composed to construct simple neural network architectures.
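The sketch below shows how such modules compose, using plain floats for the forward pass only. The `Neuron` and `Linear` names here are illustrative and do not reflect nanotorch's actual API.

```python
import math
import random

class Neuron:
    """A single unit: weighted sum of inputs plus bias, through tanh."""
    def __init__(self, n_in):
        self.w = [random.uniform(-1, 1) for _ in range(n_in)]
        self.b = 0.0

    def __call__(self, x):
        act = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        return math.tanh(act)

class Linear:
    """A layer of independent neurons sharing the same input."""
    def __init__(self, n_in, n_out):
        self.neurons = [Neuron(n_in) for _ in range(n_out)]

    def __call__(self, x):
        return [n(x) for n in self.neurons]

# Compose two layers into a tiny MLP: 3 inputs -> 4 hidden -> 1 output
layer1 = Linear(3, 4)
layer2 = Linear(4, 1)
out = layer2(layer1([1.0, -2.0, 3.0]))
print(len(out))  # 1
```

Because each layer is just a callable, stacking layers is ordinary function composition, which is the same design principle behind PyTorch's `nn.Module`.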
4. Optimizers and Loss Functions: Basic optimizers like SGD and common loss functions are included to facilitate the training process of neural networks.
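As a minimal illustration of the training loop these pieces enable, here is SGD fitting a one-parameter model under mean-squared-error loss, with the gradient written out by hand rather than via autograd. This is a sketch, not nanotorch code.

```python
# Fit y = w * x to data generated with w = 2, using plain SGD on MSE.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]

w, lr = 0.0, 0.1
for _ in range(50):
    pred = [w * x for x in xs]
    # MSE loss L = (1/N) * sum((w*x - y)^2), so dL/dw = (2/N) * sum((w*x - y) * x)
    grad = 2 * sum((p - y) * x for p, y, x in zip(pred, ys, xs)) / len(xs)
    w -= lr * grad  # the SGD update: step against the gradient
print(round(w, 3))  # converges to 2.0
```

Swapping in autograd-computed gradients and a multi-parameter model turns this loop into the standard training recipe used throughout the library.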
Educational Value
This project stands as a testament to the educational philosophy of learning by doing. It is particularly beneficial for:
- Students and enthusiasts who aspire to gain a profound understanding of the inner workings of neural network libraries.
- Developers and researchers seeking to customize or extend the functionalities of existing deep learning libraries for their specific requirements.
Usage and Documentation
The codebase is structured to be intuitive and mirrors the design principles of PyTorch to a significant extent. Comprehensive docstrings are provided for each module and function, ensuring clarity and ease of understanding. Users are encouraged to delve into the code, experiment with it, and modify it to suit their learning curve.
Contributions and Feedback
Contributions to this project are warmly welcomed. Whether it’s refining the code, enhancing the documentation, or extending the current feature set, your input is highly valued. Feedback, whether constructive criticism or commendation, is equally appreciated and will be instrumental in the evolution of this educational tool.
Acknowledgments
This project is inspired by the remarkable work done by the PyTorch development team. It is a tribute to their contributions to the field of machine learning and the open-source community at large.
Project Links
Source Code: https://github.com/xames3/nanotorch
Issue Tracker: https://github.com/xames3/nanotorch/issues