A tiny Torch-like scalar autograd engine: Nano-AutoGrad is a micro-framework with a small PyTorch-like neural network library on top.

Project description

Nano-AutoGrad

This project provides a lightweight Python micro-framework for building and training neural networks from scratch, based on automatic differentiation and a computational graph engine.

Nano-AutoGrad Logo

Introduction

Nano-AutoGrad is a micro-framework that allows you to build and train neural networks from scratch based on automatic differentiation and computational graphs.

Installation

You can install Nano-AutoGrad using pip:

pip install nano-autograds
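
After installing, a quick smoke test confirms the package is available. Note that the distribution is named nano-autograds, while the usage examples below import the top-level package as autograd; the import line here is therefore an assumption inferred from those examples.

    # Check the installed distribution and (assumed) top-level import name.
    from importlib.metadata import version

    print(version("nano-autograds"))  # prints the installed version, e.g. 1.1.1
    import autograd                   # import name assumed from the usage examples below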

Features

Nano-AutoGrad offers the following features:

  • Automatic Differentiation: Nano-AutoGrad automatically computes gradients, making it easy to perform gradient-based optimization (see the sketch after this list).
  • Computational Graph Engine: It leverages a computational graph representation to efficiently compute gradients and perform backpropagation.
  • Lightweight and Efficient: Nano-AutoGrad is designed to be lightweight and efficient, suitable for small to medium-sized neural networks.
  • Easy-to-Use API: The framework provides a simple and intuitive API, allowing users to define and train neural networks with ease.
  • Integration with NumPy: Nano-AutoGrad integrates with NumPy, enabling efficient array operations and computations.
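
To make the automatic-differentiation and computational-graph points concrete, here is a minimal, self-contained sketch of how a micrograd-style scalar engine records a graph and back-propagates through it. It is purely conceptual: the Value class and its methods below are illustrative and are not Nano-AutoGrad's own API.

    # Conceptual sketch only -- not Nano-AutoGrad's actual classes.
    # Each Value remembers its parents, forming a computational graph;
    # backward() walks that graph in reverse topological order and
    # applies the chain rule.
    class Value:
        def __init__(self, data, _parents=()):
            self.data = data
            self.grad = 0.0
            self._parents = _parents
            self._backward = lambda: None

        def __add__(self, other):
            out = Value(self.data + other.data, (self, other))
            def _backward():
                self.grad += out.grad          # d(out)/d(self) = 1
                other.grad += out.grad         # d(out)/d(other) = 1
            out._backward = _backward
            return out

        def __mul__(self, other):
            out = Value(self.data * other.data, (self, other))
            def _backward():
                self.grad += other.data * out.grad   # d(out)/d(self) = other
                other.grad += self.data * out.grad   # d(out)/d(other) = self
            out._backward = _backward
            return out

        def backward(self):
            # Topologically sort the graph, then propagate gradients backwards.
            topo, seen = [], set()
            def build(v):
                if v not in seen:
                    seen.add(v)
                    for p in v._parents:
                        build(p)
                    topo.append(v)
            build(self)
            self.grad = 1.0
            for v in reversed(topo):
                v._backward()

    x, w, b = Value(3.0), Value(-2.0), Value(1.0)
    y = w * x + b          # y.data == -5.0
    y.backward()
    print(x.grad, w.grad)  # dy/dx = -2.0, dy/dw = 3.0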

Usage

To get started with Nano-AutoGrad, refer to the documentation for detailed usage instructions, examples, and an API reference. Here are some basic steps for building and training a neural network with Nano-AutoGrad; a sketch of a full training loop follows the two examples below:

  • Example 1: building a small network with the core engine

    import numpy as np
    import autograd.core.nn as nn         # core-engine Module and Linear layers
    import autograd.torch.optim as optim  # optimizers such as SGD

    class MyNeuralNetwork(nn.Module):
        def __init__(self):
            super().__init__()
            self.linear = nn.Linear(2, 1)  # 2 input features -> 1 output

        def forward(self, x):
            return self.linear(x)

    network = MyNeuralNetwork()
    optimizer = optim.SGD(network.parameters(), lr=0.1)
    
  • Example 2: building a multi-layer model of Linear layers with the torch-style autograd engine

    import autograd.torch.nn as nn
    import autograd.torch.optim as optim
    import autograd.functional as F
    from autograd.torch.tensor import Tensor  # Tensor type for inputs and targets

    class Model(nn.Module):
        def __init__(self):
            super().__init__()
            self.l1 = nn.Linear(784, 1568, name='l1')
            self.l2 = nn.Linear(1568, 392, name='l2')
            self.l3 = nn.Linear(392, 10, name='l3')

        def forward(self, x):
            z = F.relu(self.l1(x))
            z = F.relu(self.l2(z))
            out = F.log_softmax(self.l3(z))
            return out

    num_epochs = 10  # total training epochs, used by the scheduler below
    model = Model()
    optimizer = optim.SGD(model.parameters(), lr=5e-2, weight_decay=1e-4)
    scheduler = optim.lr_scheduler.LinearLR(optimizer, start_factor=1.0,
                                            end_factor=0.75, total_iters=num_epochs)
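
Both examples stop after constructing a model and an optimizer. A typical training loop then alternates forward pass, backpropagation, and parameter update, roughly as in the sketch below. Treat it as an illustration only: the mini-batch iterator (train_loader), the loss function (F.nll_loss), and the zero_grad()/backward()/step() methods are assumptions modelled on the PyTorch-style API these examples imitate, not confirmed Nano-AutoGrad calls.

    # Illustrative training loop for the Example 2 model (method and helper
    # names are assumptions mirroring PyTorch conventions).
    for epoch in range(num_epochs):
        for x_batch, y_batch in train_loader:      # hypothetical mini-batch iterator
            optimizer.zero_grad()                  # reset accumulated gradients
            log_probs = model(x_batch)             # forward pass through the graph
            loss = F.nll_loss(log_probs, y_batch)  # negative log-likelihood on log-probs
            loss.backward()                        # reverse-mode autodiff
            optimizer.step()                       # SGD parameter update
        scheduler.step()                           # anneal the learning rate once per epoch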
    

Examples

The Nano-AutoGrad repository provides various examples demonstrating how to use the framework for different tasks, such as linear regression, classification, and more. You can explore the examples directory in the repository to gain a better understanding of how to use Nano-AutoGrad in practice.

  • Nano_AutoGrads_tutorial_Linear_model (open in Colab)
  • Nano_AutoGrads_tutorial_Sparse_Networks (open in Colab)
  • Using Nano-AutoGrad to classify MNIST handwritten digits (open in Colab)

Contributions

Contributions to Nano-AutoGrad are welcome! If you have any bug reports, feature requests, or want to contribute code, please open an issue or submit a pull request on the official GitHub repository.

License

Nano-AutoGrad is released under the MIT License. Please see the LICENSE file in the repository for more details.

Acknowledgements

We would like to thank the contributors and the open-source community for their valuable contributions to Nano-AutoGrad.

Contact

For any inquiries or further information, you can reach out to the project maintainer, Youness El Brag, via email at youness.elbrag@example.com.

Credits:

  1. micrograd by Andrej Karpathy
  2. ugrad by conscell

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nano-autograds-1.1.1.tar.gz (16.7 kB)

Uploaded Source

Built Distribution

nano_autograds-1.1.1-py3-none-any.whl (19.1 kB)

Uploaded Python 3

File details

Details for the file nano-autograds-1.1.1.tar.gz.

File metadata

  • Download URL: nano-autograds-1.1.1.tar.gz
  • Upload date:
  • Size: 16.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.17

File hashes

Hashes for nano-autograds-1.1.1.tar.gz:

  • SHA256: 4c11044a483a7411bb0c102af50bb1343fa5ac7ad43492a0b293e114af65a6b4
  • MD5: e9ec4d0e277755f5cfe4c05153610fed
  • BLAKE2b-256: 48cd40458f91de54a6c8b91fcded5883d7bd1b90cc6febcee35ca3d050b9e7a9

See more details on using hashes here.
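
As an illustration of how these digests can be used, a downloaded copy of the sdist can be checked against the SHA256 value above with a few lines of standard-library Python (the filename and expected digest are taken verbatim from this listing):

    # Verify a downloaded nano-autograds-1.1.1.tar.gz against the SHA256 digest above.
    import hashlib

    expected = "4c11044a483a7411bb0c102af50bb1343fa5ac7ad43492a0b293e114af65a6b4"
    with open("nano-autograds-1.1.1.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    print("OK" if digest == expected else "hash mismatch")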

File details

Details for the file nano_autograds-1.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for nano_autograds-1.1.1-py3-none-any.whl:

  • SHA256: c059ed8d942bba2d83900150c74774256afc0aad27c0e846c796b522d56f7699
  • MD5: e3b28f709131413b39d7d4d780501878
  • BLAKE2b-256: 6fe4edc62cca4371866089cf1167c3a6b172a36a8d546100ad3bc2f35d50aa8e

See more details on using hashes here.
