
PyDyNet: Neural Network (MLP, CNN, RNN, Transformer, ...) implementation using NumPy with Autodiff

Project description

PyDyNet: NumPy-based Dynamic Deep Learning Framework

Chinese README: cnREADME.md


Towards Large Language Model

In the summer of 2025, I restarted development of PyDyNet after a two-year hiatus. PyDyNet now includes a pure-inference implementation of Llama3 (6-layer Transformer, vocab size 32000), inspired by the NumPy version and dataset available here. To run it, download the dataset into the llm/llama folder and execute:

>>> python -m llm.llama.infer

There was a boy named Timmy. He loved to play with hi toy and run around outside. One day, Timmy' mom asked him to help her with the laundry. Timmy didn't want to help because he wanted to play. But hi mom said, "Timmy, you need to help me. It' important to help out."
Timmy didn't want to help, but he knew he had to. So, he put on hi shoe and went outside to help hi mom. A they were folding the clothe, Timmy saw a big pile of laundry on the floor. He wanted to help, so he started to pick it up. But then, he accidentally knocked over a pile of clothe and they fell on him. Timmy wa okay, but he felt bad.
Hi mom saw what happened and said, "Timmy, you need to be more careful. You could have hurt yourself." Timmy felt bad and said sorry. Hi mom hugged him and said, "It' okay, accident happen. Let' clean up the laundry together." Timmy learned that it' important to be careful and help out when you need it.

Token count: 262, elapsed: 0.87s, 300 tokens/s
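The decoding loop inside such an inference script boils down to repeatedly running the model and appending the most likely next token. A minimal NumPy sketch, where `next_token_logits` is a hypothetical stand-in for the real Transformer forward pass (not PyDyNet's actual API):

```python
import numpy as np

def next_token_logits(tokens, vocab_size=32000):
    # Stand-in for a real Transformer forward pass: deterministic dummy logits.
    rng = np.random.default_rng(tokens[-1])
    return rng.standard_normal(vocab_size)

def greedy_decode(prompt, max_new_tokens=8, eos_id=2):
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        logits = next_token_logits(tokens)
        next_id = int(np.argmax(logits))   # greedy: pick the most likely token
        if next_id == eos_id:              # stop at end-of-sequence
            break
        tokens.append(next_id)
    return tokens

out = greedy_decode([1, 15], max_new_tokens=4)
```

Real samplers usually replace the `argmax` with temperature or top-p sampling, which is what produces varied stories like the one above.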

We also implemented a pure-inference version of CLIP, inspired by the NumPy version and dataset available in NPCLIP. To run it, copy the data folder of NPCLIP into the llm/clip folder and execute:

>>> python -m llm.clip.infer
Label probs: [0.000953   0.48176003 0.51728696]

for the image below and the query ["a fish", "a dog", "a cat"].

cat_dog
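The `Label probs` vector above comes from the standard CLIP recipe: L2-normalize the image and text embeddings, take cosine similarities, and apply a softmax. A self-contained NumPy sketch with random stand-in embeddings (the real model produces the embeddings with its image and text encoders):

```python
import numpy as np

def clip_label_probs(image_emb, text_embs, temperature=100.0):
    # L2-normalize both sides so the dot product is a cosine similarity.
    img = image_emb / np.linalg.norm(image_emb)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    logits = temperature * (txt @ img)     # one logit per candidate label
    e = np.exp(logits - logits.max())      # stable softmax
    return e / e.sum()

rng = np.random.default_rng(0)
probs = clip_label_probs(rng.standard_normal(8), rng.standard_normal((3, 8)))
```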

Overview

PyDyNet is a neural network framework implemented entirely in NumPy (with CuPy support since version 0.0.7, using the same API). Its syntax is inspired by PyTorch, and its structure is as follows:

graph LR
   N(numpy/cupy.ndarray) --Backend--> A(Tensor) --> ds(Dataset) ---> Data(DataLoader) ---> Mission
   A --Eager execution--> B(Basic operators:<br>add, exp, etc.)
   B -. Autograd .-> A
   B --> CO(Complex<br>operators) --> f(Function:<br>img2col, etc.) --> M(Basic Module:<br>Linear, etc.) --> CM(Advanced Module:<br>CNN, RNN, Transformer, etc.) --> Mission(Learning task)
   A --> GD(Optimizer:<br>SGD, Adam, etc.) ---> LS(lr_scheduler:<br>StepLR, etc.) ---> Mission

Dashed lines indicate that users can disable automatic differentiation using no_grad.
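The Autograd loop in the diagram, where basic operators record gradients back into the Tensor, can be illustrated with a toy reverse-mode autodiff sketch in plain NumPy. This is a self-contained reimplementation of the idea for illustration, not PyDyNet's actual Tensor class:

```python
import numpy as np

class Tensor:
    """Toy eager tensor with reverse-mode autodiff over add and exp."""

    def __init__(self, data, parents=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self.parents = parents
        self.backward_fn = None    # set by the operator that created this node

    def __add__(self, other):
        out = Tensor(self.data + other.data, (self, other))
        def backward_fn(g):        # d(a+b)/da = d(a+b)/db = 1
            self.grad += g
            other.grad += g
        out.backward_fn = backward_fn
        return out

    def exp(self):
        out = Tensor(np.exp(self.data), (self,))
        def backward_fn(g):        # d/dx exp(x) = exp(x)
            self.grad += g * out.data
        out.backward_fn = backward_fn
        return out

    def backward(self):
        # Topologically sort the graph, then push gradients from the output.
        topo, seen = [], set()
        def visit(t):
            if id(t) not in seen:
                seen.add(id(t))
                for p in t.parents:
                    visit(p)
                topo.append(t)
        visit(self)
        self.grad = np.ones_like(self.data)
        for t in reversed(topo):
            if t.backward_fn is not None:
                t.backward_fn(t.grad)

x = Tensor(1.0)
y = x.exp() + x    # y = e^x + x, so dy/dx = e^x + 1
y.backward()
```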

Install

git clone https://github.com/Kaslanarian/PyDyNet
cd PyDyNet
python setup.py install

We are actively working on a pip installation option.

Example

Examples can be found in the examples/pydynet directory, with equivalent PyTorch implementations in examples/pytorch. To run an example, use:

python -m examples.pydynet.xxx

Automatic Differentiation

The example autodiff1d.py demonstrates automatic differentiation by performing gradient descent on a one-dimensional convex function:

ad1
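Stripped of the framework, the loop that this example animates is plain gradient descent; a minimal sketch on an illustrative convex function (not the exact function or learning rate used in autodiff1d.py):

```python
import numpy as np

def grad_f(x):
    # Analytic gradient of the convex function f(x) = (x - 3)^2.
    return 2.0 * (x - 3.0)

x, lr = 10.0, 0.1
for _ in range(200):
    x -= lr * grad_f(x)    # step against the gradient toward the minimum
```

In the example itself, the gradient comes from PyDyNet's autodiff rather than a hand-written `grad_f`.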

A multi-variable convex function example is provided in autodiff2d.py:

ad2

MLP & LeNet

The example mlp_cnn.py uses MLP and LeNet to classify MNIST digits. The training and testing accuracies are shown below:

dnn

Dropout & Batch Normalization

The example mlp_dropout_bn.py compares the performance of three networks on the fetch_olivetti_faces dataset (64×64 pixel images):

  1. Three-layer MLP;
  2. Three-layer MLP with Dropout;
  3. Three-layer MLP with Batch Normalization.
cnn
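The two regularization layers being compared have simple NumPy cores; a sketch of training-time behavior (inference-time running statistics for batch norm are omitted, and these are generic formulations rather than PyDyNet's modules):

```python
import numpy as np

def dropout(x, p=0.5, train=True, rng=None):
    # Inverted dropout: scale kept units at train time so inference is a no-op.
    if not train or p == 0:
        return x
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch, then scale and shift.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

x = np.random.default_rng(0).standard_normal((64, 5))
y = batch_norm(x, gamma=1.0, beta=0.0)
```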

Recurrent Neural Network (RNN)

The example ts_prediction.py demonstrates time series prediction using a GRU:

RNN
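A single GRU step, the recurrence at the heart of that example, can be written directly in NumPy (the weight names below are the generic textbook layout, not necessarily PyDyNet's parameter layout):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    z = sigmoid(x @ Wz + h @ Uz)              # update gate
    r = sigmoid(x @ Wr + h @ Ur)              # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)  # candidate hidden state
    return (1.0 - z) * h + z * h_tilde        # interpolate old and new state

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 3))               # batch of 2, input dim 3
h = np.zeros((2, 4))                          # hidden dim 4
W = [rng.standard_normal((3, 4)) for _ in range(3)]
U = [rng.standard_normal((4, 4)) for _ in range(3)]
h_new = gru_cell(x, h, W[0], U[0], W[1], U[1], W[2], U[2])
```

Time-series prediction unrolls this cell over the input sequence and reads the forecast off the final hidden state.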

Transformer

The example transformer.py shows how to train a text classification model using a Transformer. The training results are as follows:

transformer

Dataset (CoLA) link: https://nyu-mll.github.io/CoLA/cola_public_1.1.zip
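The core operation of every Transformer layer is scaled dot-product attention, which fits in a few lines of NumPy (a generic sketch, not PyDyNet's implementation):

```python
import numpy as np

def attention(Q, K, V):
    # softmax(Q K^T / sqrt(d)) V over the last two axes.
    d = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
    return weights @ V

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((2, 5, 8)) for _ in range(3))
out = attention(Q, K, V)
```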

Cuda Acceleration

PyDyNet supports CUDA acceleration through CuPy. To use it, simply install CuPy; the API is the same as with NumPy. The table below compares per-epoch training time on CPU (NumPy) and GPU (CuPy) on an NVIDIA GeForce RTX 4090:

| Network structure | Dataset | CPU time per epoch (s) | GPU time per epoch (s) |
| --- | --- | --- | --- |
| 3-layer MLP | MNIST (80000×574) | 7.256±0.138 | 1.203±0.0181 |
| LeNet | MNIST (80000×574) | 239.664±2.108 | 2.841±0.026 |
| 1-layer Transformer (dim=512, head=4) | CoLA (8551×45×64) | 17.503±0.251 | 1.075±0.002 |
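Because CuPy mirrors the NumPy API, backend selection can be as simple as an import fallback; a minimal sketch of the idea (PyDyNet's actual device handling may differ):

```python
import numpy as np

try:
    import cupy as xp   # use the GPU backend when CuPy is installed
except ImportError:
    xp = np             # otherwise fall back to NumPy on the CPU

# The same array code runs unchanged on either backend.
x = xp.arange(6, dtype=xp.float64).reshape(2, 3)
y = xp.exp(x).sum()
```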

Project details


Download files


Source Distribution

pydynet-1.0.tar.gz (29.7 kB view details)

Uploaded Source

Built Distribution


pydynet-1.0-py3-none-any.whl (32.6 kB view details)

Uploaded Python 3

File details

Details for the file pydynet-1.0.tar.gz.

File metadata

  • Download URL: pydynet-1.0.tar.gz
  • Upload date:
  • Size: 29.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for pydynet-1.0.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 15e1ddce6b11a4c4a398dc55f38b00ce608544a42d54ac429dfa77d48ba782d7 |
| MD5 | 5113276ed7816b41ab867f44e448c26f |
| BLAKE2b-256 | ca24ff65e353496d2690967ea18294d1e08150f6c461ad68e8ca12f372e3ce9c |


Provenance

The following attestation bundles were made for pydynet-1.0.tar.gz:

Publisher: python-publish.yml on WeltXing/PyDyNet

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file pydynet-1.0-py3-none-any.whl.

File metadata

  • Download URL: pydynet-1.0-py3-none-any.whl
  • Upload date:
  • Size: 32.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for pydynet-1.0-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | fad11cda4fe043e318c412a47530c090e3186b34a25bfe530fc6b75527c712b4 |
| MD5 | ac207262f3ab7db93382528cc6237b66 |
| BLAKE2b-256 | 9fda2fac620fe22e8669b602b3bb09c7f0c8217198f27468f6fb81fde2931eb2 |


Provenance

The following attestation bundles were made for pydynet-1.0-py3-none-any.whl:

Publisher: python-publish.yml on WeltXing/PyDyNet

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
