A simple and efficient pure-Python neural network library for building multilayer neural networks with ease.

Project description

MultiLayer Neural Network (ak_pynn)

A simple and efficient pure-Python neural network library for building, visualizing, and deploying deep learning ANN models, optimized for performance.

  • Optimized for performance
  • Better Visualization
  • Cross platform

Authors

  • Ankit kohli

License

MIT

Support

For support, email contact.ankitkohli@gmail.com

Features

  • Efficient implementations of activation functions and their gradients

    • Sigmoid
    • ReLU
    • Leaky ReLU
    • Softmax
    • Softplus
    • Tanh
    • ELU
    • Linear
  • Efficient implementations of loss functions and their gradients

    • Mean squared error
    • Mean absolute error
    • Binary cross entropy
    • Categorical cross entropy
  • Several methods for weights initialization

    • 'random uniform', 'random normal'
    • 'Glorot Uniform', 'Glorot Normal'
    • 'He Uniform', 'He Normal'
  • Neural network optimization using

    • Gradient Descent (Batch / SGD / Mini-Batch)
    • Momentum
    • Adagrad
    • RMSprop
    • Adam
  • Regularizations

    • L1 Norm
    • L2 Norm
    • L1_L2 Norm
    • Dropouts
  • Batch Normalization

  • Early Stopping

  • Validation Splits

  • Predict Scores
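As a rough illustration of what the activation and loss features above involve (this is a generic sketch, not the library's actual source), each function is paired with its gradient for use in backpropagation:

```python
# Illustrative sketch only -- not ak_pynn internals. Shows the usual
# "function + gradient" pairing for one activation (ReLU) and one loss (MSE).

def relu(x):
    # ReLU: max(0, x), applied elementwise
    return [max(0.0, v) for v in x]

def relu_grad(x):
    # Gradient of ReLU: 1 where x > 0, else 0
    return [1.0 if v > 0 else 0.0 for v in x]

def mse(y_true, y_pred):
    # Mean squared error over a batch of scalar outputs
    n = len(y_true)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n

def mse_grad(y_true, y_pred):
    # d(MSE)/d(y_pred) = 2 * (y_pred - y_true) / n
    n = len(y_true)
    return [2.0 * (p - t) / n for t, p in zip(y_true, y_pred)]
```

During training, the forward pass evaluates the functions and the backward pass chains the gradients, which is why both halves of each pair are implemented.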

Installation

Install the release (stable) version from PyPI:

pip install ak-pynn

Usage/Examples

Import

from ak_pynn.mlp import MLP

Usage

model = MLP()
model.add_layer(4, input_layer=True)
model.add_layer(10, activation_function='relu', batch_norm=True)
model.add_layer(10, activation_function='relu', dropouts=True)
model.add_layer(10, activation_function='relu')
model.add_layer(3, activation_function='softmax', output_layer=True)
model.compile_model(optimizer='Adam', loss_function='mse', metrics=['mse', 'accuracy'])

Output


                                ( MODEL SUMMARY )                        
        
        ===================================================================
               Layer           Activation    Output Shape      Params    
        ===================================================================

               Input             linear       (None, 4)          0       
        -------------------------------------------------------------------

               Dense              relu        (None, 10)         50      
        -------------------------------------------------------------------

         BatchNormalization       None        (None, 10)         40      
        -------------------------------------------------------------------

               Dense              relu        (None, 10)        110      
        -------------------------------------------------------------------

              Dropout             None        (None, 10)         0       
        -------------------------------------------------------------------

               Dense              relu        (None, 10)        110      
        -------------------------------------------------------------------

               Output           softmax       (None, 3)          33      
        -------------------------------------------------------------------

        ===================================================================

        Total Params  - 343
        Trainable Params  - 323
        Non-Trainable Params  - 20
        ___________________________________________________________________
              

Visualizing model

model.visualize()

Training the model

import matplotlib.pyplot as plt

model.fit(X_train, Y_train, epochs=200, batch_size=32, verbose=False,
          early_stopping=False, patience=3, validation_split=0.2)
model.predict_scores(X_test, Y_test, metrics=['accuracy', 'precision', 'macro_recall'])

plt.plot(model.history['Val_Losses'])
plt.plot(model.history['Losses'])
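The early_stopping and patience arguments above follow the usual convention: training halts once the validation loss stops improving for `patience` consecutive epochs. A generic sketch of that logic (illustrative only, not the library's internals):

```python
# Generic early-stopping check (illustrative; not ak_pynn internals).
# Training stops once validation loss has not improved for `patience` epochs.

def should_stop(val_losses, patience=3):
    """Return True when the last `patience` epochs show no improvement
    over the best validation loss seen before them."""
    if len(val_losses) <= patience:
        return False
    best_before = min(val_losses[:-patience])
    recent = val_losses[-patience:]
    return min(recent) >= best_before
```

With patience=3, a run whose validation loss bottoms out and then drifts upward for three epochs would trigger the stop, while a still-improving run would not.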

TESTS

@mnist_test

@iris_test

@mlp_demo

Citation

If you use this library and would like to cite it, you can use:

Ankit kohli, "ak-pynn: Neural Network library", 2023. [Online]. Available: https://github.com/ankit869/ak-pynn. [Accessed: DD- Month- 20YY].

or:

@Misc{,
  author = {Ankit kohli},
  title  = {ak-pynn: Neural Network library},
  month  = may,
  year   = {2023},
  note   = {Online; accessed <today>},
  url    = {https://github.com/ankit869/ak-pynn},
}

Project details


Download files

Download the file for your platform.

Source Distribution

ak_pynn-0.1.8.tar.gz (16.7 kB)

Uploaded Source

Built Distribution

ak_pynn-0.1.8-py3-none-any.whl (15.1 kB)

Uploaded Python 3

File details

Details for the file ak_pynn-0.1.8.tar.gz.

File metadata

  • Download URL: ak_pynn-0.1.8.tar.gz
  • Upload date:
  • Size: 16.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.4

File hashes

Hashes for ak_pynn-0.1.8.tar.gz

  • SHA256: 9b0b618102724a5f21b44decd092f7b10a52a886e8d18f5d6db656000add2619
  • MD5: 67e8bddb2705a4462aacd7309d0c5053
  • BLAKE2b-256: 465f42c8e950376e88a25406fa849d9a30651cfd70ab373b19d604633e5b09a4

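If you download the sdist manually, you can verify it against the SHA256 digest above using only the standard library (a generic snippet, not project-specific tooling):

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    # Stream the file in chunks so large archives need not fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest (from the table above):
# expected = "9b0b618102724a5f21b44decd092f7b10a52a886e8d18f5d6db656000add2619"
# assert sha256_of("ak_pynn-0.1.8.tar.gz") == expected
```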

File details

Details for the file ak_pynn-0.1.8-py3-none-any.whl.

File metadata

  • Download URL: ak_pynn-0.1.8-py3-none-any.whl
  • Upload date:
  • Size: 15.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.4

File hashes

Hashes for ak_pynn-0.1.8-py3-none-any.whl

  • SHA256: 31b7bf8332ec94e686cc4b2c61bea840b446f9c3c7801fbb6f25a731a0496db5
  • MD5: 95ab3a9a64385bf66724745d11bd8db7
  • BLAKE2b-256: 4e08ccd4ee4289ebb27fdc1770077df5adec52cdd63f0a225814a779e0a3109a

