A simple and efficient pure-Python neural network library that lets you build multilayer neural networks with ease.
MultiLayer Neural Network (ak_pynn)
A simple and efficient pure-Python neural network library for building, visualizing, and deploying deep-learning ANN models, with a focus on performance.
- Optimized for performance
- Better Visualization
- Cross platform
Support
For support, email contact.ankitkohli@gmail.com
Features

- Efficient implementations of activation functions and their gradients
  - Sigmoid
  - ReLU
  - Leaky ReLU
  - Softmax
  - Softplus
  - Tanh
  - ELU
  - Linear
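These activations follow their standard textbook definitions. As an illustration only (a minimal pure-Python sketch, not the library's internal implementation), two of them and their gradients look like this:

```python
import math

def sigmoid(x):
    # sigma(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # d/dx sigma(x) = sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    # max(0, x)
    return max(0.0, x)

def relu_grad(x):
    # 1 for x > 0, else 0
    return 1.0 if x > 0 else 0.0
```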
- Efficient implementations of loss functions and their gradients
  - Mean squared error
  - Mean absolute error
  - Binary cross entropy
  - Categorical cross entropy
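For reference, the first and third of these losses can be sketched in pure Python as follows (standard formulas, not the library's own code):

```python
import math

def mse(y_true, y_pred):
    # mean squared error over a batch of scalar targets
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # clip predictions away from 0 and 1 so log() stays finite
    total = 0.0
    for t, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(y_true)
```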
- Several methods for weight initialization
  - 'random uniform', 'random normal'
  - 'Glorot Uniform', 'Glorot Normal'
  - 'He Uniform', 'He Normal'
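The uniform variants draw from a range set by the layer's fan-in and fan-out. A minimal sketch of the standard Glorot and He uniform formulas (illustrative only, not the library's implementation):

```python
import math
import random

def glorot_uniform(fan_in, fan_out):
    # Glorot/Xavier uniform: U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out))
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    return [[random.uniform(-limit, limit) for _ in range(fan_out)]
            for _ in range(fan_in)]

def he_uniform(fan_in, fan_out):
    # He uniform: limit = sqrt(6 / fan_in); well suited to ReLU layers
    limit = math.sqrt(6.0 / fan_in)
    return [[random.uniform(-limit, limit) for _ in range(fan_out)]
            for _ in range(fan_in)]
```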
- Neural network optimization using
  - Gradient Descent (Batch / SGD / Mini-Batch)
  - Momentum
  - Adagrad
  - RMSprop
  - Adam
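To make the optimizer list concrete, here is a sketch of a single Adam update for one scalar weight, using the standard bias-corrected moment estimates (an illustration of the algorithm, not the library's internals):

```python
import math

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # one Adam update; t is the 1-based timestep
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v
```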
- Regularization
  - L1 Norm
  - L2 Norm
  - L1_L2 Norm
  - Dropouts
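Conceptually, L2 regularization adds a term proportional to each weight to its gradient, while dropout zeroes random units during training. A minimal sketch of both ideas (standard formulations; the helper names here are illustrative, not the library's API):

```python
import random

def l2_penalty_grad(weights, lam=0.01):
    # L2 regularization contributes lam * w to each weight's gradient
    return [lam * w for w in weights]

def dropout_mask(n, rate=0.5):
    # inverted dropout: drop units with probability `rate`,
    # scale survivors by 1/(1 - rate) so expected activations are unchanged
    keep = 1.0 - rate
    return [(1.0 / keep) if random.random() < keep else 0.0 for _ in range(n)]
```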
- Batch Normalization
- Early Stopping
- Validation Splits
- Predict Scores
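Of these, batch normalization is the least self-explanatory: it normalizes each unit's activations across the batch, then applies a learnable scale (gamma) and shift (beta). A sketch of the standard forward pass for one unit (illustrative, not the library's code):

```python
import math

def batch_norm(xs, gamma=1.0, beta=0.0, eps=1e-5):
    # normalize a batch of activations to zero mean and unit variance,
    # then scale by gamma and shift by beta
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in xs]
```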
Installation
Install the release (stable) version from PyPI
pip install ak-pynn
Usage/Examples
Import
from ak_pynn.mlp import MLP
Usage
model = MLP()
model.add_layer(4, input_layer=True)
model.add_layer(10, activation_function='relu', batch_norm=True)
model.add_layer(10, activation_function='relu', dropouts=True)
model.add_layer(10, activation_function='relu')
model.add_layer(3, activation_function='softmax', output_layer=True)
model.compile_model(optimizer='Adam', loss_function='mse', metrics=['mse', 'accuracy'])
Output
( MODEL SUMMARY )
===================================================================
Layer Activation Output Shape Params
===================================================================
Input linear (None, 4) 0
-------------------------------------------------------------------
Dense relu (None, 10) 50
-------------------------------------------------------------------
BatchNormalization None (None, 10) 40
-------------------------------------------------------------------
Dense relu (None, 10) 110
-------------------------------------------------------------------
Dropout None (None, 10) 0
-------------------------------------------------------------------
Dense relu (None, 10) 110
-------------------------------------------------------------------
Output softmax (None, 3) 33
-------------------------------------------------------------------
===================================================================
Total Params - 343
Trainable Params - 323
Non-Trainable Params - 20
___________________________________________________________________
Visualizing model
model.visualize()
Training the model
model.fit(X_train, Y_train, epochs=200, batch_size=32, verbose=False,
          early_stopping=False, patience=3, validation_split=0.2)
model.predict_scores(X_test, Y_test, metrics=['accuracy', 'precision', 'macro_recall'])
import matplotlib.pyplot as plt

plt.plot(model.history['Val_Losses'])
plt.plot(model.history['Losses'])
Citation
If you use this library and would like to cite it, you can use:
Ankit kohli, "ak-pynn: Neural Network library", 2023. [Online]. Available: https://github.com/ankit869/ak-pynn. [Accessed: DD-Month-20YY].
or:
@misc{ak_pynn_2023,
  author = {Ankit kohli},
  title  = {ak-pynn: Neural Network library},
  month  = may,
  year   = {2023},
  note   = {Online; accessed <today>},
  url    = {https://github.com/ankit869/ak-pynn},
}