A Keras-inspired deep learning framework.
AinShamsFlow
4th CSE Neural Networks Project.
Project Description:
AinShamsFlow (asf) is a deep learning framework built by our team at Ain Shams University during December 2020 - January 2021.
The design and interface are heavily inspired by Keras and TensorFlow; however, everything is implemented from scratch using only simple libraries such as numpy and matplotlib.
Project Structure:
The design of all parts can be seen in the UML diagram below, which shows how the components of the framework are structured and how they interact.
Install:
You can install the latest available version as follows:
$ pip install ainshamsflow
Usage:
You can start using this project by importing the package as follows:
>>> import ainshamsflow as asf
Then you can create a model:
>>> model = asf.models.Sequential([
... asf.layers.Dense(300, activation="relu"),
... asf.layers.Dense(100, activation="relu"),
... asf.layers.Dense( 10, activation="softmax")
... ], input_shape=(784,), name="my_model")
>>> model.print_summary()
Then compile and train the model:
>>> model.compile(
... optimizer=asf.optimizers.SGD(lr=0.01),
... loss='sparsecategoricalcrossentropy',
... metrics=['accuracy']
... )
>>> history = model.fit(X_train, y_train, epochs=100)
Finally, you can evaluate, predict, and show training statistics:
>>> history.show()
>>> model.evaluate(X_valid, y_valid)
>>> y_pred = model.predict(X_test)
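For context, the model above expects plain numpy arrays: X_train with 784 features per example (for instance, flattened 28x28 grayscale images) and y_train as integer class labels, which is the usual expectation for a sparse categorical crossentropy loss. Below is a minimal sketch with randomly generated stand-in data; only the array shapes are taken from the example above, and the random values are purely illustrative:
>>> import numpy as np
>>> rng = np.random.default_rng(seed=0)
>>> # 1000 stand-in examples with 784 features each, scaled to [0, 1)
>>> X_train = rng.random((1000, 784))
>>> # integer class labels in [0, 10), one per example
>>> y_train = rng.integers(0, 10, size=(1000,))
>>> X_train.shape, y_train.shape
((1000, 784), (1000,))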
A more elaborate usage example can be found in main.py, or check out this demo notebook.
Team Members:
- Pierre Nabil
- Ahmed Taha
- Girgis Micheal
- Hazzem Mohammed
- Ibrahim Shoukry
- John Bahaa
- Michael Magdy
- Ziad Tarek
Todo:
Framework Design:
- A Data Module to read and process datasets.
  - Dataset
    - __init__()
    - __bool__()
    - __len__()
    - __iter__()
    - __next__()
    - apply()
    - numpy()
    - batch()
    - cardinality()
    - concatenate()
    - copy()
    - enumerate()
    - filter()
    - map()
    - range()
    - shuffle()
    - skip()
    - split()
    - take()
    - unbatch()
    - add_data()
    - add_targets()
    - normalize()
  - ImageDataGenerator
    - __init__()
    - flow_from_directory()
- An NN Module to design different architectures (see the sketch after this list).
  - Activation Functions
    - Linear
    - Sigmoid
    - Hard Sigmoid
    - Tanh
    - Hard Tanh
    - ReLU
    - LeakyReLU
    - ELU
    - SELU
    - Softmax
    - Softplus
    - Softsign
    - Swish
  - Layers
    - DNN Layers:
      - Dense
      - BatchNorm
      - Dropout
    - CNN Layers 1D: (optional)
      - Conv
      - Pool (Avg and Max)
      - GlobalPool (Avg and Max)
      - Upsample
    - CNN Layers 2D:
      - Conv
      - Pool (Avg and Max)
      - FastConv
      - FastPool (Avg and Max)
      - GlobalPool (Avg and Max)
      - Upsample
    - CNN Layers 3D: (optional)
      - Conv
      - Pool (Avg and Max)
      - GlobalPool (Avg and Max)
      - Upsample
    - Other Extra Functionality:
      - Flatten
      - Activation
      - Reshape
  - Initializers
    - Constant
    - Uniform
    - Normal
    - Identity
  - Losses
    - MSE (Mean Squared Error)
    - MAE (Mean Absolute Error)
    - MAPE (Mean Absolute Percentage Error)
    - BinaryCrossentropy
    - CategoricalCrossentropy
    - SparseCategoricalCrossentropy
    - HuberLoss
    - LogLossLinearActivation
    - LogLossSigmoidActivation
    - PerceptronCriterionLoss
    - SvmHingeLoss
  - Evaluation Metrics
    - Accuracy
    - TP (True Positives)
    - TN (True Negatives)
    - FP (False Positives)
    - FN (False Negatives)
    - Precision
    - Recall
    - F1Score
  - Regularizers
    - L1
    - L2
    - L1_L2
  - Optimization Modules for training
    - SGD
    - Momentum
    - AdaGrad
    - RMSProp
    - AdaDelta
    - Adam
- A Visualization Module to track the training and testing processes
  - History Class for showing training statistics
  - verbose parameter in training - live plotting of training statistics
- A utils module for reading and saving models
- Adding CUDA support
- Publish to PyPI
- Creating a Documentation for the Project
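To tie the NN Module items above back to the interface shown in the Usage section, here is a hedged sketch that combines a few of the listed components (Dropout, BatchNorm, and the Adam optimizer). Only asf.models.Sequential, asf.layers.Dense, asf.optimizers.SGD, compile(), and fit() appear in the examples above; the class paths asf.layers.Dropout, asf.layers.BatchNorm, and asf.optimizers.Adam, as well as their constructor arguments, are assumptions inferred from the list, so check the source or the demo notebook for the exact names.
>>> import ainshamsflow as asf
>>> # Assumed class paths below mirror the confirmed asf.layers.Dense and
>>> # asf.optimizers.SGD; the Dropout rate and Adam learning rate are illustrative.
>>> model = asf.models.Sequential([
...     asf.layers.Dense(300, activation="relu"),
...     asf.layers.BatchNorm(),
...     asf.layers.Dropout(0.2),
...     asf.layers.Dense(100, activation="relu"),
...     asf.layers.Dense( 10, activation="softmax")
... ], input_shape=(784,), name="regularized_model")
>>> model.compile(
...     optimizer=asf.optimizers.Adam(lr=0.001),
...     loss='sparsecategoricalcrossentropy',
...     metrics=['accuracy']
... )
>>> history = model.fit(X_train, y_train, epochs=10)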
Example Usage:
This part can be found in the demo notebook mentioned above.
- Download and split a dataset (MNIST or CIFAR-10) into training, validation, and testing sets (see the sketch after this list)
- Construct an Architecture (LeNet or AlexNet) and make sure all of its components are provided in your framework.
- Train and test the model until a good accuracy is reached (Evaluation Metrics will need to be implemented in the framework also)
- Save the model into a compressed format
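The splitting step in the first item can be done with plain numpy before the data reaches the framework; the Dataset.split() method listed above could presumably serve the same purpose. A minimal sketch, using small random stand-in arrays in place of the actual downloaded dataset:
>>> import numpy as np
>>> # Stand-in arrays; in practice X and y would hold the real dataset
>>> # (MNIST itself has 70,000 examples of 784 features each).
>>> X = np.random.rand(7000, 784)
>>> y = np.random.randint(0, 10, size=(7000,))
>>> # Shuffle, then split into 80% training, 10% validation, 10% testing.
>>> idx = np.random.permutation(len(X))
>>> X, y = X[idx], y[idx]
>>> n_train, n_valid = int(0.8 * len(X)), int(0.1 * len(X))
>>> X_train, y_train = X[:n_train], y[:n_train]
>>> X_valid, y_valid = X[n_train:n_train + n_valid], y[n_train:n_train + n_valid]
>>> X_test, y_test = X[n_train + n_valid:], y[n_train + n_valid:]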
Change Log
0.1.0 (29/1/2021)
- First Release