
A Keras-inspired deep learning framework.


AinShamsFlow

4th CSE Neural Networks Project.



Project Description:

AinShamsFlow (asf) is a Deep Learning Framework built by our team from Ain Shams University during December 2020 - January 2021.

The design and interface are inspired heavily by Keras and TensorFlow.

However, we implement everything from scratch, using only simple libraries such as numpy and matplotlib.
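To illustrate this from-scratch philosophy, here is a minimal standalone sketch of a fully-connected layer's forward pass using nothing but numpy. This is an illustrative example, not AinShamsFlow's actual internals; the function name `dense_forward` is hypothetical.

```python
import numpy as np

def dense_forward(x, W, b, activation=np.tanh):
    # A Dense layer is an affine map (x @ W + b) followed by a nonlinearity.
    return activation(x @ W + b)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 784))            # batch of 4 flattened 28x28 images
W = rng.normal(size=(784, 300)) * 0.01   # small random weights
b = np.zeros(300)
out = dense_forward(x, W, b)
print(out.shape)                         # (4, 300)
```

Stacking such layers, with the output of one feeding the input of the next, is all a `Sequential` model conceptually does.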

Project Structure:

The design of all parts can be seen in the following UML diagram.

(UML Diagram of the Project)

This is how the design structure should work in our framework.

(Design Structure)

Install:

You can install the latest available version as follows:

$ pip install ainshamsflow

Usage:

You start using this project by importing the package as follows:

>>> import ainshamsflow as asf

Then you can start creating a model:

>>> model = asf.models.Sequential([
...     asf.layers.Dense(300, activation="relu"),
...     asf.layers.Dense(100, activation="relu"),
...     asf.layers.Dense( 10, activation="softmax")
... ], input_shape=(784,), name="my_model")
>>> model.print_summary()

Then compile and train the model:

>>> model.compile(
...     optimizer=asf.optimizers.SGD(lr=0.01),
...     loss='sparsecategoricalcrossentropy',
...     metrics=['accuracy']
... )
>>> history = model.fit(X_train, y_train, epochs=100)

Finally, you can evaluate, predict, and show training statistics:

>>> history.show()
>>> model.evaluate(X_valid, y_valid)
>>> y_pred = model.predict(X_test)

A more elaborate example usage can be found in main.py or check out this demo notebook.



Todo:

Framework Design:

  • A Data Module to read and process datasets.

    • Dataset

      • __init__()
      • __bool__()
      • __len__()
      • __iter__()
      • __next__()
      • apply()
      • numpy()
      • batch()
      • cardinality()
      • concatenate()
      • copy()
      • enumerate()
      • filter()
      • map()
      • range()
      • shuffle()
      • skip()
      • split()
      • take()
      • unbatch()
      • add_data()
      • add_targets()
      • normalize()
    • ImageDataGenerator

      • __init__()
      • flow_from_directory()
  • A NN Module to design different architectures.

    • Activation Functions

      • Linear
      • Sigmoid
      • Hard Sigmoid
      • Tanh
      • Hard Tanh
      • ReLU
      • LeakyReLU
      • ELU
      • SELU
      • Softmax
      • Softplus
      • Softsign
      • Swish
    • Layers

      • DNN Layers:

        • Dense
        • BatchNorm
        • Dropout
      • CNN Layers 1D: (optional)

        • Conv
        • Pool (Avg and Max)
        • GlobalPool (Avg and Max)
        • Upsample
      • CNN Layers 2D:

        • Conv
        • Pool (Avg and Max)
        • FastConv
        • FastPool (Avg and Max)
        • GlobalPool (Avg and Max)
        • Upsample
      • CNN Layers 3D: (optional)

        • Conv
        • Pool (Avg and Max)
        • GlobalPool (Avg and Max)
        • Upsample
      • Other Extra Functionality:

        • Flatten
        • Activation
        • Reshape
    • Initializers

      • Constant
      • Uniform
      • Normal
      • Identity
    • Losses

      • MSE (Mean Squared Error)
      • MAE (Mean Absolute Error)
      • MAPE (Mean Absolute Percentage Error)
      • BinaryCrossentropy
      • CategoricalCrossentropy
      • SparseCategoricalCrossentropy
      • HuberLoss
      • LogLossLinearActivation
      • LogLossSigmoidActivation
      • PerceptronCriterionLoss
      • SvmHingeLoss
    • Evaluation Metrics

      • Accuracy
      • TP (True Positives)
      • TN (True Negatives)
      • FP (False Positives)
      • FN (False Negatives)
      • Precision
      • Recall
      • F1Score
    • Regularizers

      • L1
      • L2
      • L1_L2
    • Optimization Modules for training

      • SGD
      • Momentum
      • AdaGrad
      • RMSProp
      • AdaDelta
      • Adam
  • A Visualization Module to track the training and testing processes

    • History class for showing training statistics
    • verbose parameter in training
    • live plotting of training statistics
  • A utils module for reading and saving models

  • Adding CUDA support

  • Publishing to PyPI

  • Creating documentation for the project
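To give a feel for how a few of the checklist items above can be realized with numpy alone, here is a hedged sketch of three of them: the ReLU activation, the MSE loss, and a plain SGD weight update. These are the standard textbook definitions, not AinShamsFlow's internals.

```python
import numpy as np

def relu(z):
    # ReLU activation: element-wise max(0, z).
    return np.maximum(0.0, z)

def mse(y_true, y_pred):
    # Mean Squared Error loss: average of squared residuals.
    return np.mean((y_true - y_pred) ** 2)

def sgd_step(weights, grads, lr=0.01):
    # One plain SGD update: w <- w - lr * dL/dw.
    return weights - lr * grads

print(relu(np.array([-1.0, 2.0])))                                # [0. 2.]
print(mse(np.array([1.0, 2.0]), np.array([1.0, 0.0])))            # 2.0
print(sgd_step(np.array([1.0, -2.0]), np.array([0.5, -0.5]), lr=0.1))  # [ 0.95 -1.95]
```

The other optimizers in the list (Momentum, AdaGrad, RMSProp, AdaDelta, Adam) all refine the same update rule by keeping running statistics of past gradients.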

Example Usage:

This part can be found in the demo notebook mentioned above.

  • Download and split a dataset (MNIST or CIFAR-10) into training, validation and testing sets
  • Construct an Architecture (LeNet or AlexNet) and make sure all of its components are provided in your framework.
  • Train and test the model until a good accuracy is reached (Evaluation Metrics will need to be implemented in the framework also)
  • Save the model into a compressed format
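The first step above (splitting a dataset into training, validation and testing sets) can be sketched in plain numpy. The 80/10/10 split fractions below are arbitrary assumptions for illustration, and `train_val_test_split` is a hypothetical helper, not part of the framework's API.

```python
import numpy as np

def train_val_test_split(X, y, val_frac=0.1, test_frac=0.1, seed=42):
    # Shuffle indices once, then carve out test and validation slices;
    # whatever remains is the training set.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_val = int(len(X) * val_frac)
    n_test = int(len(X) * test_frac)
    test_idx = idx[:n_test]
    val_idx = idx[n_test:n_test + n_val]
    train_idx = idx[n_test + n_val:]
    return (X[train_idx], y[train_idx],
            X[val_idx], y[val_idx],
            X[test_idx], y[test_idx])

X = np.arange(100).reshape(100, 1)
y = np.arange(100)
X_tr, y_tr, X_va, y_va, X_te, y_te = train_val_test_split(X, y)
print(len(X_tr), len(X_va), len(X_te))   # 80 10 10
```

Shuffling once before slicing ensures the three sets are disjoint and drawn from the same distribution.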

Change Log

0.1.0 (29/1/2021)

  • First Release



Download files


Source Distribution

ainshamsflow-0.1.0.tar.gz (279.6 kB)

Uploaded Source

Built Distribution

ainshamsflow-0.1.0-py3-none-any.whl (31.9 kB)

Uploaded Python 3

File details

Details for the file ainshamsflow-0.1.0.tar.gz.

File metadata

  • Download URL: ainshamsflow-0.1.0.tar.gz
  • Size: 279.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.24.0 setuptools/51.1.2 requests-toolbelt/0.9.1 tqdm/4.48.2 CPython/3.8.4rc1

File hashes

Hashes for ainshamsflow-0.1.0.tar.gz

  • SHA256: 826ccbc2c2088d143f7d42dcf6e4efd13f3bbbf6b361e69374b886d852e0c2ef
  • MD5: 55af540c17252f8022f212cc20aae1d4
  • BLAKE2b-256: 3a3d4637d250605662b9a84513f71572e7a4b0f9c9f5f3fd314ee5e304735f10


File details

Details for the file ainshamsflow-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: ainshamsflow-0.1.0-py3-none-any.whl
  • Size: 31.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.24.0 setuptools/51.1.2 requests-toolbelt/0.9.1 tqdm/4.48.2 CPython/3.8.4rc1

File hashes

Hashes for ainshamsflow-0.1.0-py3-none-any.whl

  • SHA256: fb27689c72d680b22ca2fb968b9cc1ca584046723184976a169ca89d2f531271
  • MD5: 7691fd3782905c406c8ec37294f88531
  • BLAKE2b-256: 0e0e9770da797a9dab14ab52c7b5ac68e2bcb22b2ada850b0245728f8d318a05

