
A deep learning framework for scientific and educational purposes

Project description

THUNET: A simple deep learning framework for scientific and educational purposes.

  1. Neural networks[1]

    • Layers / Layer-wise ops

      • Add

      • Flatten

      • Multiply

      • Softmax

      • Fully-connected/Dense

      • Sparse evolutionary connections

      • LSTM

      • Elman-style RNN

      • Max + average pooling

      • Dot-product attention

      • Embedding layer

      • Restricted Boltzmann machine (w. CD-n training)

      • 2D deconvolution (w. padding and stride)

      • 2D convolution (w. padding, dilation, and stride)

      • 1D convolution (w. padding, dilation, stride, and causality)
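To give a feel for the layer-wise ops listed above, the softmax forward pass can be sketched in plain NumPy. This is an illustrative sketch, not thunet's actual API; the function name is ours:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the per-row max before exponentiating so large logits
    # do not overflow; the result is mathematically unchanged.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(shifted)
    return e / np.sum(e, axis=axis, keepdims=True)
```

The max-subtraction trick is standard in practice: without it, logits above ~700 overflow `np.exp` in double precision.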

    • Modules

      • Bidirectional LSTM

      • ResNet-style residual blocks (identity and convolution)

      • WaveNet-style residual blocks with dilated causal convolutions

      • Transformer-style multi-headed scaled dot product attention
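The scaled dot-product attention at the core of the Transformer-style module can be sketched for a single head without masking (a generic NumPy sketch under our own naming, not the library's interface):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)             # row-wise softmax
    return w @ V, w
```

Multi-headed attention repeats this with separate learned projections of Q, K, and V per head, then concatenates the head outputs.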

    • Regularizers

      • Dropout
    • Normalization

      • Batch normalization (spatial and temporal)

      • Layer normalization (spatial and temporal)
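For reference, the training-mode forward pass of batch normalization amounts to the following NumPy sketch (again our own illustrative names, not thunet's API; the running-average inference path is omitted):

```python
import numpy as np

def batchnorm_forward(X, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch dimension, then apply the
    # learned scale (gamma) and shift (beta).
    mu = X.mean(axis=0)
    var = X.var(axis=0)
    X_hat = (X - mu) / np.sqrt(var + eps)
    return gamma * X_hat + beta
```

Layer normalization follows the same formula but normalizes over the feature axis of each sample instead of over the batch.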

    • Optimizers

      • SGD w/ momentum

      • AdaGrad

      • RMSProp

      • Adam
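The simplest of these, SGD with momentum, can be written as a one-step update rule (a minimal sketch with hypothetical names, not the framework's optimizer class):

```python
def sgd_momentum_step(param, grad, velocity, lr=0.1, momentum=0.9):
    # Classic momentum: accumulate an exponentially decaying velocity,
    # then step the parameter along it.
    velocity = momentum * velocity - lr * grad
    return param + velocity, velocity
```

Running this on f(x) = x² (gradient 2x) drives x toward the minimum at 0, with the characteristic momentum oscillation before settling.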

    • Learning Rate Schedulers

      • Constant

      • Exponential

      • Noam/Transformer

      • Dlib scheduler
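The Noam/Transformer schedule has a closed form worth spelling out: linear warmup for a fixed number of steps, then decay proportional to 1/sqrt(step). A sketch (parameter names are ours):

```python
def noam_lr(step, d_model=512, warmup=4000, scale=1.0):
    # lr = scale * d_model^{-0.5} * min(step^{-0.5}, step * warmup^{-1.5})
    # The two branches meet exactly at step == warmup.
    return scale * d_model ** -0.5 * min(step ** -0.5, step * warmup ** -1.5)
```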

    • Weight Initializers

      • Glorot/Xavier uniform and normal

      • He/Kaiming uniform and normal

      • Standard and truncated normal
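The Glorot and He schemes differ only in the variance they target. A NumPy sketch of both (illustrative, not thunet's initializer API):

```python
import numpy as np

def glorot_uniform(fan_in, fan_out, seed=None):
    # Glorot/Xavier: uniform on [-limit, limit] with
    # limit = sqrt(6 / (fan_in + fan_out)).
    rng = np.random.default_rng(seed)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_normal(fan_in, fan_out, seed=None):
    # He/Kaiming: zero-mean normal with std = sqrt(2 / fan_in),
    # designed for ReLU-family activations.
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))
```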

    • Losses

      • Cross entropy

      • Squared error

      • Bernoulli VAE loss

      • Wasserstein loss with gradient penalty

      • Noise contrastive estimation loss
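Cross entropy, the most commonly used of these losses, reduces to a short NumPy expression for one-hot targets (a generic sketch, not the library's loss class):

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true: one-hot targets (N, C); y_pred: predicted probabilities (N, C).
    # Clip to avoid log(0), then average the negative log-likelihood.
    p = np.clip(y_pred, eps, 1.0)
    return -np.sum(y_true * np.log(p)) / y_true.shape[0]
```

A uniform prediction over two classes costs exactly log 2 ≈ 0.693 nats; a perfect prediction costs 0.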

    • Activations

      • ReLU

      • Tanh

      • Affine

      • Sigmoid

      • Leaky ReLU

      • ELU

      • SELU

      • Exponential

      • Hard Sigmoid

      • Softplus
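Most of these activations are one-liners in NumPy; a few are shown below for reference (illustrative sketches with our own names, including a numerically stable softplus):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.3):
    return np.where(x >= 0, x, alpha * x)

def elu(x, alpha=1.0):
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def softplus(x):
    # Stable form of log(1 + e^x): max(x, 0) + log1p(e^{-|x|})
    # avoids overflow for large positive x.
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))
```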

    • Models

      • Bernoulli variational autoencoder

      • Wasserstein GAN with gradient penalty

      • word2vec encoder with skip-gram and CBOW architectures
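To illustrate the skip-gram side of word2vec: training pairs are formed by matching each center word with every context word inside a fixed window. A minimal sketch (the function name is hypothetical, not part of the library):

```python
def skipgram_pairs(tokens, window=2):
    # For each position, pair the center word with every context word
    # within `window` positions on either side.
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs
```

CBOW inverts the direction: the context words jointly predict the center word instead.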

    • Utilities

      • col2im (MATLAB port)

      • im2col (MATLAB port)

      • conv1D

      • conv2D

      • deconv2D

      • minibatch
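The im2col/conv2D pair deserves a note: im2col unrolls image patches into columns so that convolution becomes a single matrix product. A simplified single-channel sketch with stride 1 and no padding or dilation (our own names, and, as in most deep learning frameworks, it computes cross-correlation rather than flipped-kernel convolution):

```python
import numpy as np

def im2col(X, fh, fw):
    # X: (H, W) single-channel image. Extract every fh x fw patch
    # (stride 1, no padding) and stack the patches as columns.
    H, W = X.shape
    cols = [X[i:i + fh, j:j + fw].ravel()
            for i in range(H - fh + 1)
            for j in range(W - fw + 1)]
    return np.array(cols).T          # shape: (fh*fw, n_patches)

def conv2d(X, K):
    # Convolution as one matrix product over the im2col buffer.
    fh, fw = K.shape
    out = K.ravel() @ im2col(X, fh, fw)
    return out.reshape(X.shape[0] - fh + 1, X.shape[1] - fw + 1)
```

Trading memory for speed this way is why im2col-based convolution dominates CPU NumPy implementations: the inner loop becomes a BLAS matrix multiply.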

  2. BERT

    • Vanilla BERT

    • Simple BERT

  3. REFERENCE

Our contribution is the implementation of the vanilla BERT and simple BERT.

All other code follows the license claimed by [ddbourgin](https://github.com/ddbourgin) in his [numpy-ml](https://github.com/ddbourgin/numpy-ml) project.

  1. Frequently Asked Questions

  • Q: Python 2.7: LookupError: unknown encoding: cp0

  • A: Set the encoding environment variable in the shell: set PYTHONIOENCODING=UTF-8

  1. Product Release

Supported Python versions:

| Python |
|--------|
| 2.7    |
| 3.5    |
| 3.6    |
| 3.7    |
| 3.8    |
| 3.9    |
| 3.10   |

[1] David Bourgin. Machine learning, in numpy. https://github.com/ddbourgin/numpy-ml.

Download files


Source Distribution

thunet-0.0.10.202209.tar.gz (109.2 kB)

Details for the file thunet-0.0.10.202209.tar.gz:

  • Size: 109.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.9.7

File hashes:

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 14b3f5f80c924580c97d72cd69be7604c7a2d29cf9c8c01a676953000b4a7d2a |
| MD5 | 3444eec4588ad65cfe4b964530035477 |
| BLAKE2b-256 | bae5dc2eed215a60bd84db1a3edf1e94f01675e6e83aa6b3d002a931b8a12ef9 |
