Deep Learning framework from scratch

DeepStorm: Deep Learning Framework

Summary:

A deep learning framework written from scratch. Its API blends PyTorch and Keras conventions, and it uses only NumPy for tensor operations.

Pip install:

pip install DeepStorm

Layers & DL classes in the framework:

  • Conv2d
  • MaxPool2d
  • BatchNorm2d
  • Flatten
  • Dropout
  • Linear
  • ReLU
  • Softmax
  • SgdWithMomentum
  • Adam
  • CrossEntropyLoss
  • Xavier
  • He
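Xavier and He above are weight initializers. DeepStorm's own initializer signatures aren't shown on this page, so here is a minimal NumPy sketch of the two standard schemes they implement (the function names and `(fan_in, fan_out)` shape are assumptions for illustration):

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng=None):
    # Xavier/Glorot: variance scaled by the average of fan-in and
    # fan-out; suited to tanh/sigmoid activations.
    if rng is None:
        rng = np.random.default_rng(0)
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def he_init(fan_in, fan_out, rng=None):
    # He: variance scaled by fan-in only; suited to ReLU activations.
    if rng is None:
        rng = np.random.default_rng(0)
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

w = he_init(128, 64)
print(w.shape)  # (128, 64)
```

He initialization is the usual default for the ReLU networks shown below, since it compensates for ReLU zeroing half of each layer's pre-activations.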

Model building:

layers = [
    Conv2d(in_channels=1, out_channels=32,
           kernel_size=3, stride=1, padding='same'),
    BatchNorm2d(32),
    Dropout(probability=0.3),
    ReLU(),

    Conv2d(in_channels=32, out_channels=64,
           kernel_size=3, stride=1, padding='same'),
    BatchNorm2d(64),
    ReLU(),
    MaxPool2d(kernel_size=2, stride=2),

    Conv2d(in_channels=64, out_channels=64,
           kernel_size=3, stride=1, padding='same'),
    BatchNorm2d(64),
    ReLU(),
    MaxPool2d(kernel_size=2, stride=2),

    Flatten(),

    Linear(in_features=64*7*7, out_features=128),
    ReLU(),
    Linear(128, 64),
    ReLU(),
    Linear(64, 10),
    Softmax(),
]

model = Model(layers)

Or, appending layers one at a time:

model = Model()

model.append_layer(Conv2d(in_channels=1, out_channels=32,
                          kernel_size=3, stride=1, padding='same'))
model.append_layer(BatchNorm2d(32))
model.append_layer(ReLU())
model.append_layer(Conv2d(in_channels=32, out_channels=64,
                          kernel_size=3, stride=1, padding='same'))
model.append_layer(BatchNorm2d(64))
model.append_layer(ReLU())
model.append_layer(MaxPool2d(kernel_size=2, stride=2))

model.append_layer(Conv2d(in_channels=64, out_channels=64,
                          kernel_size=3, stride=1, padding='same'))
model.append_layer(BatchNorm2d(64))
model.append_layer(ReLU())
model.append_layer(MaxPool2d(kernel_size=2, stride=2))
model.append_layer(Flatten())
model.append_layer(Linear(in_features=64*7*7, out_features=128))
model.append_layer(ReLU())
model.append_layer(Linear(in_features=128, out_features=64))
model.append_layer(ReLU())
model.append_layer(Linear(in_features=64, out_features=10))
model.append_layer(Softmax())
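The `64*7*7` in the first Linear layer follows from the spatial shapes: assuming a 28×28 input (e.g. MNIST, consistent with `in_channels=1` and 10 output classes), `'same'` padding preserves the spatial size through each conv, and each of the two `MaxPool2d(kernel_size=2, stride=2)` layers halves it. A quick sketch of the arithmetic:

```python
def conv_out(size, kernel, stride, padding):
    # 'same' padding keeps the spatial size at stride 1;
    # otherwise the standard valid-convolution formula applies.
    if padding == 'same':
        return size
    return (size - kernel) // stride + 1

def pool_out(size, kernel, stride):
    return (size - kernel) // stride + 1

h = 28                            # assumed 28x28 input, e.g. MNIST
h = conv_out(h, 3, 1, 'same')     # conv block 1 -> 28
h = conv_out(h, 3, 1, 'same')     # conv block 2 -> 28
h = pool_out(h, 2, 2)             #             -> 14
h = conv_out(h, 3, 1, 'same')     # conv block 3 -> 14
h = pool_out(h, 2, 2)             #             -> 7
print(64 * h * h)                 # 3136 == 64*7*7
```

For a different input resolution, the `in_features` of the first Linear layer must be recomputed the same way.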

Model compile:

batch_size = 16
model.compile(optimizer=Adam(learning_rate=5e-3, mu=0.98, rho=0.999), loss=CrossEntropyLoss(),
              batch_size=batch_size, metrics=['accuracy'])
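Here `mu` and `rho` play the roles usually called beta1 and beta2: exponential decay rates for the running averages of the gradient and its square. Whether DeepStorm's Adam applies bias correction isn't shown on this page; a standard NumPy sketch of one Adam step under that assumption:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=5e-3, mu=0.98, rho=0.999, eps=1e-8):
    # m, v: running first and second moments of the gradient
    m = mu * m + (1 - mu) * grad
    v = rho * v + (1 - rho) * grad ** 2
    m_hat = m / (1 - mu ** t)        # bias correction for step t
    v_hat = v / (1 - rho ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

w = np.array([1.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 4):
    w, m, v = adam_step(w, np.array([2.0]), m, v, t)
print(w)  # ~[0.985]: each step moves w by roughly -lr
```

With a constant gradient, the bias-corrected update has magnitude close to `lr` regardless of the gradient's scale, which is why Adam tolerates a wide range of learning rates.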

Model training:

epochs = 25
history = model.fit(x_train=train_images, y_train=train_labels, x_val=val_images, y_val=val_labels, epochs=epochs)
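`fit` returns a history dict of per-epoch metrics, used for plotting below. The `'accuracy'` metric itself is just an argmax comparison; a minimal NumPy sketch, assuming softmax outputs of shape `(batch, classes)` and one-hot labels:

```python
import numpy as np

def accuracy(probs, labels_one_hot):
    # fraction of samples whose argmax prediction matches the label
    preds = probs.argmax(axis=1)
    targets = labels_one_hot.argmax(axis=1)
    return (preds == targets).mean()

probs = np.array([[0.1, 0.9],
                  [0.8, 0.2],
                  [0.3, 0.7]])
labels = np.array([[0, 1],
                   [1, 0],
                   [1, 0]])
print(accuracy(probs, labels))  # 2 of 3 correct -> 0.666...
```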

Model performance:

import matplotlib.pyplot as plt

plt.plot(history['accuracy'])
plt.plot(history['val_accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'val'], loc='upper left')
plt.show()

Download files

Download the file for your platform.

Source Distribution

DeepStorm-1.1.0.tar.gz (26.5 kB)

Uploaded Source

Built Distribution

DeepStorm-1.1.0-py3-none-any.whl (16.4 kB)

Uploaded Python 3

File details

Details for the file DeepStorm-1.1.0.tar.gz.

File metadata

  • Download URL: DeepStorm-1.1.0.tar.gz
  • Upload date:
  • Size: 26.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for DeepStorm-1.1.0.tar.gz:

  • SHA256: deef0d645c11e89b6cb14e6c3840d87c91e5ca2c229400322a448d1932455243
  • MD5: a3ac23acd1e2515094c006ab756186f9
  • BLAKE2b-256: b8a54bc2f186d7355fc18c9c51c432bd90378f6ee8c29c9cce8eb626e6a3d797

File details

Details for the file DeepStorm-1.1.0-py3-none-any.whl.

File metadata

  • Download URL: DeepStorm-1.1.0-py3-none-any.whl
  • Upload date:
  • Size: 16.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for DeepStorm-1.1.0-py3-none-any.whl:

  • SHA256: 8d57c48775e9cac298632ca1c301e8a047f322beaef1855f93d621e6c64bf228
  • MD5: 94be8371332b217007b66f28394ec327
  • BLAKE2b-256: 0398a614ce9c12d16945c0bbeb02fa1b77a297d2cbb4a5b48f6a4f05d0ef81d7
