A suite of generalization-improvement techniques: Stroke, Pruning, and NeuroPlast

Project description

KeraStroke

KeraStroke is a Python package that implements "Post-Back-propagation Weight Operations", or "PBWOs": generalization-improvement techniques for Keras models in the form of custom Keras Callbacks. The techniques function similarly but have different philosophies and produce different results. They are:

  • Stroke: Re-initializing random weight/bias values.
  • Pruning: Reducing model size by setting weight/bias values that are close to 0 to exactly 0.
  • NeuroPlast: Re-initializing any weight/bias values that are 0 or close to 0.
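
To make these operations concrete, here is a minimal NumPy sketch of what each one does to a single layer's weight matrix. This is purely illustrative and not the package's implementation; the thresholds and bounds are arbitrary:

import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.5, size=(4, 4))  # stand-in for one layer's weights

# Stroke: re-initialize a random subset of weights
stroke_mask = rng.random(weights.shape) < 0.10           # ~10% of weights (arbitrary)
weights[stroke_mask] = rng.uniform(-0.05, 0.05, stroke_mask.sum())

# Pruning: set weights that are already close to 0 to exactly 0
weights[np.abs(weights) < 0.01] = 0.0                    # threshold is arbitrary

# NeuroPlast: re-initialize weights that are 0 or close to 0
near_zero = np.abs(weights) < 0.01
weights[near_zero] = rng.uniform(-0.05, 0.05, near_zero.sum())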

Stroke is modeled after seizures, which send random electrical signals throughout the brain, sometimes causing damage to synapses.

NeuroPlast is modeled after the concept of neuroplasticity, in which neurons that no longer have a primary function begin to rewire themselves to support another function. I started working on NeuroPlast after reading the work done by Blakemore and Cooper on horizontal/vertical line receptor neurons in the brains of cats.

If you'd like to see the tests I'm performing with KeraStroke, you can view my testing repository here.

KeraStroke 2.0.0 marks when I really started putting work into the project. I've made an effort to comment more, clean my code up, and make the package easier to understand overall without sacrificing utility.

Limitations

KeraStroke is still in the development phase. Heavy testing has been done on Dense networks, little has been done on CNNs, and none has been done on RNNs. As of 2.1.0, CNNs function properly in KeraStroke; the issue with previous versions was the way the callbacks retrieved weights from the model. The callbacks still perform significantly better on Dense networks, but could find use in CNNs as well. I'm working on this, but will definitely need help. Please see the GitHub page or contact me to contribute to the project.

Stroke

The goal of the Stroke callback is to re-initialize weights/biases that have begun to contribute to overfitting.

Parameters:

  • set_value: re-initialized weights will be set to this value, rather than a random one
  • low_bound: low bound for weight re-initialization
  • high_bound: high bound for weight re-initialization
  • volatility_ratio: percentage of weights to be re-initialized
  • cutoff: number of epochs to perform PBWOs
  • decay: every epoch, volatility_ratio is multiplied by this number; decay can be greater than 1.0, but volatility_ratio will never exceed 1.0
  • do_weights: perform stroke on weights
  • do_biases: perform stroke on biases
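
A minimal sketch of configuring Stroke, assuming the parameters above map directly to keyword arguments of the callback's constructor (the values here are illustrative, not recommendations):

from kerastroke import Stroke

stroke = Stroke(low_bound=-0.05,        # re-initialized values drawn from [-0.05, 0.05]
                high_bound=0.05,
                volatility_ratio=0.05,  # re-initialize ~5% of weights each epoch
                decay=0.9,              # shrink volatility_ratio every epoch
                cutoff=10,              # stop performing the operation after 10 epochs
                do_weights=True,
                do_biases=False)

model.fit(X, y, epochs=32, callbacks=[stroke])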

Pruning

The goal of the Pruning callback is to nullify weights/biases that are effectively 0.

Parameters:

  • set_value: the value that pruned weights will be set to
  • min_value: the lowest value a weight/bias can be to be operated on
  • max_value: the highest value a weight/bias can be to be operated on
  • cutoff: number of epochs to perform PBWOs
  • do_weights: perform pruning on weights
  • do_biases: perform pruning on biases
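
A corresponding sketch for Pruning, again assuming the parameters above are keyword arguments of the constructor and that the class is importable as Pruning (values are illustrative):

from kerastroke import Pruning

pruning = Pruning(set_value=0.0,    # pruned weights become exactly 0
                  min_value=-0.01,  # operate on weights between -0.01 and 0.01
                  max_value=0.01,
                  cutoff=15,
                  do_weights=True,
                  do_biases=True)

model.fit(X, y, epochs=32, callbacks=[pruning])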

NeuroPlast

The goal of the NeuroPlast callback is to randomly re-initialize weights/biases that are effectively 0.

Parameters:

  • set_value: re-initialized weights will be set to this value, rather than a random one
  • min_value: lowest value a weight/bias can be to be operated on
  • max_value: highest value a weight/bias can be to be operated on
  • low_bound: low bound for weight re-initialization
  • high_bound: high bound for weight re-initialization
  • cutoff: number of epochs to perform PBWOs
  • do_weights: perform neuroplast on weights
  • do_biases: perform neuroplast on biases
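
And a sketch for NeuroPlast, under the same assumptions about the constructor and the import name (values are illustrative):

from kerastroke import NeuroPlast

neuroplast = NeuroPlast(min_value=-0.01,  # operate on weights between -0.01 and 0.01
                        max_value=0.01,
                        low_bound=-0.05,  # re-initialized values drawn from [-0.05, 0.05]
                        high_bound=0.05,
                        cutoff=20,
                        do_weights=True,
                        do_biases=True)

model.fit(X, y, epochs=32, callbacks=[neuroplast])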

Usage

KeraStroke Callbacks can be used like any other custom callback. Here's a basic example:

from kerastroke import Stroke
model.fit(X, y, 
          epochs=32, 
          callbacks=[Stroke()])
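
Presumably the callbacks can also be mixed with standard Keras callbacks. A fuller sketch, assuming a simple Dense classifier; the model, data variables, and parameter values below are hypothetical:

from tensorflow import keras
from kerastroke import Stroke

# Hypothetical Dense classifier for illustration
model = keras.Sequential([
    keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(X_train, y_train,
          validation_data=(X_val, y_val),
          epochs=32,
          callbacks=[Stroke(volatility_ratio=0.05),
                     keras.callbacks.EarlyStopping(patience=5)])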

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

kerastroke-2.1.1.tar.gz (120.3 kB)

Uploaded Source

Built Distribution

kerastroke-2.1.1-py3-none-any.whl (7.0 kB)

Uploaded Python 3

File details

Details for the file kerastroke-2.1.1.tar.gz.

File metadata

  • Download URL: kerastroke-2.1.1.tar.gz
  • Upload date:
  • Size: 120.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.11.1 setuptools/45.1.0 requests-toolbelt/0.9.1 tqdm/4.31.0 CPython/3.7.7

File hashes

Hashes for kerastroke-2.1.1.tar.gz

  • SHA256: 3eb25f3d698de7ae85b68a143fd62e323d455e29f595b413128822c33ef8620e
  • MD5: 903163030ae71681980e636780a8feab
  • BLAKE2b-256: 3c4ed74e798142e43eb3ec7129cce6704b7edaaded710e0dea8f1fb79e9d7175

See more details on using hashes here.

File details

Details for the file kerastroke-2.1.1-py3-none-any.whl.

File metadata

  • Download URL: kerastroke-2.1.1-py3-none-any.whl
  • Upload date:
  • Size: 7.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.11.1 setuptools/45.1.0 requests-toolbelt/0.9.1 tqdm/4.31.0 CPython/3.7.7

File hashes

Hashes for kerastroke-2.1.1-py3-none-any.whl

  • SHA256: 301f739af54206a9d6dc87c17b43a682cced8840f607b1c95254971e67e79181
  • MD5: e57da9b63d2f3d7dfc90a0bd5baebbad
  • BLAKE2b-256: 4f4f7672003c12b4a6145aa7b4d0e6559ad4b27c02a6301b6254311a5a5026a4

See more details on using hashes here.
