Out-of-the-box framework for Echo State Networks

Project description

esnpy

esnpy is an out-of-the-box framework for experimenting with ESNs and DeepESNs.
Models are implemented in pure NumPy/SciPy, so there is no need for a powerful GPU or any esoteric requirements.

Right now, the focus is on batch training, and feedback loops have not been taken into account.
But feel free to open a ticket to discuss anything you need, features you want, or even ways to help!

Note from the author: esnpy is a small project I started during my master's internship and have recently cleaned up. I might keep working on it for fun, but if you want or need a more robust framework, ReservoirPy might be the one you're searching for ;)

Getting Started

Installation

From PyPI

pip install esnpy

From source

pip install git+https://github.com/NiziL/esnpy#egg=esnpy

Use git+https://github.com/NiziL/esnpy@<tag or branch>#egg=esnpy to install from a specific branch or tag instead of main.

Quickstart

import esnpy

# createBuilder, createTrainer, loadData and compute_err stand in
# for your own helper functions
reservoir_builder = createBuilder()
trainer = createTrainer()
warmup, data, target = loadData()

# create the echo state network
esn = esnpy.ESN(reservoir_builder.build(), trainer)
# train it
esn.fit(warmup, data, target)
# test it
predictions = esn.transform(data)
print(f"error: {compute_err(target, predictions)}")

ESN and DeepESN

You can create your ESN with esnpy.ESN. The constructor needs an esnpy.Reservoir and an implementation of esnpy.train.Trainer.

esnpy.DeepESN doesn't differ much: it expects a list of Reservoir instances and takes an optional mask parameter to specify from which reservoirs the Trainer should learn. mask and the list of reservoirs must have the same length.

Then, simply call the fit method, passing some warm-up and training data along with the related targets.
Once trained, run predictions using transform.
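
For instance, a two-layer DeepESN might be wired up as in the sketch below. It reuses the builder and trainer placeholders from the Quickstart, and the boolean entries of mask are an assumption about how the mask is expressed; only the argument names mentioned above come from this page.

import esnpy

# Sketch only: builder1, builder2, trainer, warmup, data and target stand in
# for your own objects (see Quickstart); boolean mask entries are assumed.
deep_esn = esnpy.DeepESN(
    [builder1.build(), builder2.build()],  # one Reservoir per layer
    trainer,
    mask=[False, True],  # learn the readout from the second reservoir only
)
deep_esn.fit(warmup, data, target)
predictions = deep_esn.transform(data)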

Reservoir and ReservoirBuilder

A Reservoir can easily be initialized using the ReservoirBuilder dataclass.
For convenience, the configuration class is also a builder, exposing a build() method. This method takes an optional seed parameter to make the initialization deterministic, which eases the comparison of two identically configured reservoirs.

Parameter     | Type                   | Description                                 | Default
input_size    | int                    | Size of the input vectors                   |
size          | int                    | Number of units in the reservoir            |
leaky         | float                  | Leaky parameter of the reservoir            |
fn            | Callable               | Activation function of the reservoir        | np.tanh
input_bias    | bool                   | Enable the usage of a bias in the input     | True
input_init    | esnpy.init.Initializer | Define how to initialize the input weights  |
input_tuners  | list[esnpy.tune.Tuner] | Define how to tune the input weights        | []
intern_init   | esnpy.init.Initializer | Define how to initialize the internal weights |
intern_tuners | list[esnpy.tune.Tuner] | Define how to tune the internal weights     | []
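
A minimal configuration sketch is shown below. The initializer class names and their arguments are assumptions (the page only states that uniform/Gaussian and dense/sparse initializers exist), as is importing ReservoirBuilder from the top-level package; the parameter names, SpectralRadiusTuner and the seed argument come from this page.

import numpy as np
import esnpy
from esnpy import init, tune

# Sketch only: UniformDenseInit / UniformSparseInit (and their arguments)
# are assumed names for the uniform dense/sparse initializers.
builder = esnpy.ReservoirBuilder(
    input_size=1,
    size=300,
    leaky=0.3,
    fn=np.tanh,
    input_bias=True,
    input_init=init.UniformDenseInit(-0.5, 0.5),
    intern_init=init.UniformSparseInit(-0.5, 0.5),
    intern_tuners=[tune.SpectralRadiusTuner(1.25)],
)
reservoir = builder.build(seed=42)  # fixed seed for deterministic initialization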

Initializer and Tuner

esnpy.init.Initializer and esnpy.tune.Tuner are the abstract base classes used to set up the input and internal weights of a reservoir.

Initializer is defined by an init() -> Matrix method. esnpy provides initializer implementations for both uniform and Gaussian weight distributions, and for both dense and sparse matrices.

Tuner is defined by an init(matrix: Matrix) -> Matrix method, which can be used to modify the weights after initialization. For example, esnpy provides a SpectralRadiusTuner to change the spectral radius of a weight matrix.
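
As an illustration of the Tuner interface, the sketch below rescales every weight by a constant factor after initialization. ScaleTuner is a hypothetical class, not part of esnpy; it only assumes that Tuner can be subclassed by overriding init().

from esnpy.tune import Tuner

# Hypothetical tuner: multiply every weight by a constant factor.
# Scalar multiplication works for dense and sparse matrices alike.
class ScaleTuner(Tuner):
    def __init__(self, factor: float):
        self._factor = factor

    def init(self, matrix):
        return matrix * self._factor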

Trainer

esnpy.train.Trainer is responsible for creating the output weight matrix from the training data and targets.
It is defined by a train(inputs: Matrix, data: Matrix, target: Matrix) -> Matrix method.

esnpy provides a RidgeTrainer to compute the output weights using ridge regression. This trainer takes three parameters: a float alpha (the weight of the regularization term) and two optional booleans, use_bias and use_input (both defaulting to True), which control whether a bias and the raw input are used to compute the readout weights.
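
For example, a ridge trainer could be configured as below; the alpha value and passing it positionally are illustrative assumptions, while use_bias and use_input are the flags described above.

from esnpy.train import RidgeTrainer

# Ridge regression readout; the regularization weight is illustrative.
trainer = RidgeTrainer(1e-6)
# Same, but ignore the raw input when computing the readout weights.
states_only_trainer = RidgeTrainer(1e-6, use_input=False)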

Code Examples

Want to see some code in action? Take a look at the examples/ directory:

  • MackeyGlass/ demonstrates how to learn to predict a time series,
  • TrajectoryClassification/ demonstrates how to learn to classify 2D trajectories.

Bibliography

Based on:

  • The "echo state" approach to analysing and training recurrent neural networks by Herbert Jaeger (pdf),
  • A pratical guide to applying Echo State Networks by Mantas Lukoševičius (pdf),
  • Design of deep echo state networks by Claudio Gallicchio and al (link),
  • Deep echo state network (DeepESN): A brief survey by Claudio Gallicchio and Alessio Micheli (pdf).

Special thanks to Mantas Lukoševičius for his minimal ESN example, which greatly helped me to get started with reservoir computing.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

esnpy-0.3.0.tar.gz (10.4 kB)

Built Distribution

esnpy-0.3.0-py3-none-any.whl (9.3 kB)

File details

Details for the file esnpy-0.3.0.tar.gz.

File metadata

  • Download URL: esnpy-0.3.0.tar.gz
  • Upload date:
  • Size: 10.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.1

File hashes

Hashes for esnpy-0.3.0.tar.gz

  • SHA256: a21005564e1f9cfcdaf32738d22fa508bc8384c9935354d1d46f457be745adc0
  • MD5: 806fc51010bc6a1d851ddc00d698fd63
  • BLAKE2b-256: 14cab06459358a2302c3235fd6af217c9efeba9d186cbc2b4a9505668f7c3704


File details

Details for the file esnpy-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: esnpy-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 9.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.1

File hashes

Hashes for esnpy-0.3.0-py3-none-any.whl

  • SHA256: 6a68027e67c413fc9372aa1e86ae4b3f2812e616c9dd7eb5deb8128b7e3f57df
  • MD5: df3e5b9619488e7e5a063137cab90365
  • BLAKE2b-256: 02b9f4e65f48f7ae682874c28a66c41b6083aa821c6383e665ccb36fa90b15d1

