
Out-of-the-box framework for Echo State Network experimentation.


esnpy

esnpy is an out-of-the-box framework for experimenting with ESN and DeepESN.
Models are implemented in pure NumPy/SciPy, so there's no need for a powerful GPU or esoteric dependencies.

Right now, the focus is on batch training, and the feedback loop hasn't been taken into account.
But feel free to open a ticket to discuss anything you need or features you want!

The documentation is coming soon.

Getting Started

Installation

From Pypi

pip install esnpy

From source

pip install git+https://github.com/NiziL/esnpy#egg=esnpy

Use github.com/NiziL/esnpy@<tag or branch>#egg=esnpy to install from a specific branch or tag instead of main.

Code Examples

Don't want to read anything except code? Take a look at the examples/ folder for a quickstart!

  • MackeyGlass/ demonstrates how to learn to predict a chaotic time series
  • TrajectoryClassification/ demonstrates how to learn to classify 2D trajectories

Quickstart

You can create your ESN with esnpy.ESN. The constructor needs an esnpy.ReservoirConfig and an implementation of esnpy.train.Trainer.
Then call the fit method, passing some warm-up and training data along with the related targets.
Once trained, run predictions using transform.

import esnpy

config = createConfig()
trainer = createTrainer()
warmup, data, target = loadData()

esn = esnpy.ESN(config, trainer)
esn.fit(warmup, data, target)
predictions = esn.transform(data)

print(f"error: {compute_err(target, predictions)}")

ReservoirConfig parameters

Parameter      Type                      Info
input_size     int                       Size of the input vectors
size           int                       Number of units in the reservoir
leaky          float                     Leaky parameter of the reservoir
fn             Callable                  Activation function of the reservoir
input_bias     bool                      Enable the use of a bias in the input
input_init     esnpy.init.Initializer    Defines how to initialize the input weights
input_tuners   list[esnpy.tune.Tuner]    Defines how to tune the input weights
intern_init    esnpy.init.Initializer    Defines how to initialize the internal weights
intern_tuners  list[esnpy.tune.Tuner]    Defines how to tune the internal weights
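To make these parameters concrete, here is a standalone NumPy sketch of the standard leaky-ESN state update that input_size, size, leaky, fn and input_bias govern. This is the textbook formulation, not esnpy's actual internals; the random weights stand in for what the initializers and tuners would produce.

```python
import numpy as np

rng = np.random.default_rng(42)

input_size, size, leaky = 1, 100, 0.3
fn = np.tanh

# Random weights standing in for the initializer/tuner output.
# The extra column in w_in corresponds to input_bias=True.
w_in = rng.uniform(-0.5, 0.5, (size, input_size + 1))
w = rng.uniform(-0.5, 0.5, (size, size))

def step(state, u):
    """One leaky-integrated reservoir update (standard ESN formulation)."""
    u_biased = np.concatenate(([1.0], u))  # prepend the bias term
    pre_activation = w_in @ u_biased + w @ state
    return (1 - leaky) * state + leaky * fn(pre_activation)

state = np.zeros(size)
state = step(state, np.array([0.7]))
```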

esnpy.init.Initializer and esnpy.tune.Tuner

esnpy.init.Initializer and esnpy.tune.Tuner are the abstract base classes used to setup the input and internal weights of a reservoir.

Initializer is defined by an init() -> Matrix function. esnpy provides initializer implementations for both uniform and Gaussian weight distributions, and for both dense and sparse matrices.
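As an illustration, here is a standalone sketch of a sparse uniform initializer that mimics the init() -> Matrix contract without importing esnpy (the real base class is esnpy.init.Initializer, and the class name here is made up):

```python
import numpy as np
from scipy import sparse

class SparseUniformInit:
    """Standalone sketch of an Initializer: init() -> Matrix."""

    def __init__(self, low, high, density, size):
        self.low = low
        self.high = high
        self.density = density
        self.size = size

    def init(self):
        rng = np.random.default_rng(0)
        # scipy.sparse.random fills `density` of the entries,
        # drawing their values from data_rvs.
        matrix = sparse.random(
            self.size, self.size,
            density=self.density,
            random_state=0,
            data_rvs=lambda n: rng.uniform(self.low, self.high, n),
        )
        return matrix.tocsr()
```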

Tuner is defined by an init(matrix: Matrix) -> Matrix function, which can be used to modify the weights after initialization. For example, esnpy provides a SpectralRadiusTuner to change the spectral radius of a weight matrix.
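Here is a standalone sketch of the same idea: rescaling a weight matrix to a target spectral radius, mirroring what a tuner like SpectralRadiusTuner does, without depending on esnpy (the class name is made up):

```python
import numpy as np

class SpectralRadiusScaler:
    """Standalone sketch of a Tuner: init(matrix) -> Matrix.
    Rescales the weights so their spectral radius equals `target`."""

    def __init__(self, target):
        self.target = target

    def init(self, matrix):
        # Spectral radius = largest absolute eigenvalue.
        radius = np.max(np.abs(np.linalg.eigvals(matrix)))
        return matrix * (self.target / radius)

rng = np.random.default_rng(1)
w = rng.normal(size=(50, 50))
w_tuned = SpectralRadiusScaler(0.9).init(w)
```

Scaling the whole matrix scales every eigenvalue by the same factor, which is why a single multiplication is enough to set the spectral radius.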

Trainer

esnpy.train.Trainer is responsible for creating the output weight matrix from the training data and targets.

It is defined by a train(data: Matrix, target: Matrix) -> Matrix function. Beware: the data parameter here is not the input data, but the reservoir states recorded during training.

esnpy provides a RidgeTrainer to compute the output weights using ridge regression.
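The ridge regression such a trainer performs boils down to a single linear solve. A standalone NumPy sketch, where the matrix layout (units x steps) and the regularization strength beta are assumptions, not esnpy's actual defaults:

```python
import numpy as np

def ridge_train(states, target, beta=1e-6):
    """Output weights via ridge regression:
    w_out = Y S^T (S S^T + beta I)^{-1},
    with reservoir states S (units x steps) and targets Y (outputs x steps)."""
    n_units = states.shape[0]
    reg = states @ states.T + beta * np.eye(n_units)
    # Solve the symmetric system instead of forming the inverse explicitly.
    return np.linalg.solve(reg, states @ target.T).T

# Tiny check: recover a known linear readout from noise-free states.
rng = np.random.default_rng(2)
states = rng.normal(size=(20, 200))
true_w = rng.normal(size=(1, 20))
target = true_w @ states
w_out = ridge_train(states, target)
```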

Tips & Tricks

  • Sparse matrices are usually much faster than dense ones
  • If you also want to use the input vector to compute the output (as in the original paper), you'll have to use an esnpy.DeepESN with None as the first element of the reservoir config list. This creates a simple identity function as the first layer, allowing a Trainer to access the input data.
  • Use numpy.random.seed(seed) before creating each ESN if you want to compare two identical reservoirs
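The seeding tip can be checked with a minimal sketch, where make_reservoir is a hypothetical stand-in for the weight initialization an ESN performs on creation:

```python
import numpy as np

def make_reservoir(size):
    # Stand-in for the random weight initialization done when an ESN is created.
    return np.random.uniform(-0.5, 0.5, (size, size))

np.random.seed(42)
w1 = make_reservoir(100)

np.random.seed(42)  # re-seed before creating the second ESN
w2 = make_reservoir(100)

# w1 and w2 are identical, so both ESNs start from the same reservoir.
```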

Features & Roadmap

  • "core" features
    • ESN (one reservoir)
    • DeepESN (stacked reservoir)
    • Initializer: random or normal distribution, dense or sparse matrix
    • Tuner: spectral radius setter
    • Trainer: basic ridge regression
  • "nice to have" features
  • "maybe later" features
    • better handling of feedback loop
    • online training (have to find papers about it)

Bibliography

Based on:

  • The "echo state" approach to analysing and training recurrent neural networks by Herbert Jaeger,
  • A practical guide to applying Echo State Networks by Mantas Lukoševičius,
  • Design of deep echo state networks by Claudio Gallicchio et al.,
  • Deep echo state network (DeepESN): A brief survey by Claudio Gallicchio and Alessio Micheli.

Special thanks to Mantas Lukoševičius for his minimal ESN example, which greatly helped me to get started with reservoir computing.
