
Learn-as-you-go emulator with error estimation

Project description

A Python implementation of the Learn As You Go algorithm published in http://arxiv.org/abs/arXiv:1506.01079. (TODO: reference ICLR paper.)

Status badges for the GitHub Actions workflows on https://github.com/auckland-cosmo/LearnAsYouGoEmulator: pytest, doc, lints.

The package defines a decorator that can be applied to functions to convert them into functions that learn their outputs as they go and emulate the true function when the expected error is low. Two emulators are included: the k-nearest-neighbors Monte Carlo accelerator described in that paper, and a simple neural network.

The basic usage of the emulator code is something like this:

import numpy as np

from layg import emulate, CholeskyNnEmulator  # assumed import path; adjust to your install


@emulate(CholeskyNnEmulator)
def loglike(x):
    """
    Your complex and expensive function here
    """
    return -np.dot(x, x)

This decorates the function loglike so that it is an instance of the Learner class. It can be used similarly to the original function: just call it as loglike(x). The __call__(x) method hides some extra complexity: it uses the Learn As You Go emulation scheme.

It learns both the output of loglike(x) and the difference between the emulator's prediction and the true value of loglike(x), so that it can predict the error residuals. A cutoff is then placed on the amount of error allowed for any local evaluation of the target function. Any call to the emulator whose predicted error is too large is discarded, and the actual function loglike(x) defined above is evaluated instead.
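As an illustrative usage sketch (the sampling loop and the Gaussian test points are arbitrary choices for this example, not part of the package API):

# Each call either evaluates the true function (adding the result to the
# training set) or, once the predicted error is below the cutoff, returns
# the emulated value instead.
xs = np.random.randn(5000, 2)  # arbitrary stream of evaluation points
for x in xs:
    val = loglike(x)  # dispatches to the true or emulated function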

The logic for generating training sets and for returning a value from either the true function or the emulated function is contained in the Learner class. The Learner class relies on an emulator class to do the emulation.

You can define your own emulator. Define a class that inherits from BaseEmulator and give it two methods, set_emul_func(self, x_train: np.ndarray, y_train: np.ndarray) and set_emul_error_func(self, x_train: np.ndarray, y_train: np.ndarray), which set the functions self.emul_func and self.emul_error_func, respectively. An example of this definition is provided, and a minimal sketch follows below.
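For instance, a toy nearest-neighbor emulator might look like this (the class name and the interpolation logic are purely illustrative, and the import paths are assumed from the package name):

import numpy as np

from layg import BaseEmulator, emulate  # assumed import path


class NearestNeighborEmulator(BaseEmulator):
    """Toy emulator that predicts from the single nearest training point."""

    def set_emul_func(self, x_train: np.ndarray, y_train: np.ndarray) -> None:
        def emul_func(x):
            # Return the training output at the nearest training input.
            idx = np.argmin(np.linalg.norm(x_train - x, axis=1))
            return y_train[idx]

        self.emul_func = emul_func

    def set_emul_error_func(self, x_train: np.ndarray, y_train: np.ndarray) -> None:
        def emul_error_func(x):
            # y_train here holds error residuals; use the nearest one
            # (in absolute value) as the local error estimate.
            idx = np.argmin(np.linalg.norm(x_train - x, axis=1))
            return np.abs(y_train[idx])

        self.emul_error_func = emul_error_func

Such a class can then be used with the decorator in the same way, e.g. @emulate(NearestNeighborEmulator).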

Installation

There are a small number of Python dependencies. If you use Anaconda, you can create an appropriate environment and install the package into it by running

conda env create --file environment.yml
pip install -e .

from this directory.

The pytorch dependency is only needed if you are using the neural network emulator or running the associated tests.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

layg-0.0.1.dev0.tar.gz (11.7 kB)

Uploaded Source

Built Distribution

layg-0.0.1.dev0-py3-none-any.whl (16.6 kB)

Uploaded Python 3

File details

Details for the file layg-0.0.1.dev0.tar.gz.

File metadata

  • Download URL: layg-0.0.1.dev0.tar.gz
  • Upload date:
  • Size: 11.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3.post20200330 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.8.2

File hashes

Hashes for layg-0.0.1.dev0.tar.gz
  • SHA256: 8f46b2307fe0b060693114d06778c515ee7a79b9723b848d294acc4c1295ec05
  • MD5: 745243d99080d7d308072143d226a3bd
  • BLAKE2b-256: 1468dab66201ad41c8f2789fb9b9b3e2b77e79eaf1304478f0786b9ea1b70c5a

File details

Details for the file layg-0.0.1.dev0-py3-none-any.whl.

File metadata

  • Download URL: layg-0.0.1.dev0-py3-none-any.whl
  • Upload date:
  • Size: 16.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3.post20200330 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.8.2

File hashes

Hashes for layg-0.0.1.dev0-py3-none-any.whl
  • SHA256: c1d377c8938c4a94ea5521aa91413e0132c675260a966654ad2b51dbbce3aad4
  • MD5: 5b68f7f47baba4cd6d343e86eb22f1fb
  • BLAKE2b-256: e97197fb7b3bcc31013db18c999e66eebdc877d3cb253599b2eb986a448bb937

