
galilei


galilei is a software package that makes emulating a function easier. The motivation is that computing a function can be time consuming, so one often needs a fast approximation that is more accurate than basic interpolation techniques. galilei builds on the ideas of cosmopower and axionEmu, aiming to be as generic and flexible as possible about the emulation target. As such, it can take any parametrized function that returns an array, without needing to know its implementation details.

Features

  • Flexible: able to emulate generic numerical functions that take float parameters and return a float array.
  • Easy to use: just add the @emulate decorator and use the emulated function as a drop-in replacement for your existing one.
  • Allows arbitrary transformations of the function output before training through the use of a Preconditioner (see the sketch after this list).
  • Supports multiple backends: torch, sklearn, and GPy (for Gaussian process regression).
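
A Preconditioner transforms the function output into a space that is easier to emulate and inverts that transform on the emulator's predictions. The snippet below is only a minimal sketch of the idea; the class and method names are assumptions for illustration, not galilei's actual interface, so see the documentation for the real Preconditioner API and for how to attach one to @emulate.

import numpy as np

# Hypothetical preconditioner sketch (names are assumptions, not the
# library's API): train in log space, map predictions back to linear space.
# Log transforms suit strictly positive outputs such as power spectra.
class LogPreconditioner:
    def forward(self, y):
        # applied to the function output before training
        return np.log(y)

    def backward(self, y):
        # applied to emulator predictions to recover the original scale
        return np.exp(y)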

Installation

pip install galilei

Basic usage

Suppose we have an expensive function that we want to emulate:

import numpy as np

def test(a=1, b=1):
    # toy "expensive" function: a sum of two sinusoids sampled at 100 points
    x = np.linspace(0, 10, 100)
    return np.sin(a*x) + np.sin(b*x)

If you want to emulate this function, simply add the @emulate decorator and supply the parameter values at which the function should be evaluated to build up the training data set.

from galilei import emulate

@emulate(samples={
    'a': np.random.rand(1000),
    'b': np.random.rand(1000)
})
def test(a=1, b=1):
    x = np.linspace(0, 10, 100)
    return np.sin(a*x) + np.sin(b*x)

Here we are just drawing 1000 pairs of random numbers between 0 and 1 to train on. When these lines are executed, the emulator starts training; once training is done, the original test function is automatically replaced with its emulated version, which behaves the same way, only much faster!

Training emulator...
100%|██████████| 500/500 [00:09<00:00, 50.50it/s, loss=0.023]
Ave Test loss: 0.025

Comparison of the original function and its emulated version (figure).
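
Assuming the emulated test accepts the same keyword arguments as the original (the drop-in replacement behavior described above), you can produce such a comparison yourself; the reference implementation below is re-typed by hand because test itself has been replaced:

import numpy as np
import matplotlib.pyplot as plt

# reference copy of the original function, since `test` is now the emulator
def test_true(a=1, b=1):
    x = np.linspace(0, 10, 100)
    return np.sin(a*x) + np.sin(b*x)

x = np.linspace(0, 10, 100)
plt.plot(x, test_true(a=0.5, b=0.8), label="original")
plt.plot(x, test(a=0.5, b=0.8), "--", label="emulated")
plt.legend()
plt.show()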

You can also easily save your trained model with the save option

@emulate(samples={
    'a': np.random.rand(100),
    'b': np.random.rand(100)
}, backend='sklearn', save="test.pkl")
def test(a=1, b=1):
    x = np.linspace(0, 10, 100)
    return np.sin(a*x) + np.sin(b*x)

and when you use it in production, simply load a pretrained model with

@emulate(backend='sklearn', load="test.pkl")
def test(a=1, b=1):
    x = np.linspace(0, 10, 100)
    return np.sin(a*x) + np.sin(b*x)

and your function will be replaced with a fast emulated version.
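
Once loaded, the function is called exactly as before; assuming the emulator reproduces the original output shape (a length-100 array in this example):

y = test(a=0.3, b=0.7)  # arbitrary parameter values in [0, 1]
print(y.shape)          # expected: (100,), same as the original function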

For more detailed usage examples, see the example notebook (open in Colab).

Roadmap

  • TODO support saving trained models and loading pretrained models
  • TODO add prebuilt preconditioners

Credits

This package was created with the ppw tool. For more information, please visit the project page.

Free software: MIT
