# egobox

A Rust toolbox of Efficient Global Optimization algorithms inspired by SMT.

`egobox` consists of the following sub-packages:
| Name | Description |
|------|-------------|
| `doe` | sampling methods; contains LHS, FullFactorial, Random methods |
| `gp`  | Gaussian process regression; contains Kriging and PLS dimension reduction |
| `moe` | mixture of experts using GP models |
| `ego` | efficient global optimization with basic constraints and mixed-integer handling |
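As an illustration of what the `doe` sampling methods provide, here is a minimal, self-contained sketch of the Latin Hypercube idea in plain Rust. It does not use the crate's API; the function names and the tiny hand-rolled LCG are illustrative choices, not part of `egobox-doe`:

```rust
// Minimal Latin Hypercube sketch: each dimension is split into n strata,
// exactly one sample falls in each stratum, and the stratum order is
// shuffled independently per dimension. A tiny LCG stands in for a real RNG.
fn next_unit(state: &mut u64) -> f64 {
    *state = state
        .wrapping_mul(6364136223846793005)
        .wrapping_add(1442695040888963407);
    (*state >> 11) as f64 / (1u64 << 53) as f64
}

fn lhs(n: usize, dim: usize, seed: u64) -> Vec<Vec<f64>> {
    let mut state = seed;
    let mut cols: Vec<Vec<f64>> = Vec::with_capacity(dim);
    for _ in 0..dim {
        // Fisher-Yates shuffle of the stratum indices 0..n
        let mut perm: Vec<usize> = (0..n).collect();
        for i in (1..n).rev() {
            let j = (next_unit(&mut state) * (i as f64 + 1.0)) as usize;
            perm.swap(i, j);
        }
        // One uniform draw inside each assigned stratum, scaled to [0, 1)
        cols.push(
            perm.iter()
                .map(|&k| (k as f64 + next_unit(&mut state)) / n as f64)
                .collect(),
        );
    }
    // Transpose columns into n points of dimension `dim`
    (0..n).map(|i| (0..dim).map(|d| cols[d][i]).collect()).collect()
}

fn main() {
    for point in lhs(5, 2, 42) {
        println!("{:?}", point);
    }
}
```

The crate itself works on `ndarray` arrays and handles arbitrary design-space bounds; this sketch only shows the stratification-plus-shuffle core on the unit hypercube.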
## Usage

Depending on the sub-packages you want to use, add the following declarations to your `Cargo.toml`:
```toml
[dependencies]
egobox-doe = { version = "0.3.0" }
egobox-gp = { version = "0.3.0" }
egobox-moe = { version = "0.3.0" }
egobox-ego = { version = "0.3.0" }
```
## Features

### linfa BLAS/Lapack backend features

`egobox` relies on linfa BLAS/Lapack backend features. End-user projects using `gp`, `moe`, and `ego` should select a BLAS/Lapack backend depending on their environment; it can be either:
- OpenBLAS: `linfa/openblas-system` or `linfa/openblas-static`
- Netlib: `linfa/netlib-system` or `linfa/netlib-static`
- Intel MKL: `linfa/intel-mkl-system` or `linfa/intel-mkl-static`
where:

- `*-system` features try to find the corresponding backend in your installation.
- `*-static` features try to download and compile the corresponding backend.
More information is available in the linfa features documentation.
For instance, to use `gp` with the Intel MKL BLAS/Lapack backend, you have to specify the linfa backend feature:
```toml
[dependencies]
egobox-gp = { version = "0.3.0", features = ["linfa/intel-mkl-static"] }
```
Note: only end-user projects (not libraries) should specify a provider in `Cargo.toml`. When developing a library, the backend is specified on the command line, as in the examples below.
### serializable-gp

The `serializable-gp` feature enables the serialization of GP models using the serde crate.
### persistent-moe

The `persistent-moe` feature enables `save()` and `load()` methods for MoE models to/from a JSON file using the serde crate.
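For example, assuming a downstream project wants both of these optional features, they would be enabled in its `Cargo.toml` like any other crate feature:

```toml
[dependencies]
egobox-gp = { version = "0.3.0", features = ["serializable-gp"] }
egobox-moe = { version = "0.3.0", features = ["persistent-moe"] }
```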
## Examples

Examples (in each sub-package's `examples/` folder) are run as follows:
```console
$ cd doe && cargo run --example samplings --release
```
Using the Intel MKL BLAS/Lapack backend, you can run:
```console
$ cd gp && cargo run --example kriging --release --features linfa/intel-mkl-static
$ cd moe && cargo run --example clustering --release --features linfa/intel-mkl-static
$ cd ego && cargo run --example ackley --release --features linfa/intel-mkl-static
```
Thanks to the PyO3 project, which makes Rust well suited for building Python extensions, the EGO algorithm written in Rust (aka `Egor`) is bound in Python. You can install the Python package with:

```console
$ pip install egobox
```

See the tutorial notebook for usage of the optimizer.
## Why egobox?

I started this library as a way to learn Rust and see if it could be used to implement algorithms like those in the SMT toolbox[^1]. As the first components (doe, gp) emerged, it appeared I could translate the Python code almost line by line into Rust (well... after a great deal of fighting the borrow checker!), thanks to the Rust ndarray library ecosystem.
This library also relies on the linfa project, which aims to be the "scikit-learn-like ML library for Rust". Along the way I was able to contribute to linfa by porting the Gaussian mixture model (`linfa-clustering/gmm`) and partial least squares family methods (`linfa-pls`), confirming that translating Python algorithms into Rust can be pretty straightforward.
While I did not benchmark my Rust code precisely against the SMT Python one, from my debugging sessions I noticed I did not get a great speed-up. Indeed, algorithms like `doe` and `gp` rely extensively on linear algebra, and the famous Python libraries numpy/scipy are heavily optimized through calls to compiled C or Fortran code.
My guess at this point is that the interest could come from Rust algorithms built upon these initial building blocks, hence I started to implement the mixture-of-experts algorithm (`moe`) and, on top of it, the surrogate-based optimization EGO algorithm (`ego`), which gives the library its name[^2][^3]. Aside from performance, such a library can also take advantage of the other Rust selling points, namely reliability and productivity.
## Cite

If you happen to find this Rust library useful for your research, you can cite this project as follows:

```bibtex
@Misc{egobox,
  author = {Rémi Lafage},
  title = {Egobox: efficient global optimization toolbox in Rust},
  year = {2020--},
  url = "https://github.com/relf/egobox"
}
```
[^1]: M. A. Bouhlel and J. T. Hwang and N. Bartoli and R. Lafage and J. Morlier and J. R. R. A. Martins. A Python surrogate modeling framework with derivatives. Advances in Engineering Software, 2019.
[^2]: Bartoli, Nathalie, et al. "Adaptive modeling strategy for constrained global optimization with application to aerodynamic wing design." Aerospace Science and Technology 90 (2019): 85-102.
[^3]: Dubreuil, Sylvain, et al. "Towards an efficient global multidisciplinary design optimization algorithm." Structural and Multidisciplinary Optimization 62.4 (2020): 1739-1765.