CMA-ES
Lightweight Covariance Matrix Adaptation Evolution Strategy (CMA-ES)[1] implementation.
GIF animations of CMA-ES on the Himmelblau, Rosenbrock, and quadratic functions (generated by visualizer.py).
Installation
Python 3.6 or later is supported.
$ pip install cmaes
Usage
This library provides two interfaces: an Optuna sampler interface and a low-level interface. I strongly recommend using this library via Optuna.
Optuna's sampler interface
Optuna [2] is an automatic hyperparameter optimization framework. Optuna officially provides a sampler based on pycma. This library achieves almost the same optimization performance, but is faster and simpler.
```python
import optuna
from cmaes.sampler import CMASampler


def objective(trial: optuna.Trial):
    x1 = trial.suggest_uniform("x1", -4, 4)
    x2 = trial.suggest_uniform("x2", -4, 4)
    return (x1 - 3) ** 2 + (10 * (x2 + 2)) ** 2


def main():
    sampler = CMASampler()
    study = optuna.create_study(sampler=sampler)
    study.optimize(objective, n_trials=250)


if __name__ == "__main__":
    main()
```
Note that CMASampler doesn't support categorical distributions. Although pycma's sampler supports categorical distributions, it is known to have problems, especially with high-cardinality categorical distributions. If your search space contains a categorical distribution, please use TPESampler.
Low-level interface
```python
import numpy as np
from cmaes.cma import CMA


def quadratic(x1: float, x2: float) -> float:
    return (x1 - 3) ** 2 + (10 * (x2 + 2)) ** 2


def main():
    cma_es = CMA(mean=np.zeros(2), sigma=1.3)
    best_value = float("inf")
    best_param = None
    for generation in range(50):
        solutions = []
        for _ in range(cma_es.population_size):
            z, x = cma_es.ask()
            evaluation = quadratic(x[0], x[1])
            if evaluation < best_value:
                best_value = evaluation
                best_param = x
            solutions.append((z, evaluation))
        cma_es.tell(solutions)
        print(f"#{generation}: {best_value} (x1={best_param[0]}, x2={best_param[1]})")
    print(f"RESULT: {best_value} (x1={best_param[0]}, x2={best_param[1]})")


if __name__ == "__main__":
    main()
```
Benchmark results
Benchmark plots on the Rosenbrock function and the Six-Hump Camel function: this implementation (green) is competitive with pycma (blue). See benchmark for details.
Links
Other libraries:
I respect all libraries involved in CMA-ES.
- pycma : Most famous CMA-ES implementation by Nikolaus Hansen.
- cma-es : A TensorFlow v2 implementation.
References:
- [1] N. Hansen, The CMA Evolution Strategy: A Tutorial. arXiv:1604.00772, 2016.
- [2] Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD ’19), August 4–8, 2019.