Project description

optimas logo

Optimization at scale, powered by libEnsemble

Explore the docs »

View Examples · Support · API Reference

Optimas is a Python library designed for highly scalable optimization, from laptops to massively parallel supercomputers.

Key Features

  • Scalability: Leveraging the power of libEnsemble, Optimas is designed to scale seamlessly from your laptop to high-performance computing clusters.
  • User-Friendly: Optimas simplifies the process of running large parallel parameter scans and optimizations. Specify the number of parallel evaluations and the computing resources to allocate to each of them, and Optimas handles the rest (see the sketch after this list).
  • Advanced Optimization: Optimas integrates algorithms from the Ax library, offering both single- and multi-objective Bayesian optimization. This includes advanced techniques such as multi-fidelity and multi-task algorithms.
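
As a quick illustration, here is a minimal sketch of an optimization script, following the Exploration / generator / evaluator interfaces described in the Optimas documentation. The toy objective function, parameter bounds, and worker count below are illustrative only, not part of this description:

from optimas.core import VaryingParameter, Objective
from optimas.evaluators import FunctionEvaluator
from optimas.explorations import Exploration
from optimas.generators import AxSingleFidelityGenerator


def eval_func(input_params, output_params):
    """Toy objective: a simple quadratic to be minimized."""
    x0 = input_params["x0"]
    x1 = input_params["x1"]
    output_params["f"] = (x0 - 2.0) ** 2 + (x1 + 1.0) ** 2


# Two parameters to vary within the given bounds, and one objective to minimize.
var_1 = VaryingParameter("x0", -5.0, 5.0)
var_2 = VaryingParameter("x1", -5.0, 5.0)
obj = Objective("f", minimize=True)

# Bayesian optimization generator (Ax), a simple function evaluator,
# and an exploration that runs 4 evaluations in parallel.
gen = AxSingleFidelityGenerator(varying_parameters=[var_1, var_2], objectives=[obj])
ev = FunctionEvaluator(function=eval_func)
exp = Exploration(generator=gen, evaluator=ev, max_evals=50, sim_workers=4)

if __name__ == "__main__":
    exp.run()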

Installation

You can install Optimas from PyPI (recommended):

pip install optimas

from conda-forge:

conda install optimas --channel conda-forge

or directly from GitHub:

pip install git+https://github.com/optimas-org/optimas.git

Make sure mpi4py is available in your environment before installing Optimas. For more details, check out the full installation guide. We have also prepared dedicated installation instructions for some HPC systems, such as JUWELS (JSC), Maxwell (DESY), and Perlmutter (NERSC).
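
If mpi4py is not yet available, it can usually be obtained from PyPI or conda-forge (on HPC systems you will typically want a build linked against the system MPI; see the installation guide for details), for example:

python -m pip install mpi4py

or

conda install mpi4py --channel conda-forge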

Documentation

For more information on how to use Optimas, check out the documentation. You'll find installation instructions, a user guide, examples and the API reference.

Support

Need more help? Join our Slack channel or open a new issue.

Citing optimas

If your usage of Optimas leads to a scientific publication, please consider citing the original paper:

@article{PhysRevAccelBeams.26.084601,
    title     = {Bayesian optimization of laser-plasma accelerators assisted by reduced physical models},
    author    = {Ferran Pousa, A. and Jalas, S. and Kirchen, M. and Martinez de la Ossa, A. and Th\'evenet, M. and Hudson, S. and Larson, J. and Huebl, A. and Vay, J.-L. and Lehe, R.},
    journal   = {Phys. Rev. Accel. Beams},
    volume    = {26},
    issue     = {8},
    pages     = {084601},
    numpages  = {9},
    year      = {2023},
    month     = {Aug},
    publisher = {American Physical Society},
    doi       = {10.1103/PhysRevAccelBeams.26.084601},
    url       = {https://link.aps.org/doi/10.1103/PhysRevAccelBeams.26.084601}
}

and libEnsemble:

@article{Hudson2022,
    title   = {{libEnsemble}: A Library to Coordinate the Concurrent
                Evaluation of Dynamic Ensembles of Calculations},
    author  = {Stephen Hudson and Jeffrey Larson and John-Luke Navarro and Stefan M. Wild},
    journal = {{IEEE} Transactions on Parallel and Distributed Systems},
    volume  = {33},
    number  = {4},
    pages   = {977--988},
    year    = {2022},
    doi     = {10.1109/tpds.2021.3082815}
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

optimas-0.5.0.tar.gz (67.5 kB)

Built Distribution

optimas-0.5.0-py3-none-any.whl (74.3 kB)
