
🏄‍♂️ FlowBoost — Multi-objective Bayesian optimization for OpenFOAM


FlowBoost is a highly configurable and extensible library for handling and optimizing OpenFOAM CFD simulations. It provides ready-made bindings for state-of-the-art Bayesian optimization using Meta's Ax, powered by PyTorch, as well as simple interfaces for plugging in any other optimization library.

Features

  • Easy API syntax (see examples/)
  • Ready bindings for Meta's Ax (Adaptive Experimentation Platform)
    • Multi-objective, high-dimensional Bayesian optimization
    • SAASBO, GPU acceleration
  • Fully hands-off cluster-native job management
  • Simple interfaces for OpenFOAM cases (flowboost.Case)
  • Use any optimization backend by implementing a few interfaces

Examples

The examples/ directory contains code examples for simplified real-world scenarios:

  1. aerofoilNACA0012Steady: parameter optimization for a NACA 0012 aerofoil steady-state simulation
  2. pitzDaily: backward-facing step optimization using local Docker execution and the Pandas data backend

By default, FlowBoost uses Ax's Service API as its optimization backend. In practice, any optimizer can be used, as long as it conforms to the abstract flowboost.optimizer.Backend base class, which the backend interfaces in flowboost.optimizer.interfaces implement.
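The exact methods required by flowboost.optimizer.Backend are not documented here, so the following is only a hypothetical sketch of the pattern: a stand-in abstract base class with suggest/observe methods, plus a trivial random-search backend implementing it. The names `Backend`, `suggest`, and `observe` below are illustrative assumptions, not FlowBoost's actual API.

```python
# Hypothetical sketch: FlowBoost's real flowboost.optimizer.Backend may
# define different method names and signatures; this only illustrates
# the general shape of a pluggable optimizer backend.
import random
from abc import ABC, abstractmethod


class Backend(ABC):
    """Stand-in for an optimizer backend interface (illustrative)."""

    @abstractmethod
    def suggest(self) -> dict:
        """Propose the next set of parameter values to evaluate."""

    @abstractmethod
    def observe(self, parameters: dict, objectives: dict) -> None:
        """Report evaluated objective values back to the optimizer."""


class RandomSearchBackend(Backend):
    """Minimal example backend: uniform random search over [0, 1] bounds."""

    def __init__(self, parameter_names: list[str]):
        self._random = random.Random(0)  # seeded for reproducibility
        self._names = parameter_names
        self.history: list[tuple[dict, dict]] = []

    def suggest(self) -> dict:
        return {name: self._random.random() for name in self._names}

    def observe(self, parameters: dict, objectives: dict) -> None:
        self.history.append((parameters, objectives))
```

A real backend would replace the random sampling with calls into its optimization library while keeping the same suggest/observe loop.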

OpenFOAM case abstraction

Working with OpenFOAM cases goes through the flowboost.Case abstraction, which provides a high-level API for accessing OpenFOAM case data and configuration. The Case abstraction can also be used on its own, outside of optimization workflows:

from flowboost import Case

# Clone tutorial to current working directory (or a specified dir)
tutorial_case = Case.from_tutorial("fluid/aerofoilNACA0012Steady")

# Dictionary read/write access
control_dict = tutorial_case.dictionary("system/controlDict")
control_dict.entry("writeInterval").set("5000")

# Access data in an evaluated case
case = Case("my/case/path")
df = case.data.simple_function_object_reader("forceCoeffsCompressible")

Installation

FlowBoost requires Python 3.10 or later.

It is highly recommended that you use a virtual environment ("venv") when using FlowBoost. uv is our tool of choice, but virtualenv, Poetry, and others will work just fine, too.

To set up a virtual environment using uv:

mkdir my-research-dir
cd my-research-dir

uv init --python 3.13    # or your desired Python version (>=3.10)
uv add flowboost         # add FlowBoost to the uv-managed venv as a dependency

Next, either source the environment manually using source .venv/bin/activate, or run your script using uv run my_experiment.py.

uv (recommended)

To add FlowBoost to an existing Python environment:

uv add flowboost

pip

pip install flowboost

CPU compatibility

In order to use the standard polars package, your CPU should support AVX2 instructions (among other SIMD extensions). AVX2 is available on Intel Haswell (4000-series) CPUs and later, and on all AMD Zen-based CPUs.

If your CPU is from 2012 or earlier, you will most likely receive an illegal instruction error. This can be solved by installing the lts-cpu extra:

uv add flowboost[lts-cpu]
# or: pip install flowboost[lts-cpu]

This installs polars-lts-cpu, which is functionally identical but not as performant.
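If you are unsure whether your CPU supports AVX2, a quick check on Linux is to look for the avx2 flag in /proc/cpuinfo. This is a Linux-only sketch; the file does not exist on macOS or Windows, in which case it falls back to False.

```python
# Linux-only sketch: check /proc/cpuinfo for the avx2 feature flag.
from pathlib import Path


def has_avx2() -> bool:
    try:
        return "avx2" in Path("/proc/cpuinfo").read_text()
    except OSError:
        # /proc/cpuinfo does not exist on macOS/Windows
        return False


print(has_avx2())
```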

OpenFOAM

FlowBoost uses OpenFOAM in two ways:

  1. Case setup uses CLI tools like foamDictionary and foamCloneCase on the host machine.
  2. Simulations run wherever the Manager sends them: locally (Local, DockerLocal) or on a cluster (SGE, Slurm).

The host always needs access to OpenFOAM CLI tools for case setup, even when simulations run elsewhere. On Linux, a native install works. On macOS and Windows, FlowBoost provides these tools transparently through Docker.

  • Linux: native OpenFOAM or Docker
  • macOS: Docker (OrbStack recommended, Docker Desktop also works)
  • Windows: Docker (Docker Desktop). Not tested on Windows.

On first run in Docker mode, FlowBoost builds the flowboost/openfoam:13 image from the bundled Dockerfile. This is a one-time operation. To force a specific mode, set FLOWBOOST_FOAM_MODE to native or docker. To use a custom image, set FLOWBOOST_FOAM_IMAGE.
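For example, to pin FlowBoost to Docker mode with a custom image (the image name below is illustrative):

```shell
# Force Docker mode and point FlowBoost at a custom OpenFOAM image
export FLOWBOOST_FOAM_MODE=docker
export FLOWBOOST_FOAM_IMAGE=my-registry/openfoam:13   # illustrative image name
```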

FlowBoost uses the openfoam.org lineage (not the ESI/openfoam.com fork) and has been tested with versions 11 and 13. The bundled Dockerfile targets OpenFOAM 13 on Ubuntu 24.04. Each OpenFOAM release is tied to a specific Ubuntu LTS, so Dockerfiles are per-version by design.

Using Docker mode

In Docker mode, CLI tools like foamDictionary run inside a persistent container. This is automatic: Case, Dictionary, and other abstractions work the same way regardless of the mode (native, Docker).

When running multiple OpenFOAM commands (e.g. reading dictionaries across many cases), use the container() context manager to keep a single container alive for the entire block:

from pathlib import Path

from flowboost import Case, foam_runtime

workdir = Path("flowboost_data")

with foam_runtime().container(workdir):
    for case_dir in sorted(workdir.glob("case_*")):
        case = Case(case_dir)
        k = case.dictionary("0/k").entry("boundaryField/inlet/value").value
        # All foamDictionary calls reuse the same container

Without container(), FlowBoost auto-mounts paths as needed, which may restart the container when new paths are encountered. Pre-mounting a parent directory (like the workdir above) avoids this.

To run simulations locally in Docker, use the DockerLocal manager:

from pathlib import Path

from flowboost import Manager

data_dir = Path("flowboost_data")  # working directory for case data

manager = Manager.create(scheduler="dockerlocal", wdir=data_dir, job_limit=2)

Each submitted case gets its own detached container with the case directory bind-mounted. See the pitzDaily example for a complete Docker-based workflow.

GPU acceleration

If your environment has a CUDA-compatible NVIDIA GPU, verify you have a recent CUDA Toolkit release. Otherwise, GPU acceleration for PyTorch will not be available. This is especially critical if you are using SAASBO for high-dimensional optimization tasks (≥20 dimensions).

# Verify the CUDA Toolkit is installed
nvcc -V

# Verify PyTorch can see the GPU
python3 -c "import torch; print(torch.cuda.is_available())"

Development

See CONTRIBUTING.md for setup, tooling, and testing guidance.

Acknowledgments

The base functionality for FlowBoost was created as part of a mechanical engineering master's thesis at Aalto University, funded by Wärtsilä. Wärtsilä designs and manufactures marine combustion engines and energy solutions in Vaasa, Finland.
