mpi-sppy
Project description
Optimization under uncertainty for Pyomo models.
Documentation is available at readthedocs and a technical report is on Optimization Online (OOL).
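For orientation, here is a minimal two-stage sketch in the style of the farmer quick start from the readthedocs documentation. The toy model, scenario names, and solver choice below are illustrative assumptions, not part of this package's shipped examples.

```python
# Minimal two-stage sketch in the style of the readthedocs quick start.
# The toy model, scenario names, and solver are illustrative assumptions.
import pyomo.environ as pyo
import mpisppy.utils.sputils as sputils
from mpisppy.opt.ef import ExtensiveForm

def scenario_creator(scenario_name):
    # demand differs by scenario
    demand = {"low": 50.0, "mid": 100.0, "high": 150.0}[scenario_name]
    model = pyo.ConcreteModel(scenario_name)
    model.x = pyo.Var(domain=pyo.NonNegativeReals)  # first-stage order
    model.y = pyo.Var(domain=pyo.NonNegativeReals)  # second-stage sales
    model.sales_le_order = pyo.Constraint(expr=model.y <= model.x)
    model.sales_le_demand = pyo.Constraint(expr=model.y <= demand)
    model.obj = pyo.Objective(expr=1.0 * model.x - 1.5 * model.y)
    # declare the first-stage cost and the non-anticipative variables
    sputils.attach_root_node(model, 1.0 * model.x, [model.x])
    model._mpisppy_probability = 1.0 / 3
    return model

options = {"solver": "glpk"}  # any solver Pyomo can call
ef = ExtensiveForm(options, ["low", "mid", "high"], scenario_creator)
results = ef.solve_extensive_form()
print("EF objective:", ef.get_objective_value())
```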
NOTICE
There was a disruptive change on August 11, 2022 concerning how options are accessed. See the file disruptions.txt for more information. If you are a new user, this will not affect you, regardless of how you install. If you are an existing user, you should consider the disruption before updating to the latest mpi-sppy. The documentation on readthedocs probably refers to the newest version.
MPI
A recent version of MPI and a compatible version of mpi4py are needed.
Here are two installation methods that seem to work well, at least on non-HPC platforms.
Install OpenMPI and mpi4py using conda.
conda install openmpi; conda install mpi4py (in that order)
If you already have an existing version of MPI, it may be better to compile mpi4py against it. This can be done by installing mpi4py through pip.
pip install mpi4py
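With either method, a quick way to confirm which MPI library mpi4py was actually built against is a snippet like the following; Get_library_version and get_vendor are standard mpi4py calls.

```python
# Report which MPI implementation mpi4py is linked against.
from mpi4py import MPI

print(MPI.Get_library_version())  # full library banner, e.g. an Open MPI version string
print(MPI.get_vendor())           # e.g. ('Open MPI', (4, 1, 5))
```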
To test your installation, cd to the directory where you installed mpi-sppy (it is called mpi-sppy) and then give this command.
mpirun -n 2 python -m mpi4py mpi_one_sided_test.py
If you don’t see any error messages, you might have an MPI installation that will work well. Note that even if there is an error message, mpi-sppy may still execute and return correct results. Per the comment below, the run-times may just be unnecessarily inflated.
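For readers curious what such a test exercises, below is a standalone sketch of a one-sided (RMA) smoke test. This is not the mpi_one_sided_test.py shipped with mpi-sppy, just an illustration of the MPI feature it relies on; run it with mpirun -n 2.

```python
# Sketch of a one-sided (RMA) smoke test: rank 0 writes into a
# window exposed by rank 1 without rank 1 posting a receive.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
assert comm.Get_size() >= 2, "needs at least two ranks"

buf = np.zeros(1, dtype="i")          # window memory on every rank
win = MPI.Win.Create(buf, comm=comm)  # expose it for one-sided access

win.Fence()                           # open an RMA epoch
if rank == 0:
    win.Put(np.array([42], dtype="i"), 1)  # put into rank 1's window
win.Fence()                           # close the epoch

if rank == 1:
    print("rank 1 received:", buf[0])  # should print 42
win.Free()
```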
Citing mpi-sppy
If you find mpi-sppy useful in your work, we kindly request that you cite the following pre-print:
@misc{knueven2020parallel,
  title={A Parallel Hub-and-Spoke System for Large-Scale Scenario-Based Optimization Under Uncertainty},
  author={Knueven, Bernard and Mildebrath, David and Muir, Christopher and Siirola, John D and Watson, Jean-Paul and Woodruff, David L},
  year={2020}
}
AN IMPORTANT NOTE FOR MPICH USERS ON HPC PLATFORMS
At least on some US Department of Energy compute clusters (e.g., at Lawrence Livermore National Laboratory), users of mpi-sppy who are using an MPICH implementation of MPI may need to set the following for both (1) proper execution of the one-sided test referenced above and (2) rapid results when running any of the algorithms shipped with mpi-sppy:
export MPICH_ASYNC_PROGRESS=1
Without this setting, we have observed run-times increase by a factor of between 2 and 4, due to non-blocking point-to-point calls apparently being treated as blocking.
Further, without this setting and in situations with a large number of ranks (e.g., >> 10), we have observed mpi-sppy stalling once scenario instances are created.
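For example, to re-run the one-sided test above with asynchronous progress enabled (assuming an MPICH-based mpirun), the setting just needs to be exported before launch:

export MPICH_ASYNC_PROGRESS=1
mpirun -n 2 python -m mpi4py mpi_one_sided_test.py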
Download files
File details
Details for the file mpi-sppy-0.12.1.tar.gz.
File metadata
- Download URL: mpi-sppy-0.12.1.tar.gz
- Size: 246.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | c405b88f3ab69c67dd564789cf156c0cfb704a9eb74d0645f29a1058c19bed27
MD5 | 99c9245bdf7613254d4b5df88d65f0e9
BLAKE2b-256 | a5d21df67d240fb6f9b07d550fc6d8477a74dd220e17f24fce6ccac9d5316ad2
File details
Details for the file mpi_sppy-0.12.1-py3-none-any.whl.
File metadata
- Download URL: mpi_sppy-0.12.1-py3-none-any.whl
- Size: 310.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0fdf71087f4c5cd9ae22f93ff9ec1feb724ae6bc8e12f9a194154b97dddc750e
MD5 | af11955769dbfc194d89baef5fdc99b8
BLAKE2b-256 | a6295d104b54a67d088f0df34801dcc891c702da6dcbb0a1764708a0488f800d