MPI volume decomposition and particle distribution tools
Project description
A Python module for MPI volume decomposition and particle distribution
Free software: MIT license
Documentation: https://argonnecpac.github.io/MPIPartition
Repository: https://github.com/ArgonneCPAC/MPIPartition
Features
Cartesian partitioning of a cubic volume (arbitrary dimensions) among MPI ranks
Equal area decomposition of the spherical shell (S2) among MPI ranks
Distributing particle data among ranks to the corresponding subvolume / surface segment
Overloading particle data at rank boundaries (“ghost particles”)
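As a rough illustration of the Cartesian partitioning idea, the number of ranks is factorized into a near-balanced grid along each dimension. MPIPartition delegates this to MPI internally; the `cartesian_dims` helper below is a hypothetical, serial sketch of a Dims_create-style decomposition, not part of the package:

```python
def cartesian_dims(nranks, ndims=3):
    """Factorize nranks into ndims near-balanced grid dimensions.

    Hypothetical sketch of a Dims_create-style decomposition; the
    actual package lets MPI choose the rank grid.
    """
    # prime-factorize nranks
    factors, n, f = [], nranks, 2
    while f * f <= n:
        while n % f == 0:
            factors.append(f)
            n //= f
        f += 1
    if n > 1:
        factors.append(n)
    # greedily multiply the largest factors into the smallest dimension
    dims = [1] * ndims
    for p in sorted(factors, reverse=True):
        dims[dims.index(min(dims))] *= p
    return sorted(dims, reverse=True)

print(cartesian_dims(10))  # [5, 2, 1] for 10 ranks in 3 dimensions
```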
Installation
Installing from the PyPI repository:
pip install mpipartition
Installing the development version from the Git repository:
git clone https://github.com/ArgonneCPAC/mpipartition.git
cd mpipartition
python setup.py develop
Requirements
These packages will be automatically installed if they are not already present:
Basic Usage
See the documentation for an in-depth explanation.
# this code goes into mpipartition_example.py
from mpipartition import Partition, distribute, overload
import numpy as np
# create a partition of the unit cube with available MPI ranks
box_size = 1.
partition = Partition()
if partition.rank == 0:
    print(f"Number of ranks: {partition.nranks}")
    print(f"Volume decomposition: {partition.decomposition}")
# create random data
nparticles_local = 1000
data = {
    "x": np.random.uniform(0, 1, nparticles_local),
    "y": np.random.uniform(0, 1, nparticles_local),
    "z": np.random.uniform(0, 1, nparticles_local),
}
# distribute data to ranks assigned to corresponding subvolume
data = distribute(partition, box_size, data, ('x', 'y', 'z'))
# overload "edge" of each subvolume by 0.05
data = overload(partition, box_size, data, 0.05, ('x', 'y', 'z'))
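Conceptually, `distribute` sends each particle to the rank whose subvolume contains it, and `overload` replicates particles lying within the overload distance of a subvolume face. A serial, numpy-only sketch of the ownership step (`owning_cell` is a hypothetical helper, assuming a regular grid of subvolumes; the real `distribute` does this via MPI communication):

```python
import numpy as np

def owning_cell(positions, box_size, dims):
    """Grid index of the subvolume that owns each 1-d coordinate.

    Hypothetical serial stand-in for the per-axis rank assignment that
    `distribute` performs across MPI ranks.
    """
    idx = np.floor(positions / box_size * dims).astype(int)
    return np.clip(idx, 0, dims - 1)  # guard particles exactly at the edge

x = np.array([0.05, 0.4, 0.99])
print(owning_cell(x, box_size=1.0, dims=2))  # [0 0 1]
```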
This code can then be executed with MPI:
mpirun -n 10 python mpipartition_example.py
A more applied example, using halo catalogs from a HACC cosmological simulation (in the GenericIO data format):
from mpipartition import Partition, distribute, overload
import numpy as np
import pygio
# create a partition with available MPI ranks
box_size = 64. # box size in Mpc/h
partition = Partition(3) # by default, the dimension is 3
# read GenericIO data in parallel
data = pygio.read_genericio("m000p-499.haloproperties")
# distribute
data = distribute(partition, box_size, data, [f"fof_halo_center_{x}" for x in "xyz"])
# mark "owned" data with rank (allows differentiating owned and overloaded data)
data["status"] = partition.rank * np.ones(len(data["fof_halo_center_x"]), dtype=np.uint16)
# overload by 4 Mpc/h
data = overload(partition, box_size, data, 4., [f"fof_halo_center_{x}" for x in "xyz"])
# now we can do analysis such as 2pt correlation functions (up to 4Mpc/h)
# or neighbor finding, etc.
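The `status` field written above makes it easy to separate owned from overloaded halos after the analysis. A minimal, runnable sketch of that filtering step (with made-up data standing in for the GenericIO catalog and a fixed `rank` in place of `partition.rank`):

```python
import numpy as np

rank = 0  # stand-in for partition.rank on this rank
data = {
    "status": np.array([0, 1, 0, 2], dtype=np.uint16),  # owning rank per halo
    "fof_halo_mass": np.array([1.0, 2.0, 3.0, 4.0]),
}

# boolean mask selecting halos owned by this rank (not overloaded copies)
is_owned = data["status"] == rank
owned = {key: values[is_owned] for key, values in data.items()}
print(owned["fof_halo_mass"])  # [1. 3.]
```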