MPI volume decomposition and particle distribution tools
A Python module for MPI volume decomposition and particle distribution
Free software: MIT license
Documentation: https://argonnecpac.github.io/MPIPartition
Repository: https://github.com/ArgonneCPAC/MPIPartition
Features
Cartesian partitioning of a cubic volume among available MPI ranks
distributing particle-data among ranks to the corresponding subvolume
overloading particle-data at rank boundaries
exchanging particle-data according to an "owner"-list of keys per rank
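The first two features above can be illustrated without MPI: a Cartesian decomposition splits the box into a near-cubic grid of subvolumes, one per rank, and each particle belongs to the rank whose subvolume contains it. The helpers below are a hypothetical serial sketch of that idea, not mpipartition's actual implementation:

```python
import numpy as np

def decompose(nranks):
    # hypothetical helper: pick the most cubic 3D grid with nranks cells,
    # similar in spirit to MPI_Dims_create
    best = (nranks, 1, 1)
    for nx in range(1, nranks + 1):
        if nranks % nx:
            continue
        for ny in range(1, nranks // nx + 1):
            if (nranks // nx) % ny:
                continue
            nz = nranks // (nx * ny)
            if max(nx, ny, nz) < max(best):
                best = (nx, ny, nz)
    return best

def owner_rank(pos, dims, box_size=1.0):
    # map a position to the rank owning the containing subvolume
    dims = np.asarray(dims)
    idx = np.floor(np.asarray(pos) / box_size * dims).astype(int)
    idx = np.clip(idx, 0, dims - 1)  # guard particles exactly at the box edge
    nx, ny, nz = dims
    return (idx[0] * ny + idx[1]) * nz + idx[2]

dims = decompose(8)                          # 8 ranks -> a 2x2x2 grid
rank = owner_rank([0.6, 0.1, 0.9], dims)     # subvolume index (1, 0, 1)
```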
Installation
Installing from the PyPI repository:
pip install mpipartition
Installing the development version from the Git repository:
git clone https://github.com/ArgonneCPAC/mpipartition.git
cd mpipartition
python setup.py develop
Requirements
Basic Usage
Check the documentation for an in-depth explanation.
# this code goes into mpipartition_example.py
from mpipartition import Partition, distribute, overload
import numpy as np
# create a partition of the unit cube with available MPI ranks
partition = Partition(1.)
if partition.rank == 0:
    print(f"Number of ranks: {partition.nranks}")
    print(f"Volume decomposition: {partition.decomposition}")
# create random data
nparticles_local = 1000
data = {
    "x": np.random.uniform(0, 1, nparticles_local),
    "y": np.random.uniform(0, 1, nparticles_local),
    "z": np.random.uniform(0, 1, nparticles_local),
}
# distribute data to ranks assigned to corresponding subvolume
data = distribute(partition, data, ('x', 'y', 'z'))
# overload "edge" of each subvolume by 0.05
data = overload(partition, data, 0.05, ('x', 'y', 'z'))
This code can then be executed with MPI:
mpirun -n 10 python mpipartition_example.py
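Conceptually, overloading replicates every particle that lies within the overload length of a subvolume face onto the neighboring rank, so each rank also sees a thin shell of particles just outside its own subvolume. A serial numpy sketch of the selection along one axis (illustrative only, not mpipartition's implementation):

```python
import numpy as np

# one rank's subvolume spans [lo, hi) along x; overload length 0.05
lo, hi, ol = 0.25, 0.50, 0.05
x = np.array([0.26, 0.33, 0.40, 0.48, 0.49])

# particles near the lower face get copied to the -x neighbor,
# particles near the upper face to the +x neighbor
send_down = x < lo + ol
send_up = x >= hi - ol
```

In three dimensions the same test applies per axis, and particles near an edge or corner are replicated to several neighbors at once.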
A more applied example, using halo catalogs from a HACC cosmological simulation (in the GenericIO data format):
from mpipartition import Partition, distribute, overload
import numpy as np
import pygio
# create a partition with available MPI ranks
box_size = 64. # box size in Mpc/h
partition = Partition(box_size)
# read GenericIO data in parallel
data = pygio.read_genericio("m000p-499.haloproperties")
# distribute
data = distribute(partition, data, [f"fof_halo_center{x}" for x in "xyz"])
# mark "owned" data with rank (allows differentiating owned and overloaded data)
data["status"] = partition.rank * np.ones(len(data["fof_halo_center_x"]), dtype=np.uint16)
# overload by 4 Mpc/h
data = overload(partition, data, 4., [f"fof_halo_center{x}" for x in "xyz"])
# now we can do analysis such as 2pt correlation functions (up to 4Mpc/h)
# or neighbor finding, etc.
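Because "status" is assigned before overloading, each overloaded copy still carries the rank of its origin, so every rank can later separate the halos it owns from the overloaded shell. A serial sketch of that filtering on one rank (the rank number and status values are illustrative):

```python
import numpy as np

rank = 3
# "status" holds the owning rank; values != 3 are overloaded copies here
status = np.array([3, 3, 1, 3, 2], dtype=np.uint16)
owned = status == rank      # mask selecting halos this rank owns
n_owned = int(owned.sum())
```

This is useful when, e.g., pair counts should only be accumulated around owned halos while still using overloaded neighbors as pair partners.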