
An MPI read routine for Swift simulation snapshots

pyread_swift is an MPI read routine for swiftsim snapshots, similar in style to John Helly's read_eagle code for reading EAGLE snapshots.

The package can read swiftsim snapshots in both "collective" mode (multiple MPI ranks read from a single file simultaneously) and "distributed" mode (each MPI rank reads an individual snapshot file part in isolation).
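
For intuition, the difference between the two modes can be sketched in plain Python. This is an illustration only (hypothetical function, not pyread_swift's internal implementation), and it assumes a round-robin assignment of file parts to ranks in distributed mode:

```python
def files_opened_by_rank(mode, n_files, rank, n_ranks):
    """Which snapshot file parts a given rank opens under each read mode.
    Illustrative sketch only, not pyread_swift internals."""
    if mode == "collective":
        # Every rank opens every file and reads its own slice of each.
        return list(range(n_files))
    elif mode == "distributed":
        # Each rank reads whole file parts in isolation (round-robin here).
        return [f for f in range(n_files) if f % n_ranks == rank]
    raise ValueError(f"unknown mode: {mode}")

print(files_opened_by_rank("collective", 4, rank=1, n_ranks=3))   # [0, 1, 2, 3]
print(files_opened_by_rank("distributed", 4, rank=1, n_ranks=3))  # [1]
```

Collective mode is why parallel HDF5 is needed: many ranks touch the same file at once, whereas distributed mode only ever has one rank per file.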

Installation

Requirements

  • OpenMPI or other MPI library
  • python>=3.8

Recommended modules when working on COSMA7:

module load gnu_comp/11.1.0 openmpi/4.1.4 parallel_hdf5/1.12.0 python/3.9.1-C7

Given the need for a parallel HDF5 installation, it is recommended that you install pyread_swift within a virtual or conda environment. However, you can of course also install directly into your base Python environment if you prefer.

First make sure your pip is up-to-date:

python3 -m pip install --upgrade pip

Method 1) Installation from PyPI

The easiest method is to install from PyPI:

python3 -m pip install pyread-swift

Method 2) Installation from source

Or, you can install directly from source.

Clone the repository, then install the pyread_swift package by running the following commands:

git clone https://github.com/stuartmcalpine/pyread_swift.git
cd pyread_swift
python3 -m pip install .

which will install pyread_swift and any dependencies.

MPI installation for collective reading

If you are using pyread_swift to load large snapshots over MPI collectively (i.e., multiple cores read in parallel from the same file), a bit of additional setup is required.

Make sure you have HDF5 installed with parallel compatibility (see the HDF5 documentation for details).

Then, uninstall any versions of h5py and reinstall from source:

python3 -m pip uninstall h5py
MPICC=mpicc CC=mpicc HDF5_MPI="ON" python3 -m pip install --no-binary=h5py h5py

If pip struggles to find your HDF5 libraries automatically (e.g., error: libhdf5.so: cannot open shared object file: No such file or directory), you may have to specify the path to the HDF5 installation manually, i.e., HDF5_DIR=/path/to/hdf5/lib (see the h5py documentation for more details).

For our COSMA7 setup, that would be:

HDF5_DIR="/cosma/local/parallel-hdf5//gnu_11.1.0_ompi_4.1.4/1.12.0/"
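
Once reinstalled, you can confirm that h5py was built against a parallel (MPI-enabled) HDF5 using h5py's public get_config() API:

```python
import h5py

# h5py.get_config().mpi is True only when h5py was built with MPI /
# parallel-HDF5 support, which collective reading requires.
print("h5py MPI support:", h5py.get_config().mpi)
```

If this prints False, collective reads will not work and you should revisit the from-source build steps above.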

Usage

pyread_swift is built around a primary read wrapper called SwiftSnapshot. The snapshot particles are loaded into, stored in, and manipulated by this object.

Reading follows these four steps (see also the examples below):

  • Initialize a SwiftSnapshot object pointing to the location of the HDF5 file.

  • Select the spatial region you want to extract the particles from using the select_region() routine.

  • Split the selection over the MPI ranks using the split_selection() routine.

  • Read a selected property of the particles using the read_dataset() routine.

Input parameters to SwiftSnapshot

  • fname: Full path to the HDF5 snapshot file. If the snapshot is split over multiple files, this can be any one of the file parts. (required)

  • comm=: mpi4py communicator (if reading in MPI). Default: None

  • verbose=: True for more verbose output. Default: False

  • mpi_read_format=: How to read the snapshot in MPI mode, either "collective" or "distributed". Default: "collective"

      - "collective": Do a collective read of each file, i.e., all ranks read a single file at once. Recommended for a single, or a few, large snapshot file(s). Requires parallel HDF5 to be installed.

      - "distributed": Each rank reads its own file part. Recommended for multiple smaller files.

  • max_concur_io=: When reading in MPI, how many HDF5 files can be open at once. Default: 64
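
To illustrate the idea behind max_concur_io (a sketch only, not pyread_swift's internal code): ranks can be grouped into I/O batches, so that at most max_concur_io files are open at any one time.

```python
def io_round(rank, max_concur_io=64):
    """Which I/O batch a rank belongs to: ranks in batch 0 open their
    files first, batch 1 waits for them to finish, and so on.
    Illustrative sketch only."""
    return rank // max_concur_io

# With max_concur_io=2 and 4 ranks, ranks 0-1 read first, then ranks 2-3.
print([io_round(r, max_concur_io=2) for r in range(4)])  # [0, 0, 1, 1]
```

Throttling concurrent opens like this avoids overwhelming the filesystem when many ranks each hold an HDF5 file handle.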

Example usage (No MPI case)

from pyread_swift import SwiftSnapshot

# Set up pyread_swift object pointing at HDF5 snapshot file (or a file part). 
snapshot = "/path/to/snap/part.0.hdf5"
swift = SwiftSnapshot(snapshot)

# Select region to load from.
parttype = 1 # Dark matter
region = [0,100,0,100,0,100] # [xlo,xhi,ylo,yhi,zlo,zhi]
swift.select_region(parttype, *region)

# Divide selection between ranks (must be called even in the non-MPI case).
swift.split_selection()

# Read data.
ids = swift.read_dataset(parttype, "ParticleIDs")
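
Conceptually, select_region keeps only particles whose coordinates lie inside the axis-aligned box [xlo, xhi] x [ylo, yhi] x [zlo, zhi]. A minimal stand-alone sketch of that membership test (the library itself works on top-level cells rather than individual particles):

```python
def in_region(pos, xlo, xhi, ylo, yhi, zlo, zhi):
    """True if a 3D position lies inside the selection box.
    Conceptual illustration of select_region's bounds, not library code."""
    x, y, z = pos
    return xlo <= x <= xhi and ylo <= y <= yhi and zlo <= z <= zhi

print(in_region((50.0, 50.0, 50.0), 0, 100, 0, 100, 0, 100))   # True
print(in_region((150.0, 50.0, 50.0), 0, 100, 0, 100, 0, 100))  # False
```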

Example usage (MPI case)

from mpi4py import MPI
from pyread_swift import SwiftSnapshot

# MPI communicator.
comm = MPI.COMM_WORLD

# Set up pyread_swift object pointing at HDF5 snapshot file (or a file part).
snapshot = "/path/to/snap/part.0.hdf5"
swift = SwiftSnapshot(snapshot, comm=comm)

# Select region to load from.
parttype = 1 # Dark matter
region = [0,100,0,100,0,100] # [xlo,xhi,ylo,yhi,zlo,zhi]
swift.select_region(parttype, *region)

# Divide selection between ranks.
swift.split_selection()

# Read data.
ids = swift.read_dataset(parttype, "ParticleIDs")
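
For intuition on the split_selection step: the selected particles are divided between ranks, so each rank's subsequent read_dataset call returns only its own share. A simple contiguous even split can be sketched as (illustration only, not pyread_swift's actual partitioning):

```python
def split_range(n_selected, rank, n_ranks):
    """Contiguous [start, stop) slice of the selection owned by `rank`,
    with any remainder spread over the lowest ranks. Sketch only."""
    base, rem = divmod(n_selected, n_ranks)
    start = rank * base + min(rank, rem)
    stop = start + base + (1 if rank < rem else 0)
    return start, stop

# 10 selected particles over 3 ranks -> slice sizes 4, 3, 3.
print([split_range(10, r, 3) for r in range(3)])  # [(0, 4), (4, 7), (7, 10)]
```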
