Python bindings for MPI
Project description
mpi4py-ve is an extension to mpi4py, which provides Python bindings for the Message Passing Interface (MPI). This package also supports communication of NLCPy array objects (nlcpy.ndarray) between MPI processes on the x86 servers of SX-Aurora TSUBASA systems. Combining NLCPy with mpi4py-ve enables Python scripts to utilize the computing power of multiple VEs. The current version of mpi4py-ve is based on mpi4py version 3.0.3. For details of the API, please refer to the mpi4py manual.
Requirements
Before installation, the following components need to be installed on the x86 Node of your SX-Aurora TSUBASA system.
Install from wheel
You can install mpi4py-ve in either of the following ways.
Install from PyPI
$ pip install mpi4py-ve
Install from your local computer
Download the wheel package from GitHub.
Put the wheel package in any directory.
Install the local wheel package via the pip command.
$ pip install <path_to_wheel>
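After installation, a quick import check can confirm that the package is visible to Python. This is only a sketch: Get_version() and Get_library_version() are inherited from the mpi4py-level API, and because importing the MPI module initializes NEC MPI, the check may need to be launched through mpirun as described in the Execution section below.

from mpi4pyve import MPI
# Print the supported MPI standard version and the MPI library identification string.
print(MPI.Get_version())
print(MPI.Get_library_version())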
- The shared objects for Vector Engine, which are included in the wheel package, are compiled and tested with the following software:
| Software | Version |
|---|---|
| NEC C/C++ Compiler | 3.2.1 |
| NEC MPI | v2.20.0 |
| NumPy | v1.19.2 |
| NLCPy | v2.1.1 |
Install from source (with building)
Before building this package, you need to execute the environment setup script necmpivars.sh or necmpivars.csh once in advance.
When using sh or its variant:
$ source /opt/nec/ve/mpi/X.X.X/bin/necmpivars.sh
When using csh or its variant:
$ source /opt/nec/ve/mpi/X.X.X/bin/necmpivars.csh
Here, X.X.X denotes the version number of NEC MPI.
After that, execute the following commands:
$ git clone https://github.com/SX-Aurora/mpi4py-ve.git
$ cd mpi4py-ve
$ python setup.py build --mpi=necmpi
$ python setup.py install
Example
Transfer Array
Transfers an NLCPy ndarray from MPI rank 0 to rank 1 by using comm.Send() and comm.Recv():
from mpi4pyve import MPI
import nlcpy as vp
comm = MPI.COMM_WORLD
size = comm.Get_size()
rank = comm.Get_rank()
if rank == 0:
    x = vp.array([1, 2, 3], dtype=int)  # array to send, allocated on the VE
    comm.Send(x, dest=1)
elif rank == 1:
    y = vp.empty(3, dtype=int)          # receive buffer of matching shape and dtype
    comm.Recv(y, source=0)
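As an illustrative follow-up (not part of the original example), the received buffer can be printed on rank 1 to confirm the transfer; the script is launched with two processes as described in the Execution section below.

if rank == 1:
    # After Recv completes, y holds the values sent from rank 0.
    print(y)  # expected output: [1 2 3]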
Sum of Numbers
Sums the numbers locally, and reduces all the local sums to the root rank (rank=0):
from mpi4pyve import MPI
import nlcpy as vp
comm = MPI.COMM_WORLD
size = comm.Get_size()
rank = comm.Get_rank()
N = 1000000000
begin = N * rank // size
end = N * (rank + 1) // size
sendbuf = vp.arange(begin, end).sum()
recvbuf = comm.reduce(sendbuf, MPI.SUM, root=0)
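As a sanity check (illustrative, not part of the original example), the reduced value on the root rank can be compared with the closed-form sum 0 + 1 + ... + (N-1) = N*(N-1)/2; this assumes the nlcpy result compares with a plain Python integer the same way a NumPy value would.

if rank == 0:
    # Each rank summed a disjoint slice of 0..N-1, so the reduction
    # should equal N*(N-1)//2.
    assert recvbuf == N * (N - 1) // 2
    print(recvbuf)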
The following table shows the performance results [msec] on VE Type 20B:

| np=1 | np=2 | np=3 | np=4 | np=5 | np=6 | np=7 | np=8 |
|---|---|---|---|---|---|---|---|
| 35.8 | 19.0 | 12.6 | 10.1 | 8.1 | 7.0 | 6.0 | 5.5 |
Execution
When executing a Python script that uses mpi4py-ve, use the mpirun command of NEC MPI on an x86 server of the SX-Aurora TSUBASA. Before running the Python script, you need to execute the following environment setup scripts once in advance.
When using sh or its variant:
$ source /opt/nec/ve/mpi/X.X.X/bin/necmpivars.sh gnu 4.8.5
$ source /opt/nec/ve/nlc/Y.Y.Y/bin/nlcvars.sh
When using csh or its variant:
$ source /opt/nec/ve/mpi/X.X.X/bin/necmpivars.csh gnu 4.8.5
$ source /opt/nec/ve/nlc/Y.Y.Y/bin/nlcvars.csh
Here, X.X.X and Y.Y.Y denote the version number of NEC MPI and NLC, respectively.
When using the mpirun command:
$ mpirun -veo -np N $(which python) sample.py
Alternatively, set NMPI_USE_COMMAND_SEARCH_PATH to ON so that mpirun searches for the python command in PATH:
$ export NMPI_USE_COMMAND_SEARCH_PATH=ON
$ mpirun -veo -np N python sample.py
Other Documents
The following links may be useful for understanding mpi4py-ve in more detail:
Restrictions
The value specified by -np must not exceed the number of VE cards.
The current version of mpi4py-ve does not support some functions, which are listed in the section “List of Unsupported Functions” of the mpi4py-ve tutorial.
License
Project details
Release history
Download files
Built Distributions
File details
Details for the file mpi4py_ve-0.1.0b1-cp38-cp38-manylinux1_x86_64.whl.
File metadata
- Download URL: mpi4py_ve-0.1.0b1-cp38-cp38-manylinux1_x86_64.whl
- Upload date:
- Size: 3.3 MB
- Tags: CPython 3.8
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.9 tqdm/4.63.1 importlib-metadata/4.11.3 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.8.13
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8f97b94d5d5beb879d568e79a662d42009c98474240e21b314adc34cdf5f0c8e |
| MD5 | e44329deb4e90b0b3cfe6ac283ac4e0b |
| BLAKE2b-256 | a20512fea13056b61dfba7283506ee12e2f53a97c2f281622b676a237836d26d |
File details
Details for the file mpi4py_ve-0.1.0b1-cp37-cp37m-manylinux1_x86_64.whl.
File metadata
- Download URL: mpi4py_ve-0.1.0b1-cp37-cp37m-manylinux1_x86_64.whl
- Upload date:
- Size: 3.1 MB
- Tags: CPython 3.7m
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.9 tqdm/4.63.0 importlib-metadata/4.11.3 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.7.13
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 40b32cb885ac1e31225c1d32100e74c525ec3939719d4a109092058bc2b28c33 |
| MD5 | 6900b5df20c2c9913ba1876cff439866 |
| BLAKE2b-256 | 7bb4d2457da192962847199e23fd52327f55689dff8e4788f97aba18c1686688 |
File details
Details for the file mpi4py_ve-0.1.0b1-cp36-cp36m-manylinux1_x86_64.whl.
File metadata
- Download URL: mpi4py_ve-0.1.0b1-cp36-cp36m-manylinux1_x86_64.whl
- Upload date:
- Size: 3.1 MB
- Tags: CPython 3.6m
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.9 tqdm/4.63.0 importlib-metadata/4.8.3 keyring/23.4.1 rfc3986/1.5.0 colorama/0.4.4 CPython/3.6.15
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8042912c721688b65c57b129e44173d7596ef5296afb2ec5c3ef81a8cae38ad0 |
| MD5 | d62f4c30f556a15a9f7233f9b979dd2a |
| BLAKE2b-256 | 7791eb7104e74479794289a24c2ab21f366ca892d5a0455c055099cad7174c61 |