An extension to PyLops for distributed linear operators.
:vertical_traffic_light: :vertical_traffic_light: This library is under early development. Expect things to constantly change until version v1.0.0. :vertical_traffic_light: :vertical_traffic_light:
Objective
This library is an extension of PyLops for distributed operators.
Just as numpy and scipy lie at the core of the parent project PyLops, PyLops-distributed builds heavily on top of Dask, a Python library for distributed computing.
This way, linear operators can be distributed across several processes on a single node, or even across multiple nodes. Their forward and adjoint operations are first lazily built as directed acyclic graphs and evaluated only when requested by the user (or automatically within one of our solvers).
Here is a simple example showing how a diagonal operator can be created, applied and inverted using PyLops:
```python
import numpy as np
from pylops import Diagonal

n = 10
x = np.ones(n)
d = np.arange(n) + 1

Dop = Diagonal(d)
# y = D x
y = Dop * x
# xadj = D^H y
xadj = Dop.H * y
# xinv = D^-1 y
xinv = Dop / y
```
and similarly using PyLops-distributed:
```python
import numpy as np
import dask.array as da
import pylops_distributed
from pylops_distributed import Diagonal

# set up the client
client = pylops_distributed.utils.backend.dask()

n = 10
x = da.ones(n, chunks=(n // 2,))
d = da.from_array(np.arange(n) + 1, chunks=(n // 2, n // 2))

Dop = Diagonal(d)
# y = D x
y = Dop * x
# xadj = D^H y
xadj = Dop.H * y
# xinv = D^-1 y
xinv = Dop / y
da.compute((y, xadj, xinv))
client.close()
```
It is worth noticing two things at this point:

- In this specific case we did not even need to reimplement the `Diagonal` operator. Calling numpy operations as methods (e.g., `x.sum()`) instead of functions (e.g., `np.sum(x)`) makes our operator automatically act as a distributed operator when a dask array is provided instead. Unfortunately, not all numpy functions are also implemented as methods: in those cases we reimplement the operator directly within PyLops-distributed.
- Using `*` and `.H*` is still possible within PyLops-distributed; however, when initializing an operator we need to decide whether we want to simply create the dask graph or also evaluate it. This gives flexibility, as we can decide if and when to trigger evaluation using the `compute` method on a dask array of choice.
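As a minimal sketch of this lazy-evaluation model (using plain dask arrays only, independent of any PyLops operator), note how building an expression merely constructs a task graph, while `compute` triggers the actual evaluation:

```python
import dask.array as da

n = 10
x = da.ones(n, chunks=(n // 2,))          # two chunks of 5 elements each
d = da.arange(1, n + 1, chunks=n // 2)    # values 1..10

# this only builds a directed acyclic graph; nothing is computed yet
y = d * x

# evaluation happens only when explicitly requested
result = y.sum().compute()
print(float(result))  # prints 55.0
```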
Getting started
You need Python 3.5 or greater.
From PyPi
Coming soon...
From Github
You can also install directly from the master branch:
```
pip install https://git@github.com/equinor/pylops_distributed.git@master
```
Contributing
Feel like contributing to the project? Adding new operators or a new tutorial?
Follow the instructions from PyLops official documentation.
Documentation
The official documentation of PyLops-distributed is available here.
Moreover, if you have installed PyLops using the developer environment you can also build the documentation locally by typing the following command:
```
make doc
```
Once the documentation is created, you can make any change to the source code and rebuild the documentation by simply typing
```
make docupdate
```
Note that if a new example or tutorial is created (or if any change is made to a previously available example or tutorial) you need to rebuild the entire documentation before your changes become visible.
History
PyLops-distributed was initially written by, and is currently maintained by, Equinor. It is an extension of PyLops for large-scale optimization with distributed linear operators, tailored to our needs and developed as a contribution to the free software community.
Contributors
- Matteo Ravasi, mrava87