
A package to mimic the use of parfor as done in Matlab.

Project description

Parfor

Used to parallelize for-loops using parfor in Matlab? This package allows you to do the same in Python. Take any normal serial but parallelizable for-loop and execute it in parallel using easy syntax. Don't worry about the technical details of the multiprocessing module, race conditions or queues; parfor handles all that.

Tested on Linux with Python 2.7 and 3.8, and on Windows and OSX with Python 3.8.

Why is parfor better than just using multiprocessing?

  • Easy to use
  • Using dill instead of pickle: a lot more objects can be used when parallelizing
  • Progress bars are built-in

Installation

pip install parfor

Usage

Parfor decorates a function and returns the result of that function evaluated in parallel for each iteration of an iterable.

Requires

tqdm, dill

Limitations

Objects passed to the pool need to be dillable (dill needs to be able to serialize them). Generators and SwigPyObjects are examples of objects that cannot be used. They can, however, be used as the iterable argument to parfor, as long as the individual iterations are dillable. You might be able to make objects dillable anyway using dill.register.
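
As a quick check, dill itself can tell you up front whether an object will survive serialization. The snippet below is only a sketch using dill's pickles helper; it is not part of parfor.

<<
import dill

gen = (i ** 2 for i in range(10))   # generators are not dillable...
lst = [i ** 2 for i in range(10)]   # ...but lists of their items usually are
print(dill.pickles(gen), dill.pickles(lst))

>> False True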

The function evaluated in parallel needs to terminate. If parfor hangs after seemingly completing the task, it is probably because the individual worker processes cannot terminate. Importing javabridge (used in python-bioformats) and starting the Java virtual machine can cause this, since the processes only terminate after the Java VM has quit. In that case, pass terminator=javabridge.kill_vm to parfor.
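
For example, a minimal sketch (assuming javabridge is installed and the Java VM is started inside the function):

<<
import javabridge
from parfor import parfor

# kill the Java VM in each worker when it is done, otherwise the worker
# processes never exit and parfor appears to hang after finishing
@parfor(range(10), terminator=javabridge.kill_vm)
def fun(i):
    # ... work that uses javabridge / python-bioformats ...
    return i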

On OSX the buffer bar does not work due to limitations of the OS.

Arguments

Required:

fun:      function taking arguments: an item from iterable, plus other arguments defined in args & kwargs
iterable: iterable from which an item is given to fun as its first argument

Optional:

args:   tuple with other unnamed arguments to fun
kwargs: dict with other named arguments to fun
length: give the length of the iterable in cases where len(iterable) results in an error
desc:   string with description of the progress bar
bar:    bool, enable the progress bar
pbar:   bool, enable the buffer indicator bar
rP:     ratio of workers to cpu cores, default: 1
nP:     number of workers, default: None, overrides rP if not None
    the number of workers will always be at least 2
serial: switch to serial processing if the number of tasks is less than serial, default: 4
debug:  if an error occurs in an iteration, return the error instead of retrying in the main process

Return

list with the results of applying the decorated function to each item of the iterable;
each item is passed to the function as its first argument
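
For instance, several of the optional arguments can be combined. The snippet below is a sketch; the offset keyword argument is just an illustration, not something parfor defines.

<<
from time import sleep
from parfor import parfor

@parfor(range(10), args=(3,), kwargs={'offset': 1}, desc='squares', nP=4)
def fun(i, a, offset=0):
    sleep(1)
    return a * i ** 2 + offset
print(fun)

>> [1, 4, 13, 28, 49, 76, 109, 148, 193, 244]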

Examples

Normal serial for loop

<<
from time import sleep

a = 3
fun = []
for i in range(10):
    sleep(1)
    fun.append(a*i**2)
print(fun)

>> [0, 3, 12, 27, 48, 75, 108, 147, 192, 243]

Using parfor to parallelize

<<
from time import sleep
from parfor import parfor
@parfor(range(10), (3,))
def fun(i, a):
    sleep(1)
    return a*i**2
print(fun)

>> [0, 3, 12, 27, 48, 75, 108, 147, 192, 243]

<<
@parfor(range(10), (3,), bar=False)
def fun(i, a):
    sleep(1)
    return a*i**2
print(fun)

>> [0, 3, 12, 27, 48, 75, 108, 147, 192, 243]

Using parfor in a script/module/.py-file

Parfor should never be executed during the import phase of a .py file. To prevent that from happening, use the if __name__ == '__main__': structure:

<<
from time import sleep
from parfor import parfor

if __name__ == '__main__':
    @parfor(range(10), (3,))
    def fun(i, a):
        sleep(1)
        return a*i**2
    print(fun)

>> [0, 3, 12, 27, 48, 75, 108, 147, 192, 243]    

or:

<<
from time import sleep
from parfor import parfor

def my_fun(*args, **kwargs):
    @parfor(range(10), (3,))
    def fun(i, a):
        sleep(1)
        return a*i**2
    return fun

if __name__ == '__main__':
    print(my_fun())

>> [0, 3, 12, 27, 48, 75, 108, 147, 192, 243]

If you hate decorators not returning a function

pmap maps a function over an iterable, like map does, but in parallel.

<<
from parfor import pmap
from time import sleep
def fun(i, a):
    sleep(1)
    return a*i**2
print(pmap(fun, range(10), (3,)))

>> [0, 3, 12, 27, 48, 75, 108, 147, 192, 243]     

Using generators

If iterables like lists and tuples are too large to fit in memory, use generators instead. Since generators don't have a predefined length, give parfor the length as an argument (optional).

<<
import numpy as np
from parfor import parfor

# imagereader is a placeholder for any object that yields images when iterated over
c = (im for im in imagereader)
@parfor(c, length=len(imagereader))
def fun(im):
    return np.mean(im)

>> [list with means of the images]

Extras

Pmap

The function underlying the parfor decorator; use it like map.

Chunks

Splits a long iterable into bite-sized chunks to parallelize.

Parpool

Lower-level access to parallel execution. Submit tasks and request their results at any time (a task must, of course, be submitted before its result can be requested), and use different functions and function arguments for different tasks.

Tqdmm

Meter bar, inherited from tqdm, used for displaying buffers.
