A Python package for writing multithreaded code.
Project description
Pearpy
The Python package for (pear)allelizing your tasks across multiple CPU threads.
Installation
The latest version of Pearpy can be installed with:
pip install pearpy
To stay up to date with Pearpy's releases, visit the official page on PyPI!
Usage
- Create a Pear() object. This will be a wrapper for all of your multithreaded processes.
- Identify the functions on which you would like to parallelize computation.
- Add your tasks to the Pear object. If a potential race condition is detected, it will be handled automatically (see Race Condition Handling below).
- Run the parallelized processes.
Example
from pearpy.pear import Pear
# First function to be parallelized
def t1(num1, num2):
    print('t1: ', num1 + num2)

# Second function to be parallelized
def t2(num):
    print('t2: ', num)
# Create pear object, add threads, and run
pear = Pear()
pear.add_thread(t1, [4, 5])
pear.add_thread(t2, 4)
pear.run()
Race Condition Handling
When multiple threads use the same function, Pear automatically generates a lock for each shared resource. This lets developers use Pear's multithreading without worrying about inaccurate data caused by race conditions. The following example shows how race conditions are handled:
from pearpy.pear import Pear
global_var = 10
# This function reads from and writes to a global variable
def t_duplicated(num):
    global global_var
    print('t_duplicated: ', num + global_var)
    global_var += 1
# Pear object created with two threads accessing a shared resource
# A race condition is detected and locks are generated
pear = Pear()
pear.add_thread(t_duplicated, 1) # This should print 11 because 1 + 10 = 11
pear.add_thread(t_duplicated, 1) # This should print 12 because global_var was incremented
pear.run()
##########
# OUTPUT #
##########
t_duplicated: 11
t_duplicated: 12
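For intuition, the behavior above resembles guarding each shared function with a lock from Python's standard threading module. The following is a minimal sketch of that general technique; the names lock_registry and run_locked are hypothetical, and this is not Pearpy's actual implementation:

import threading

# One lock per target function, so threads that share a function
# serialize their access to shared state.
lock_registry = {}

def run_locked(func, *args):
    # Create (or reuse) a lock keyed by the function itself
    lock = lock_registry.setdefault(func, threading.Lock())
    with lock:
        func(*args)

global_var = 10

def t_duplicated(num):
    global global_var
    print('t_duplicated: ', num + global_var)
    global_var += 1

threads = [threading.Thread(target=run_locked, args=(t_duplicated, 1)),
           threading.Thread(target=run_locked, args=(t_duplicated, 1))]
for t in threads:
    t.start()
for t in threads:
    t.join()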
Benchmarks and Tests
Benchmarks can be examined via the make benchmark command. This will display the threaded vs. unthreaded runtimes on a set script, along with the percent improvement between the two. Here is an example of what the benchmarks should look like:
----------------------------------------------------------------------
THREADED BENCHMARK
3.8507602214813232 s
----------------------------------------------------------------------
UNTHREADED BENCHMARK
13.90523624420166 s
----------------------------------------------------------------------
Improvement: 361.1036638072611 %
.
----------------------------------------------------------------------
Ran 1 test in 17.757s
OK
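Judging by the sample output, the improvement figure is the unthreaded runtime divided by the threaded runtime, expressed as a percentage (13.905 s / 3.851 s ≈ 361.1 %). The sketch below illustrates that style of measurement; the task and iteration counts are placeholders, and this is not the contents of the actual benchmark script:

import time
from pearpy.pear import Pear

# Stand-in task; the real benchmark script differs
def work(n):
    total = 0
    for i in range(n):
        total += i
    return total

# Unthreaded: run the tasks back to back
start = time.perf_counter()
for _ in range(4):
    work(5_000_000)
unthreaded = time.perf_counter() - start

# Threaded: run the same tasks through Pear
pear = Pear()
for _ in range(4):
    pear.add_thread(work, 5_000_000)
start = time.perf_counter()
pear.run()
threaded = time.perf_counter() - start

print('THREADED BENCHMARK\n', threaded, 's')
print('UNTHREADED BENCHMARK\n', unthreaded, 's')
print('Improvement:', unthreaded / threaded * 100, '%')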
To run tests, use the make test command. This will output the results of the functions called in the /tests/test_pear.py script, along with the status of the tests themselves. The console will display 'OK' if the tests pass.
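For reference, a minimal unittest-style test against the documented API might look like the sketch below. The test name and assertions are hypothetical; the real ones live in /tests/test_pear.py:

import unittest
from pearpy.pear import Pear

class TestPear(unittest.TestCase):
    def test_threads_complete(self):
        results = []
        pear = Pear()
        pear.add_thread(results.append, 1)
        pear.add_thread(results.append, 2)
        pear.run()
        # Both threads should have finished by the time run() returns
        self.assertEqual(sorted(results), [1, 2])

if __name__ == '__main__':
    unittest.main()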
Contributing
Pear is open source, and contributions from anyone are welcome. To contribute to this project, please submit issues and pull requests via GitHub. In order to successfully merge a pull request, all unit tests must pass when run via make test. Thank you!
File details
Details for the file pearpy-0.1.3.tar.gz.
File metadata
- Download URL: pearpy-0.1.3.tar.gz
- Upload date:
- Size: 4.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.2 importlib_metadata/4.6.3 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.0 CPython/3.8.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | 6ccb01ada483a92e53bd5b662fe65e603ae64de2063f207f61d6c6686a6d91b7
MD5 | 92b21e2b664d4d9ace429ecba8cf4464
BLAKE2b-256 | e4fa3847d8ff9b4455835634d3176469e60d14a0b0a9d8df37e76d0f692a49b5
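To verify a downloaded file against these digests, you can compute the hash locally, for example with Python's hashlib (the filename is assumed to be the tarball above):

import hashlib

# Compare the local digest of the downloaded archive against
# the published SHA256 value above.
with open('pearpy-0.1.3.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = '6ccb01ada483a92e53bd5b662fe65e603ae64de2063f207f61d6c6686a6d91b7'
print('OK' if digest == expected else 'MISMATCH')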