Quick parallel processing library
Project description
This library runs custom functions in parallel using either a thread pool or a process pool executor.
Requirements
- tqdm (installed automatically if missing)
- Python 3.4 or higher
Installation
Install using pip:
pip install quick_parallel
Main Methods
quick_parallel: a function that runs a worker function over an iterable in parallel, using either a process pool or a thread pool.
Usage
from quick_parallel.process import quick_parallel

def worker(x):
    """
    Compute x ** 3 * 2.
    """
    return x ** 3 * 2

def worker_with_arguments(x, p):
    """
    Raise a number `x` to the power of `p`.

    :param x: Base number
    :param p: Power to which `x` is raised
    :return: Result of x raised to the power of p
    """
    return x ** p

if __name__ == "__main__":
    # Example: simple worker without extra arguments
    gen_d = range(5)  # Replace with your actual generator or iterable

    # Run in parallel processes with 2 workers, with threading disabled
    lm = quick_parallel(worker, gen_d, use_thread=False, n_workers=2,
                        progress_message="Running in simple worker:")

    # Collect and print the results
    res = [i for i in lm]
    print(res)
    # Output: [0, 2, 16, 54, 128]

    # Example: worker with an extra positional argument
    arg = 9  # Example positional argument (power)
    ext_arg = quick_parallel(worker_with_arguments, gen_d, arg, use_thread=False, n_workers=2,
                             progress_message="Running worker_with_arguments function:")
    print(list(ext_arg))
    # Output: [0, 1, 512, 19683, 262144]

    # Change the argument and check the updated results
    arg = 10  # Example positional argument (power)
    ext_arg = quick_parallel(worker_with_arguments, gen_d, arg, use_thread=False, n_workers=2,
                             progress_message="Running worker_with_arguments function:")
    print(list(ext_arg))
    # Output: [0, 1, 1024, 59049, 1048576]
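The calls above suggest a signature of roughly `quick_parallel(func, iterable, *args, use_thread=..., n_workers=..., progress_message=...)`. As a point of reference, here is a minimal sketch with that shape built on the standard library's `concurrent.futures` — an illustration of the interface, not the library's actual implementation (the name `quick_parallel_sketch` is hypothetical):

```python
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor
from itertools import repeat


def quick_parallel_sketch(func, iterable, *args, use_thread=False, n_workers=2,
                          progress_message=""):
    """Yield func(item, *args) for each item, using a thread or process pool."""
    if progress_message:
        print(progress_message)
    pool_cls = ThreadPoolExecutor if use_thread else ProcessPoolExecutor
    with pool_cls(max_workers=n_workers) as pool:
        # repeat(a) supplies the same extra argument alongside every item;
        # Executor.map stops at the shortest iterable, i.e. the input data.
        yield from pool.map(func, iterable, *(repeat(a) for a in args))


if __name__ == "__main__":
    # pow(x, 2) for x in 0..4
    print(list(quick_parallel_sketch(pow, range(5), 2, use_thread=True)))
    # Output: [0, 1, 4, 9, 16]
```

Extra positional arguments are broadcast to every item via `itertools.repeat`, which keeps the worker picklable for the process-pool case (a lambda wrapper would not be).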
Notes
- When using threads (use_thread=True), the function uses ThreadPoolExecutor for parallel processing; otherwise it uses ProcessPoolExecutor.
- If n_workers is not specified, the function defaults to 40% of the available CPU cores.
- Progress information is displayed during execution.
- Always run the code under if __name__ == "__main__": to prevent multiprocessing issues.
- Define worker functions in a separate script and import them into the processing script for better modularity.
- Pass a generator as the iterable (instead of a list, tuple, or NumPy array) to save memory.
- If the worker function returns large datasets (e.g., DataFrames), store the data in a file (e.g., an SQL database) to save memory and avoid performance slowdowns.
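The 40% default for n_workers noted above can be illustrated with a small standard-library helper — a sketch of the documented behavior, not the library's own code (the name `default_n_workers` is hypothetical):

```python
import os


def default_n_workers(fraction=0.4):
    """Return about 40% of the available CPU cores, but always at least 1."""
    cores = os.cpu_count() or 1  # os.cpu_count() may return None
    return max(1, int(cores * fraction))


print(default_n_workers())  # e.g. 3 on an 8-core machine
```

The `max(1, ...)` clamp matters on small machines, where 40% of the core count would otherwise round down to zero workers.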
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file quick_parallel-0.0.1.3.tar.gz.
File metadata
- Download URL: quick_parallel-0.0.1.3.tar.gz
- Upload date:
- Size: 5.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7f33708953b262a79ea0d7dab5c1aeec0d423ba27e808bd8e0f8a371c5da56bd |
| MD5 | d716a357c9f4fd15590f04b0f8f793e9 |
| BLAKE2b-256 | 112f52c2dbdace5ffa6e8d74b9a73560f8ed63429701b50128d28387e669c77f |
File details
Details for the file quick_parallel-0.0.1.3-py3-none-any.whl.
File metadata
- Download URL: quick_parallel-0.0.1.3-py3-none-any.whl
- Upload date:
- Size: 6.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | abb369d1acdfd4c8510f8e37aa1324a8ffccdde04802907f2d1be5af1c91a7cd |
| MD5 | ea3931e2221e052cc13b6499a7d35bdc |
| BLAKE2b-256 | 4f7315915dc4c5e4a4edc7f6cb6ea2e7682e96eef5878ff85ce1dc1aa6afe7de |