multiprocessing without __main__


pip install multiprocnomain

Advantages:

  • Allows easy parallelization of functions without requiring the call to live in an `if __name__ == "__main__":` block or at a module's top level.
  • Facilitates the parallel execution of functions with different input parameters, providing flexibility.
  • Improves performance by leveraging multiprocessing, especially for computationally intensive tasks.
  • Automatically aggregates results into a dictionary, simplifying result mapping to input indices.
  • Customizable parameters (processes and chunks) to adapt to specific requirements and hardware capabilities.
  • Suitable for various use cases, including data science, machine learning, image processing, and scientific computing.
Parameters:
	- fu (callable): The function to be executed in parallel.
	- it (iterable): An iterable of dictionaries, each containing the input parameters for the function.
	- processes (int): The number of processes to use (default is 3).
	- chunks (int): The chunk size for multiprocessing.Pool.starmap (default is 1).
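As a rough illustration of what the chunks parameter controls (this helper is hypothetical, not part of the package): work items are grouped into batches of this size before being handed to worker processes, which reduces inter-process overhead when individual calls are cheap.

```python
def chunked(seq, size):
    # Hypothetical helper: groups work items into batches of `size`,
    # the way a chunksize batches tasks before dispatch to workers.
    return [seq[i:i + size] for i in range(0, len(seq), size)]

print(chunked([1, 2, 3, 4, 5], 2))  # [[1, 2], [3, 4], [5]]
```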

Returns:
	dict: A dictionary containing the results of the parallel executions, where keys correspond to the indices
		of the input iterable and values contain the corresponding function outputs.
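The return shape can be sketched with a sequential stand-in (a hypothetical illustration only; the real function runs the calls in worker processes):

```python
def run_sequential(fu, it):
    # Hypothetical sequential stand-in for start_multiprocessing:
    # maps each input index to the function's output, mirroring
    # the documented return shape.
    return {i: fu(**kwargs) for i, kwargs in enumerate(it)}

def add_range(q=100):
    # Simple deterministic example function.
    return sum(range(q))

print(run_sequential(add_range, [{"q": 3}, {"q": 5}]))  # {0: 3, 1: 10}
```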

Examples:
	import random
	from multiprocnomain import start_multiprocessing
	import subprocess
	from a_cv_imwrite_imread_plus import open_image_in_cv
	import numpy as np


	def somefu(q=100):
		exec("import random", globals())  # necessary: the import must happen inside the worker function
		y = random.randint(10, 20)
		for x in range(q):
			y = y + x
		return y


	# somefu=lambda r:1111
	it = [{"q": 100}, {"q": 100}, {"q": 100}, {"q": 10}]
	b2 = start_multiprocessing(fu=somefu, it=it, processes=3, chunks=1)
	print(b2)


	def somefu2(path):
		exec("import subprocess", globals())  # necessary
		y = subprocess.run(["ls", path], capture_output=True)
		return y

	allpath = [{"path": "c:\\windows"}, {"path": "c:\\cygwin"}]
	b1 = start_multiprocessing(fu=somefu2, it=allpath, processes=3, chunks=1)
	print(b1)


	def somefu3(q):
		exec("from a_cv_imwrite_imread_plus import open_image_in_cv", globals())  # necessary
		exec("import numpy as np", globals())  # necessary

		im = open_image_in_cv(q)
		r = im[..., 2]
		g = im[..., 1]
		b = im[..., 0]
		return np.where((r == 255) & (g == 255) & (b == 255))


	allimages = [
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_06_04_51_956747.png"},
		{"q": r"C:\Users\hansc\Pictures\bw_clickabutton.png"},
		{"q": r"C:\Users\hansc\Pictures\cgea.png"},
		{"q": r"C:\Users\hansc\Pictures\checkboxes.png"},
		{"q": r"C:\Users\hansc\Pictures\clickabutton.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_24_31_797203.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_25_48_657510.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_26_16_431863.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_27_07_483808.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_27_41_985343.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_28_16_529438.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_28_55_105250.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_29_11_492492.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_38_13_226848.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_06_04_14_676085.png"},
		{"q": r"C:\Users\hansc\Downloads\IMG-20230618-WA0000.jpeg"},
		{"q": r"C:\Users\hansc\Downloads\maxresdefault.jpg"},
		{"q": r"C:\Users\hansc\Downloads\panda-with-broom-600x500 (1).jpg"},
		{"q": r"C:\Users\hansc\Downloads\panda-with-broom-600x500.jpg"},
		{"q": r"C:\Users\hansc\Downloads\panda-with-broom-600x500222222222.jpg"},
		{"q": r"C:\Users\hansc\Downloads\pexels-alex-andrews-2295744.jpg"},
		{"q": r"C:\Users\hansc\Downloads\pexels-niki-nagy-1128416.jpg"},
	]
	b = start_multiprocessing(fu=somefu3, it=allimages, processes=3, chunks=5)
	print(b)
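The `exec("import ...", globals())` lines marked "# necessary" deserve a note: the worker function appears to run in a namespace where the calling module's top-level imports are not available, so each function imports what it needs into its own globals at call time. A minimal illustration of the mechanism in generic Python (not specific to this package):

```python
def worker(n):
    # Importing via exec(..., globals()) binds the module name in the
    # namespace the function executes in, so the lookup of `math`
    # below succeeds even if the surrounding module never imported it.
    exec("import math", globals())
    return math.factorial(n)

print(worker(5))  # 120
```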
