multiprocessing without __main__

Project description


pip install multiprocnomain

Advantages:

  • Lets you parallelize functions without placing the call inside an `if __name__ == "__main__":` block or at the top level of a module.
  • Facilitates the parallel execution of functions with different input parameters, providing flexibility.
  • Improves performance by leveraging multiprocessing, especially for computationally intensive tasks.
  • Automatically aggregates results into a dictionary, simplifying result mapping to input indices.
  • Customizable parameters (processes and chunks) to adapt to specific requirements and hardware capabilities.
  • Suitable for various use cases, including data science, machine learning, image processing, and scientific computing.
Parameters:
	- fu (callable): The function to be executed in parallel.
	- it (iterable): An iterable of dictionaries, each containing the input parameters for the function.
	- processes (int): The number of processes to use (default is 3).
	- chunks (int): The chunk size for multiprocessing.Pool.starmap (default is 1).

Returns:
	dict: A dictionary containing the results of the parallel executions, where keys correspond to the indices
		of the input iterable and values contain the corresponding function outputs.
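
A minimal sketch of this return contract, with hypothetical names (`start_multiprocessing_sketch`, `add`) that are not part of the library: it uses threads instead of processes purely to illustrate how results are keyed by the index of each input dictionary, while the real library presumably dispatches to a process pool internally.

```python
from concurrent.futures import ThreadPoolExecutor


def start_multiprocessing_sketch(fu, it, processes=3, chunks=1):
    # Submit one call per input dict; key each future by the input's index.
    with ThreadPoolExecutor(max_workers=processes) as ex:
        futures = {i: ex.submit(fu, **kwargs) for i, kwargs in enumerate(it)}
        # Collect results into a dict keyed by input index, like the library does.
        return {i: f.result() for i, f in futures.items()}


def add(a, b):
    return a + b


print(start_multiprocessing_sketch(add, [{"a": 1, "b": 2}, {"a": 3, "b": 4}]))
# {0: 3, 1: 7}
```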

Examples:
	import random
	from multiprocnomain import start_multiprocessing
	import subprocess
	from a_cv_imwrite_imread_plus import open_image_in_cv
	import numpy as np


	def somefu(q=100):
		exec("import random", globals())  # necessary: the worker imports its own dependencies
		y = random.randint(10, 20)
		for x in range(q):
			y = y + x
		return y


	it = [{"q": 100}, {"q": 100}, {"q": 100}, {"q": 10}]
	b2 = start_multiprocessing(fu=somefu, it=it, processes=3, chunks=1)
	print(b2)


	def somefu2(path):
		exec("import subprocess", globals())  # necessary
		y = subprocess.run(["ls", path], capture_output=True)
		return y


	allpath = [{"path": "c:\\windows"}, {"path": "c:\\cygwin"}]
	b1 = start_multiprocessing(fu=somefu2, it=allpath, processes=3, chunks=1)
	print(b1)


	def somefu3(q):
		exec("from a_cv_imwrite_imread_plus import open_image_in_cv", globals())  # necessary
		exec("import numpy as np", globals())  # necessary

		im = open_image_in_cv(q)
		r = im[..., 2]
		g = im[..., 1]
		b = im[..., 0]
		return np.where((r == 255) & (g == 255) & (b == 255))


	allimages = [
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_06_04_51_956747.png"},
		{"q": r"C:\Users\hansc\Pictures\bw_clickabutton.png"},
		{"q": r"C:\Users\hansc\Pictures\cgea.png"},
		{"q": r"C:\Users\hansc\Pictures\checkboxes.png"},
		{"q": r"C:\Users\hansc\Pictures\clickabutton.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_24_31_797203.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_25_48_657510.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_26_16_431863.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_27_07_483808.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_27_41_985343.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_28_16_529438.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_28_55_105250.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_29_11_492492.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_05_38_13_226848.png"},
		{"q": r"C:\Users\hansc\Pictures\collage_2023_04_23_06_04_14_676085.png"},
		{"q": r"C:\Users\hansc\Downloads\IMG-20230618-WA0000.jpeg"},
		{"q": r"C:\Users\hansc\Downloads\maxresdefault.jpg"},
		{"q": r"C:\Users\hansc\Downloads\panda-with-broom-600x500 (1).jpg"},
		{"q": r"C:\Users\hansc\Downloads\panda-with-broom-600x500.jpg"},
		{"q": r"C:\Users\hansc\Downloads\panda-with-broom-600x500222222222.jpg"},
		{"q": r"C:\Users\hansc\Downloads\pexels-alex-andrews-2295744.jpg"},
		{"q": r"C:\Users\hansc\Downloads\pexels-niki-nagy-1128416.jpg"},
	]
	b = start_multiprocessing(fu=somefu3, it=allimages, processes=3, chunks=5)
	print(b)
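
The `# necessary` comments in the examples above mark in-function imports: the worker function is apparently executed in a fresh process that never ran the caller's module-level imports, so each worker pulls its own dependencies into `globals()` at call time. A stdlib-only illustration of that pattern (the `worker` function and `math.factorial` are my own example, not part of the library):

```python
def worker(n=5):
	# Import inside the function body so the dependency exists even when
	# this function runs in a process that never executed the parent
	# module's top-level imports.
	exec("import math", globals())
	return math.factorial(n)


print(worker(5))  # 120
```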

Download files

Source Distribution

multiprocnomain-0.10.tar.gz (6.8 kB)

Uploaded Source

Built Distribution

multiprocnomain-0.10-py3-none-any.whl (8.8 kB)

Uploaded Python 3

File details

Details for the file multiprocnomain-0.10.tar.gz.

File metadata

  • Download URL: multiprocnomain-0.10.tar.gz
  • Size: 6.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.5

File hashes

Hashes for multiprocnomain-0.10.tar.gz
Algorithm Hash digest
SHA256 c98edb84516d31d1430e3d3de7e6e62419b9c6c38a1defcc432b4ec296dc70c5
MD5 30bc4d26d79d4ee4b07ff25e40b913d4
BLAKE2b-256 c451514846aaf83df631d1c006d20f6abb786e062f771f737a65bf5dfb5c65f8


File details

Details for the file multiprocnomain-0.10-py3-none-any.whl.

File hashes

Hashes for multiprocnomain-0.10-py3-none-any.whl
Algorithm Hash digest
SHA256 494357d2d2efd901af54915176e9bde91e71a404bfba44fbeadba5f11cee1361
MD5 6fd490d8d029d1470f922291dc2af6f8
BLAKE2b-256 227b354174b5c3a0cd2a5bea6e40795eeded37eb6a2b1e98a654157c31a0ccbc

