
Takes the hassle out of multiprocessing functions and parsing big datasets.

Project description

Python ProcessIO

Run functions with multiprocessing, hassle-free.


A nice package that helps you run functions with multiprocessing and get their results.

Installation

pip install processio

Usage

Run a function in its own process
import time
from processio import ProcessIO


def get_company_name():
    # Do work in own process
    time.sleep(5)
    return 'Adapted'


def do_some_work():
    get_name = ProcessIO(get_company_name)

    # do stuff or run a while loop to wait for result

    while get_name.doing_work():
        print('Waiting for the process to finish')

    # You can also call .result() and the main process will wait
    # for the worker process to return your result.

    company_name = get_name.result()

    print(company_name) # Outputs -> Adapted


if __name__ == '__main__': # <- the __main__ guard is required when using multiprocessing
    do_some_work() 

Main commands

# Import the module
from processio import ProcessIO

# Start your function with or without arguments
var = ProcessIO(function, args, kwargs)

# Wait for the function to finish, or get the result right away if it has already finished
var.result()
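
For example, a call with arguments might look like the sketch below. The convention of passing positional arguments as a tuple and keyword arguments as a dict simply mirrors the line above and is an assumption, so verify it against the package source; greet and its parameters are made up for this illustration.

import time
from processio import ProcessIO


def greet(name, punctuation='!'):
    # Hypothetical worker function, used only for this sketch
    time.sleep(1)
    return 'Hello ' + name + punctuation


def do_some_work():
    # Assumption: positional args as a tuple, keyword args as a dict,
    # mirroring ProcessIO(function, args, kwargs) above
    greeting = ProcessIO(greet, ('Adapted',), {'punctuation': '!'})

    print(greeting.result()) # Outputs -> Hello Adapted!


if __name__ == '__main__':
    do_some_work()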

Optional commands

# Check if the process is still working on your function.
# This will return True if the function has not completed yet.

var.doing_work()
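
If you poll doing_work() in a loop, it is usually worth sleeping briefly between checks so the main process does not spin at full CPU while it waits. A minimal sketch using only the calls documented above (slow_task is a stand-in function for this example):

import time
from processio import ProcessIO


def slow_task():
    # Stand-in for any long-running function
    time.sleep(3)
    return 42


def do_some_work():
    task = ProcessIO(slow_task)

    # Poll for completion without busy-waiting
    while task.doing_work():
        time.sleep(0.1)

    print(task.result()) # Outputs -> 42


if __name__ == '__main__':
    do_some_work()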

Use ParseIO to save a lot of time on list parsing

import time
from processio import ParseIO


def list_parser(lines):
    # get_total_amount() is a placeholder for your own per-line work
    result = 0
    for line in lines:
        result += get_total_amount(line)
    return result


def do_some_work():
    # This will split the list into 4 chunks and run the function on
    # 4 different processes, which in most cases speeds the work up
    # by close to a factor of 4.

    # You can set the number of processes you want to run; by default
    # the module uses the system's CPU core count minus 1.

    parser = ParseIO(list_parser, huge_list)

    # do stuff or run a while loop to wait for result

    while parser.doing_work():
        print('Waiting for the processes to finish')

    # You can also call .result() and the main process will wait
    # for the worker processes to return your results.

    result = parser.result()

    # The result comes back as one list entry per process, so in this
    # case we get a list of 4 numbers that we can loop through.

    print(result) # Outputs -> [1000, 1000, 1000, 1000] <- example


    total = 0
    for res in result:
        total += res


    print(total) # Outputs -> 4000 <- example

if __name__ == '__main__': # <- the __main__ guard is required when using multiprocessing
    do_some_work() 
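
For comparison, the chunk-and-fan-out pattern described above can also be written with the standard library. The sketch below only illustrates the pattern (split the list into roughly equal chunks, map a worker over them in separate processes, collect one partial result per chunk); it is not the package's actual implementation.

import os
from concurrent.futures import ProcessPoolExecutor


def list_parser(lines):
    # Same shape of worker as above: reduce one chunk to one number
    return sum(len(line) for line in lines)


def parse_in_chunks(huge_list, workers=None):
    # Default mirrors the behaviour described above: CPU cores - 1
    workers = workers or max((os.cpu_count() or 2) - 1, 1)
    chunk_size = -(-len(huge_list) // workers)  # ceiling division
    chunks = [huge_list[i:i + chunk_size]
              for i in range(0, len(huge_list), chunk_size)]

    with ProcessPoolExecutor(max_workers=workers) as pool:
        # One result per chunk, i.e. one entry per process, like ParseIO
        return list(pool.map(list_parser, chunks))


if __name__ == '__main__':
    data = ['some', 'lines', 'of', 'text'] * 1000
    partials = parse_in_chunks(data)
    print(partials)      # one partial total per chunk
    print(sum(partials)) # combined total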

Testing

Use the following command to run tests.

python -m unittest threadit.tests.test_threadit

Changelog:

See CHANGELOG.md

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

processio-1.2.0.tar.gz (3.7 kB)

Uploaded Source

Built Distribution

processio-1.2.0-py3-none-any.whl (4.2 kB)

Uploaded Python 3

File details

Details for the file processio-1.2.0.tar.gz.

File metadata

  • Download URL: processio-1.2.0.tar.gz
  • Upload date:
  • Size: 3.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.8.3

File hashes

Hashes for processio-1.2.0.tar.gz

  • SHA256: 15c981b3c44a2650b5b58b196543a73cccb81ad39b191cfccb1af7f4fe6980aa
  • MD5: 8b0ef082c8668cdadb5bb1467fbcda4d
  • BLAKE2b-256: 9631150dd1bffdff56253a4ae2e0e0ecd7e47fdc5b950764790635517dc8a299


File details

Details for the file processio-1.2.0-py3-none-any.whl.

File metadata

  • Download URL: processio-1.2.0-py3-none-any.whl
  • Upload date:
  • Size: 4.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.8.3

File hashes

Hashes for processio-1.2.0-py3-none-any.whl

  • SHA256: cea098a847f97182fd4c30bab1af562bab8c25b01cfe60f1f477103664295913
  • MD5: a6bf38764ef064f370f6126f13e64279
  • BLAKE2b-256: c5832c38bc0af1227ccbc14d104b2ecdb947b6e21fc904c20a7bde949a95815e

