
The simplest way to utilize multiple threads, processes, and async functions in Python


Parallely - Parallel Python made simple


Installation

To install the library, run pip install parallely. The library is deliberately small, exposing just three functions. If you need more detail, the docs are available at ReadTheDocs.

Overview

Dealing with multi-threading, parallel processes, and concurrent functions can be difficult in Python. A lot of boilerplate code is needed, and it is often hard to tell whether it actually improved performance.

In some cases it is necessary to tailor a program to the underlying computational resources. Most of the time, however, we just want to do the same thing many times with small variations. In those scenarios, parallely can make your life much easier.

Multi Threading

Due to the GIL, multi-threading is far less useful in Python than in other languages. However, for I/O-intensive applications it can still be very useful to have multiple threads waiting for responses in parallel instead of waiting for each response sequentially. If this is confusing, there are plenty of tutorials on the GIL and Python threading to help you out.
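For reference, this is roughly the boilerplate that a thread pool requires with just the standard library (an illustrative sketch using concurrent.futures, not parallely's internals):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(name, duration):
    # Simulate an I/O-bound call, e.g. waiting on a web request.
    time.sleep(duration)
    return f"response {name}"

# The pool must be created, sized, and torn down explicitly,
# and the work submitted and collected by hand.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(fetch, [1, 2], [0.2, 0.1]))

print(results)  # ['response 1', 'response 2']
```

Even though the second call finishes first, executor.map returns results in submission order.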

Most of the time we just want to make a series of web requests. For this case, parallely removes the complexity of managing threads with a single decorator, threaded, which is easiest to explain with an example:

import time
from parallely import threaded

@threaded
def thread_function(name, duration, other_arg):
    print(f"Thread {name}: starting", other_arg)
    time.sleep(duration)
    print(f"Thread {name}: finishing", other_arg)


print("Synchronous")
thread_function(1, duration=2, other_arg="hello world")
thread_function(2, duration=1, other_arg="hello world")
# NOTICE: We can use the thread_function the exact way we would expect without any overhead

print()
print("Parallel/Asynchronous")
thread_function.map(name=[1, 2], duration=[2, 1], other_arg="hello world")
# NOTICE: the constant given to 'other_arg' is repeated in all function calls
# thread_function.map([1, 2], [2, 1], "hello world") would produce a similar result

As can be seen, the decorated function can still be used exactly as one would expect, which makes it easy to test. However, it also gains a .map() method, which runs the function in parallel across the given arguments.
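The broadcasting rule used by .map() (a scalar argument is repeated for every call, while lists are zipped element-wise) can be sketched in plain Python. The helper below is purely illustrative and is not parallely's actual implementation:

```python
def broadcast_kwargs(**kwargs):
    """Expand scalar keyword arguments to match the list-valued ones."""
    # Number of calls is determined by the list-valued arguments.
    n = max(len(v) for v in kwargs.values() if isinstance(v, list))
    # Build one kwargs dict per call; scalars repeat, lists are indexed.
    return [
        {k: (v[i] if isinstance(v, list) else v) for k, v in kwargs.items()}
        for i in range(n)
    ]

calls = broadcast_kwargs(name=[1, 2], duration=[2, 1], other_arg="hello world")
print(calls)
# [{'name': 1, 'duration': 2, 'other_arg': 'hello world'},
#  {'name': 2, 'duration': 1, 'other_arg': 'hello world'}]
```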

Multi Processing

Working with multiple processes is one of the only ways around the GIL. However, this approach has problems of its own: in many cases, transferring data between processes takes more time than the actual computation. Sometimes, though, it can dramatically speed things up. parallely makes working with multiple processes just as easy as working with threads; simply use the parallel decorator, as in the example below.

from time import time
from random import randint
from parallely import parallel


@parallel
def count_in_range(size, search_minimum, search_maximum):
    """Return how many numbers in a random array lie within [search_minimum, search_maximum]."""
    rand_arr = [randint(0, 10) for _ in range(int(size))]
    return sum(search_minimum <= n <= search_maximum for n in rand_arr)

size = 1e7

print("Sequential")
start_time = time()
for _ in range(4):
    result = count_in_range(size, search_minimum=1, search_maximum=2)
    print(result, round(time() - start_time, 2), "seconds")

print()

print("Parallel")
start_time = time()
result = count_in_range.map(size=[size, size, size, size], search_minimum=1, search_maximum=2)
print(result, round(time() - start_time, 2), "seconds")

Asynchronous

The asynced decorator does the same for coroutine functions: the decorated function can be called synchronously, and it also gains a .map() method.

import asyncio
import time
from random import randint
from parallely import asynced


async def echo(delay, start_time):
    await asyncio.sleep(randint(0, delay))
    print(delay, round(time.time() - start_time, 1))

@asynced
async def main(counts):
    start_time = time.time()
    print(f"started at {time.strftime('%X')}")
    
    coros = []
    for count in range(counts):
        coros.append(echo(count, start_time))

    await asyncio.gather(*coros)

    print(f"finished at {time.strftime('%X')}")

# The asynchronous function can now be called in a synchronous manner without awaiting it
main(10)

# It can also be called in a parallel manner
main.map([5, 5])
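Without a decorator, calling a coroutine from synchronous code means managing the event loop yourself, typically via asyncio.run. The sketch below shows that plain-asyncio pattern for comparison; it does not use parallely:

```python
import asyncio

async def echo(delay):
    await asyncio.sleep(delay)
    return delay

async def main(count):
    # Schedule all coroutines concurrently and wait for all of them.
    return await asyncio.gather(*(echo(i / 100) for i in range(count)))

# Plain asyncio: the caller must start the event loop explicitly.
results = asyncio.run(main(3))
print(results)  # [0.0, 0.01, 0.02]
```

asyncio.gather returns results in the order the coroutines were passed in, regardless of completion order.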



Download files


Source Distribution

parallely-0.2.6.tar.gz (196.4 kB)

Uploaded Source

Built Distribution


parallely-0.2.6-py3-none-any.whl (17.6 kB)

Uploaded Python 3

File details

Details for the file parallely-0.2.6.tar.gz.

File metadata

  • Download URL: parallely-0.2.6.tar.gz
  • Upload date:
  • Size: 196.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.8 CPython/3.9.7 Darwin/20.6.0

File hashes

Hashes for parallely-0.2.6.tar.gz
  • SHA256: 90cf3ed4df68568f15add4acacc4d09c5e1a254e6a18efe985676f424a8308c4
  • MD5: f3dfb3ad305d85276ebbc3715e6987bc
  • BLAKE2b-256: 8d0f9683812e494c74db63710402ddc2c065a2afaf9ed725fcdf095cc72e32bb


File details

Details for the file parallely-0.2.6-py3-none-any.whl.

File metadata

  • Download URL: parallely-0.2.6-py3-none-any.whl
  • Upload date:
  • Size: 17.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.8 CPython/3.9.7 Darwin/20.6.0

File hashes

Hashes for parallely-0.2.6-py3-none-any.whl
  • SHA256: 2855689d84a1dd95237f8d46b8cef4f063c449e5c279b3cb57ebb8eec50c4611
  • MD5: 294c6b3cb82e003eb06fb3ef9c9ed00d
  • BLAKE2b-256: 72e23d9363375c6b81232072734727a8a6fec1c880b7e57219b31404b3118e1c

