CPU parallelism for Trio

Do you have CPU-bound work that just keeps slowing down your Trio event loop no matter what you try? Do you need to get all those cores humming at once? This is the library for you!

The aim of trio-parallel is to use the lightest-weight, lowest-overhead, lowest-latency method to achieve CPU parallelism of arbitrary Python code with a dead-simple API.

Resources

  • License: MIT -or- Apache License 2.0
  • Documentation
  • Chat: chatroom
  • Forum
  • Issues
  • Repository
  • Tests, coverage, and code style: see the project badges
  • Distribution: latest PyPI version, supported Python versions and interpreters

Example

import multiprocessing
import trio
import trio_parallel
import time


def hard_work(n, x):
    # Busy-loop for n seconds, flipping x the whole time, to stand in for
    # CPU-bound work that would otherwise stall the event loop.
    t = time.perf_counter() + n
    y = x
    while time.perf_counter() < t:
        x = not x
    print(y, "transformed into", x)
    return x


async def too_slow():
    # cancellable=True lets Trio kill this worker outright when the
    # enclosing cancel scope fires.
    await trio_parallel.run_sync(hard_work, 20, False, cancellable=True)


async def amain():
    t0 = time.perf_counter()
    async with trio.open_nursery() as nursery:
        nursery.start_soon(trio_parallel.run_sync, hard_work, 2, True)
        nursery.start_soon(trio_parallel.run_sync, hard_work, 1, False)
        nursery.start_soon(too_slow)
        result = await trio_parallel.run_sync(hard_work, 1.5, None)
        nursery.cancel_scope.cancel()
    print("got", result, "in", time.perf_counter() - t0, "seconds")
    # prints 2.xxx


if __name__ == "__main__":
    multiprocessing.freeze_support()  # needed if this script is frozen into an executable; a no-op otherwise
    trio.run(amain)

Additional examples and the full API are available in the documentation.

Features

  • Bypasses the GIL for CPU-bound work

  • Minimal API complexity

    • looks and feels like Trio threads (see the sketch just after this list)

  • Minimal internal complexity

    • No reliance on multiprocessing.Pool, ProcessPoolExecutor, or any background threads

  • Cross-platform

  • print just works

  • Automatic, opportunistic use of cloudpickle

  • Automatic LIFO caching of subprocesses

  • Cancel seriously misbehaving code

    • currently via SIGKILL/TerminateProcess

  • Convert segfaults and other scary things to catchable errors
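
Because the call signature mirrors trio.to_thread.run_sync, moving CPU-bound work off a thread and into a worker process is usually a one-line change. Here is a minimal sketch of that symmetry; the fib function is just an illustrative stand-in for real CPU-bound work:

import trio
import trio_parallel


def fib(n):
    # Deliberately CPU-bound: naive recursive Fibonacci.
    return n if n < 2 else fib(n - 1) + fib(n - 2)


async def main():
    # A thread keeps the event loop responsive, but the GIL still
    # serializes pure-Python number crunching...
    await trio.to_thread.run_sync(fib, 30)
    # ...whereas the equivalent worker-process call bypasses the GIL.
    await trio_parallel.run_sync(fib, 30)


if __name__ == "__main__":
    trio.run(main)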

FAQ

How does trio-parallel run Python code in parallel?

Currently, this project is based on multiprocessing subprocesses and has all the usual multiprocessing caveats (freeze_support, pickleable objects only). The case for basing these workers on multiprocessing is that it keeps a lot of complexity outside of the project while offering a set of quirks that users are likely already familiar with. The pickling limitations can be partially alleviated by installing cloudpickle.
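
For instance, with cloudpickle installed, objects that the stdlib pickle rejects, such as lambdas and locally defined functions, can often be shipped to a worker. A minimal sketch, assuming cloudpickle is importable in your environment (whether a particular object round-trips still depends on what cloudpickle can serialize):

import trio
import trio_parallel


async def main():
    # Plain pickle cannot serialize a lambda; when cloudpickle is installed,
    # trio-parallel uses it automatically and this call can succeed.
    doubled = await trio_parallel.run_sync(lambda x: x * 2, 21)
    print(doubled)  # 42


if __name__ == "__main__":
    trio.run(main)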

Can I have my workers talk to each other?

This is currently possible through the use of multiprocessing.Manager, but we don’t and will not officially support it.

This package focuses on providing a flat hierarchy of worker subprocesses to run synchronous, CPU-bound functions. If you are looking to create a nested hierarchy of processes communicating asynchronously with each other, while preserving the power, safety, and convenience of structured concurrency, look into tractor. Or, if you are looking for a more customized solution, try using trio.run_process to spawn additional Trio runs and have them talk to each other over sockets.

Can I let my workers outlive the main Trio process?

The worker processes are started with the daemon flag for lifetime management, so this use case is not supported.

How should I map a function over a collection of arguments?

This is entirely possible, but we leave the implementation up to you. Think of us as a loky for your joblib, but natively async and Trionic. Some example parallelism patterns can be found in the documentation, and a minimal sketch follows below. You might also look into trimeter.
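
As one possible shape for this, here is a sketch of an order-preserving parallel map built only on a nursery and run_sync. The crunch function and parallel_map helper are hypothetical names introduced for illustration, and run_sync's default limiter should keep the number of concurrent workers bounded:

import trio
import trio_parallel


def crunch(x):
    # Stand-in for a real CPU-bound function.
    return x * x


async def parallel_map(fn, inputs):
    inputs = list(inputs)
    results = [None] * len(inputs)

    async def worker(i, arg):
        # Write each result into its own slot so input order is preserved.
        results[i] = await trio_parallel.run_sync(fn, arg)

    async with trio.open_nursery() as nursery:
        for i, arg in enumerate(inputs):
            nursery.start_soon(worker, i, arg)
    return results


async def main():
    print(await parallel_map(crunch, range(10)))


if __name__ == "__main__":
    trio.run(main)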

Contributing

If you notice any bugs, need any help, or want to contribute any code, GitHub issues and pull requests are very welcome! Please read the code of conduct.

