asyncio-concurrent-tasks
Tooling to run asyncio tasks.
- Background task
- Periodic task
- Thread safe task pool
- Threaded task pool
- Restartable task
- Loop exception handler
Background task
A task that runs in the background until cancelled. Can be used as a context manager.
Example usage:
import asyncio
from typing import Callable, Awaitable

from concurrent_tasks import BackgroundTask


class HeartBeat(BackgroundTask):
    def __init__(self, interval: float, func: Callable[[], Awaitable]):
        super().__init__(self._run, interval, func)

    async def _run(self, interval: float, func: Callable[[], Awaitable]) -> None:
        while True:
            await func()
            await asyncio.sleep(interval)
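A usage sketch of the `HeartBeat` class defined above. It assumes `BackgroundTask` subclasses behave as async context managers whose enter starts the task and whose exit cancels it (the text above only says "context manager"); `ping` is a hypothetical coroutine:

import asyncio


async def ping() -> None:
    print("ping")


async def main() -> None:
    # Assumption: entering starts the background task, exiting cancels it.
    async with HeartBeat(interval=1.0, func=ping):  # HeartBeat as defined above
        await asyncio.sleep(5)  # the heartbeat keeps running in the background


asyncio.run(main())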
Periodic task
A task that runs periodically in the background until cancelled. Can be used as a context manager. If the function takes longer than the interval to execute, there is no guarantee that the time between calls is exactly the interval.
Example usage:
from typing import Callable, Awaitable

from concurrent_tasks import PeriodicTask


class HeartBeat(PeriodicTask):
    def __init__(self, interval: float, func: Callable[[], Awaitable]):
        super().__init__(interval, func)
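Alternatively, a sketch of using `PeriodicTask` directly without subclassing, again assuming async-context-manager semantics and using a hypothetical `ping` coroutine:

import asyncio

from concurrent_tasks import PeriodicTask


async def ping() -> None:
    print("ping")


async def main() -> None:
    # Assumption: entering starts the periodic task, exiting cancels it.
    async with PeriodicTask(1.0, ping):
        await asyncio.sleep(5)  # `ping` runs roughly once per second


asyncio.run(main())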
Thread safe task pool
The goal is to be able to safely run tasks from other threads.
Parameters:
- `size` can be a positive integer to limit the number of tasks running concurrently.
- `timeout` can be set to define a maximum running time for each task, after which it will be cancelled. Note: this excludes time spent waiting to be started (time spent in the buffer).
Example usage:
from concurrent_tasks import ThreadSafeTaskPool


async def func():
    ...


async with ThreadSafeTaskPool() as pool:
    # Create and run the task.
    result = await pool.run(func())
    # Create a task, the `concurrent.Future` will hold information about completion.
    future = pool.create_task(func())
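A sketch combining the parameters above with submission from a worker thread. It assumes `size` and `timeout` are keyword arguments and that `create_task` can be called from another thread (the pool's stated purpose), returning a `concurrent.futures.Future`:

import asyncio
import threading

from concurrent_tasks import ThreadSafeTaskPool


async def func() -> int:
    await asyncio.sleep(0.1)
    return 42


async def main() -> None:
    # Assumption: `size` and `timeout` are keyword arguments.
    async with ThreadSafeTaskPool(size=10, timeout=5.0) as pool:
        def submit_from_thread() -> None:
            # Submit from a plain thread and block on the `concurrent.futures.Future`.
            future = pool.create_task(func())
            assert future.result() == 42

        thread = threading.Thread(target=submit_from_thread)
        thread.start()
        # Join the worker thread without blocking the event loop.
        await asyncio.to_thread(thread.join)


asyncio.run(main())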
Threaded task pool
Run async tasks in a dedicated thread with its own event loop. Under the hood, `ThreadSafeTaskPool` is used.
Parameters:
- `name` will be used as the thread's name.
- `size` and `timeout`: see `ThreadSafeTaskPool`.
- `context_manager` can optionally be one or more context managers that will be entered when the loop has started and exited before the loop is stopped.
💡 All tasks will be completed when the pool is stopped.
💡 The blocking and async versions behave the same; prefer the async version if client code is async.
Loop initialization
⚠️ Asyncio primitives need to be instantiated with the proper event loop.
To achieve that, use a context manager wrapping instantiation of objects:
from functools import partial

from concurrent_tasks import ThreadedPoolContextManagerWrapper, AsyncThreadedTaskPool

pool = AsyncThreadedTaskPool(
    context_manager=ThreadedPoolContextManagerWrapper(partial(MyObj, ...))
)
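To make the warning concrete, here is a hypothetical `MyObj` whose constructor creates asyncio primitives; because the wrapper defers instantiation until the pool's loop is running, those primitives are created inside the pool's thread rather than the caller's:

import asyncio


class MyObj:
    """Hypothetical object holding asyncio primitives."""

    def __init__(self, maxsize: int = 10) -> None:
        # Constructed inside the pool's thread, after its loop has started.
        self.queue: asyncio.Queue = asyncio.Queue(maxsize=maxsize)
        self.ready = asyncio.Event()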
Blocking
This can be used to run async functions in a dedicated event loop, while keeping that loop running to handle background tasks.
Example usage:
from concurrent_tasks import BlockingThreadedTaskPool


async def func():
    ...


with BlockingThreadedTaskPool() as pool:
    # Create and run the task.
    result = pool.run(func())
    # Create a task, the `concurrent.Future` will hold information about completion.
    future = pool.create_task(func())
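A sketch that adds the constructor parameters listed earlier; `name`, `size` and `timeout` are assumed here to be keyword arguments of the pool:

from concurrent_tasks import BlockingThreadedTaskPool


async def func() -> str:
    return "done"


def main() -> None:
    # Assumption: `name`, `size` and `timeout` are keyword arguments.
    with BlockingThreadedTaskPool(name="worker-loop", size=10, timeout=5.0) as pool:
        # `run` blocks the calling (synchronous) thread until the coroutine completes.
        print(pool.run(func()))


main()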
Async
Threads can be useful in cooperation with asyncio to let the OS guarantee fair resource distribution between threads. This is especially useful when you cannot know whether called code will properly cooperate with the event loop.
Example usage:
from concurrent_tasks import AsyncThreadedTaskPool


async def func():
    ...


async with AsyncThreadedTaskPool() as pool:
    # Create and run the task.
    result = await pool.run(func())
    # Create a task, the `asyncio.Future` will hold information about completion.
    future = pool.create_task(func())
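To illustrate the point about uncooperative code, a sketch in which a hypothetical `uncooperative` coroutine blocks its loop with `time.sleep`; confined to the pool's thread, it leaves the main loop responsive:

import asyncio
import time

from concurrent_tasks import AsyncThreadedTaskPool


async def uncooperative() -> None:
    # Hypothetical third-party code that blocks instead of awaiting.
    time.sleep(1)


async def main() -> None:
    async with AsyncThreadedTaskPool() as pool:
        ticker = asyncio.create_task(asyncio.sleep(0.1))
        # The blocking call runs in the pool's thread and loop,
        # so the main loop keeps running while we await the result.
        await pool.run(uncooperative())
        assert ticker.done()  # the main loop stayed responsive


asyncio.run(main())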
Restartable task
A task that can be started and cancelled multiple times until it can finally be completed. This is useful to handle pauses and retries when dealing with a connection.
💡 Use `functools.partial` to pass parameters to the function.
Example usage:
from functools import partial

from concurrent_tasks import RestartableTask


async def send(data): ...


task: RestartableTask[int] = RestartableTask(partial(send, b"\x00"), timeout=1)
task.start()
assert await task == 1

# Running in other tasks:
# On connection lost:
task.cancel()
# On connection resumed:
task.start()
# On response received:
task.set_result(1)
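A slightly fuller sketch of the connection scenario described above; the `Connection` class and its callbacks are hypothetical and only show where the task methods would be wired in:

import asyncio
from functools import partial

from concurrent_tasks import RestartableTask


async def send(data: bytes) -> None:
    ...  # write `data` on the wire


class Connection:
    """Hypothetical connection wiring its events to a RestartableTask."""

    def __init__(self) -> None:
        self.task: RestartableTask[int] = RestartableTask(partial(send, b"\x00"), timeout=1)

    def on_connected(self) -> None:
        # (Re)start the request attempt.
        self.task.start()

    def on_connection_lost(self) -> None:
        # Pause: cancel the in-flight attempt until the connection resumes.
        self.task.cancel()

    def on_response(self, value: int) -> None:
        # Complete the task; whoever awaits it receives `value`.
        self.task.set_result(value)


async def main() -> None:
    conn = Connection()
    conn.on_connected()
    # Simulate a response arriving later from the connection.
    asyncio.get_running_loop().call_later(0.1, conn.on_response, 1)
    assert await conn.task == 1


asyncio.run(main())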
Loop exception handler
Shut down the process when an unhandled exception is caught or a signal is received. To make this a graceful stop, pass a `stop_func`.
When creating multiple background tasks, exceptions raised within them are forwarded directly to the event loop. In order to act on those exceptions, we need to use `loop.set_exception_handler`.
💡 When a signal is received and the process is already shutting down, it will be force killed.
Minimal example implementation:
import asyncio

from concurrent_tasks import LoopExceptionHandler


async def run():
    event = asyncio.Event()
    tasks = []

    async def _stop():
        await asyncio.gather(*tasks)
        event.set()

    async with LoopExceptionHandler(_stop):
        # Adding a bunch of tasks here...
        await event.wait()
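To see the graceful-stop path without wiring real tasks, the loop's exception handler can be invoked directly. This sketch assumes `LoopExceptionHandler` installs its handler via `loop.set_exception_handler` (as stated above) and calls the provided `stop_func`; the call below only simulates an exception escaping a background task:

import asyncio

from concurrent_tasks import LoopExceptionHandler


async def run():
    event = asyncio.Event()

    async def _stop():
        event.set()

    async with LoopExceptionHandler(_stop):
        # Simulate an unhandled exception reaching the loop's handler.
        asyncio.get_running_loop().call_exception_handler(
            {"message": "simulated failure", "exception": RuntimeError("boom")}
        )
        # `_stop` is expected to run and release the wait below.
        await event.wait()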