asyncio-concurrent-tasks
Tooling to run asyncio tasks.
- Background task
- Periodic task
- Thread safe task pool
- Threaded task pool
- Restartable task
- Loop exception handler
Background task
Task that runs in the background until cancelled. Can be used as a context manager.
Example usage:
```python
import asyncio
from typing import Callable, Awaitable

from concurrent_tasks import BackgroundTask


class HeartBeat(BackgroundTask):
    def __init__(self, interval: float, func: Callable[[], Awaitable]):
        super().__init__(self._run, interval, func)

    async def _run(self, interval: float, func: Callable[[], Awaitable]) -> None:
        while True:
            await func()
            await asyncio.sleep(interval)
```
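For illustration, a minimal usage sketch of the `HeartBeat` class defined above. It assumes that entering `BackgroundTask` as an async context manager starts the task and exiting cancels it; the exact context manager interface may differ.

```python
import asyncio


async def ping() -> None:
    print("ping")


async def main() -> None:
    # Assumption: the context manager starts the background task on enter
    # and cancels it on exit.
    async with HeartBeat(interval=1.0, func=ping):
        await asyncio.sleep(3)  # the heartbeat keeps running in the background


asyncio.run(main())
```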
Periodic task
Task that runs periodically in the background until cancelled. Can be used as a context manager. If the function takes longer than the interval to execute, there is no guarantee that the time between calls is strictly the interval.
Example usage:
```python
from typing import Callable, Awaitable

from concurrent_tasks import PeriodicTask


class HeartBeat(PeriodicTask):
    def __init__(self, interval: float, func: Callable[[], Awaitable]):
        super().__init__(interval, func)
```
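To make the timing caveat concrete, a small sketch using the `PeriodicTask`-based `HeartBeat` above (again assuming the context manager starts and cancels the task): with a 1-second interval but a function body that takes about 2 seconds, calls end up roughly 2 seconds apart.

```python
import asyncio
import time


async def slow_ping() -> None:
    print("ping at", round(time.monotonic(), 1))
    await asyncio.sleep(2)  # body takes longer than the interval


async def main() -> None:
    # Assumption: entering the context manager starts the periodic task.
    async with HeartBeat(interval=1.0, func=slow_ping):
        await asyncio.sleep(6)  # expect roughly 3 pings, ~2 seconds apart


asyncio.run(main())
```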
Thread safe task pool
The goal is to be able to safely run tasks from other threads.
Parameters:
- `size` can be a positive integer to limit the number of tasks running concurrently.
- `timeout` can be set to define a maximum running time for each task, after which it will be cancelled. Note: this excludes time spent waiting to be started (time spent in the buffer).
Example usage:
```python
from concurrent_tasks import ThreadSafeTaskPool


async def func():
    ...


async with ThreadSafeTaskPool() as pool:
    # Create and run the task.
    result = await pool.run(func())
    # Create a task, the `concurrent.futures.Future` will hold information about completion.
    future = pool.create_task(func())
```
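Since the point of this pool is thread safety, here is a hedged sketch of submitting work from a plain thread. It assumes, per the comment above, that `create_task` returns a `concurrent.futures.Future` the worker thread can block on.

```python
import asyncio
import threading

from concurrent_tasks import ThreadSafeTaskPool


async def compute() -> int:
    await asyncio.sleep(0.1)
    return 42


async def main() -> None:
    async with ThreadSafeTaskPool() as pool:
        def worker() -> None:
            # Called from a plain thread: schedule the coroutine in the pool
            # and block on the returned concurrent.futures.Future.
            future = pool.create_task(compute())
            print(future.result())

        thread = threading.Thread(target=worker)
        thread.start()
        # Keep the event loop free to run the task while the worker waits.
        await asyncio.to_thread(thread.join)


asyncio.run(main())
```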
Threaded task pool
Run async tasks in a dedicated thread, which will have its own event loop. Under the hood, `ThreadSafeTaskPool` is used.
Parameters:
- `name` will be used as the thread's name.
- `size` and `timeout`: see `ThreadSafeTaskPool`.
- `context_manager` can be one or more optional context managers that will be entered when the loop has started and exited before the loop is stopped.
💡 All tasks will be completed when the pool is stopped.
💡 The blocking and async versions behave the same; prefer the async version if client code is async.
Loop initialization
⚠️ Asyncio primitives need to be instantiated with the proper event loop.
To achieve that, use a context manager wrapping instantiation of objects:
```python
from functools import partial

from concurrent_tasks import ThreadedPoolContextManagerWrapper, AsyncThreadedTaskPool

pool = AsyncThreadedTaskPool(
    context_manager=ThreadedPoolContextManagerWrapper(partial(MyObj, ...))
)
```
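For context, a hypothetical `MyObj` that motivates the wrapper: it creates an asyncio primitive in `__init__`, so (per the warning above) it should be instantiated on the pool's loop rather than eagerly in the calling thread. This is a sketch only; the wrapper's exact contract for the wrapped object may differ.

```python
import asyncio
from functools import partial

from concurrent_tasks import ThreadedPoolContextManagerWrapper, AsyncThreadedTaskPool


class MyObj:
    """Hypothetical object holding an asyncio primitive."""

    def __init__(self, maxsize: int) -> None:
        # Should be created with the pool's event loop (see the warning above),
        # hence the deferred instantiation through the wrapper below.
        self.queue: asyncio.Queue = asyncio.Queue(maxsize)


pool = AsyncThreadedTaskPool(
    context_manager=ThreadedPoolContextManagerWrapper(partial(MyObj, maxsize=10))
)
```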
Blocking
This can be used to run async functions in a dedicated event loop, while keeping it running to handle background tasks.
Example usage:
```python
from concurrent_tasks import BlockingThreadedTaskPool


async def func():
    ...


with BlockingThreadedTaskPool() as pool:
    # Create and run the task.
    result = pool.run(func())
    # Create a task, the `concurrent.futures.Future` will hold information about completion.
    future = pool.create_task(func())
```
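A slightly fuller sketch of driving async work from synchronous code; it assumes, per the comment above, that `create_task` returns a `concurrent.futures.Future`, so the result can be collected with `.result()`.

```python
import asyncio

from concurrent_tasks import BlockingThreadedTaskPool


async def fetch(i: int) -> int:
    await asyncio.sleep(0.1)
    return i * 2


with BlockingThreadedTaskPool() as pool:
    # Blocks until the coroutine completes in the pool's event loop.
    print(pool.run(fetch(1)))  # -> 2
    # Schedules the coroutine and returns a concurrent.futures.Future.
    future = pool.create_task(fetch(2))
    print(future.result())  # -> 4
```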
Async
Threads can be useful in cooperation with asyncio to let the OS guarantee fair resource distribution between threads. This is especially useful in case you cannot know if called code will properly cooperate with the event loop.
Example usage:
```python
from concurrent_tasks import AsyncThreadedTaskPool


async def func():
    ...


async with AsyncThreadedTaskPool() as pool:
    # Create and run the task.
    result = await pool.run(func())
    # Create a task, the `asyncio.Future` will hold information about completion.
    future = pool.create_task(func())
```
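To make the point about fairness concrete, a hedged sketch in which deliberately uncooperative code blocks the pool's loop while the main loop stays responsive (assuming, per the comment above, that `create_task` returns an awaitable `asyncio.Future`).

```python
import asyncio
import time

from concurrent_tasks import AsyncThreadedTaskPool


async def not_so_cooperative() -> str:
    # Blocks whatever loop it runs on; isolating it in the pool's thread
    # keeps the main event loop responsive.
    time.sleep(1)
    return "done"


async def main() -> None:
    async with AsyncThreadedTaskPool() as pool:
        future = pool.create_task(not_so_cooperative())
        for _ in range(3):
            await asyncio.sleep(0.2)
            print("main loop is still responsive")
        print(await future)


asyncio.run(main())
```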
Restartable task
Task that can be started and cancelled multiple times until it can finally be completed. This is useful to handle pauses and retries when dealing with a connection.
💡 Use `functools.partial` to pass parameters to the function.
Example usage:
```python
from functools import partial

from concurrent_tasks import RestartableTask


async def send(data): ...


task: RestartableTask[int] = RestartableTask(partial(send, b"\x00"), timeout=1)
task.start()
assert await task == 1

# Running in other tasks:
# On connection lost:
task.cancel()
# On connection resumed:
task.start()
# On response received:
task.set_result(1)
```
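As a sketch of the connection use case, a hypothetical `asyncio.Protocol` wiring the task's `start`/`cancel`/`set_result` calls to connection events; the class name and response parsing are made up for illustration.

```python
import asyncio
from functools import partial

from concurrent_tasks import RestartableTask


async def send(data: bytes) -> None: ...


class Client(asyncio.Protocol):
    """Hypothetical protocol showing where the task hooks would live."""

    def __init__(self) -> None:
        self.request: RestartableTask[int] = RestartableTask(
            partial(send, b"\x00"), timeout=1
        )

    def connection_made(self, transport) -> None:
        # (Re)start the pending request whenever the connection is (re)established.
        self.request.start()

    def connection_lost(self, exc) -> None:
        # Pause the request; it will be started again on reconnection.
        self.request.cancel()

    def data_received(self, data: bytes) -> None:
        # Complete the request once the response arrives.
        self.request.set_result(int.from_bytes(data, "big"))
```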
Loop exception handler
Shut down process when an unhandled exception is caught or a signal is received.
To make this a graceful stop, pass a `stop_func`.
When creating multiple background tasks, exceptions raised within those will be forwarded directly to the event loop. In order to act on those exceptions, we need to use `loop.set_exception_handler`.
💡 When a signal is received and the process is already shutting down, it will be force killed.
Example minimalistic implementation:
```python
import asyncio

from concurrent_tasks import LoopExceptionHandler


async def run():
    event = asyncio.Event()
    tasks = []

    async def _stop():
        await asyncio.gather(*tasks)
        event.set()

    async with LoopExceptionHandler(_stop):
        # Adding a bunch of tasks here...
        await event.wait()
```
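A hedged end-to-end sketch combining the handler with a background task whose unhandled exception (forwarded to the event loop, as described above) triggers the graceful stop; it assumes `BackgroundTask` can be used as an async context manager.

```python
import asyncio

from concurrent_tasks import BackgroundTask, LoopExceptionHandler


async def flaky() -> None:
    await asyncio.sleep(0.1)
    raise RuntimeError("unhandled failure in a background task")


async def run() -> None:
    event = asyncio.Event()

    async def _stop() -> None:
        event.set()

    async with LoopExceptionHandler(_stop):
        # Assumption: the exception raised in the background task reaches the
        # loop's exception handler, which calls _stop and unblocks the wait.
        async with BackgroundTask(flaky):
            await event.wait()


asyncio.run(run())
```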
File details
Details for the file `concurrent_tasks-1.8.1.tar.gz`.
File metadata
- Download URL: concurrent_tasks-1.8.1.tar.gz
- Upload date:
- Size: 10.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.4 CPython/3.13.0 Linux/6.5.0-1025-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | c0ddaacba2ca90eec25b43f50ebfdda5dad8ef87013d3ee3a493a70c320704a5
MD5 | 90f0a6a961443f25434d9006d9f00f39
BLAKE2b-256 | 10b08342c0903cbfa30ac30a4173f121ef242cdade15d8ad370168c90f7b62f8
File details
Details for the file `concurrent_tasks-1.8.1-py3-none-any.whl`.
File metadata
- Download URL: concurrent_tasks-1.8.1-py3-none-any.whl
- Upload date:
- Size: 13.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.4 CPython/3.13.0 Linux/6.5.0-1025-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | a33031e0e72353a64f1178f5306f551a1a0f2205c22a0cd5f392416628725579
MD5 | 3584a46c1a11553199c0d659f1f86a25
BLAKE2b-256 | 09eeab22d9444692a8130af10cc175cf00766e80f182da30383261d336fa50a4