kerground
Stupid simple background worker based on Python. Kerground
is super basic - no need for Redis/RabbitMQ. It does not use much RAM because events are saved in plain old JSON files.
Quickstart
Install
pip install kerground
In your app folder create a new package called dependencies
(or add this in utils, or wherever you consider fit):
#app/dependencies/kerground.py
from kerground import Kerground
ker = Kerground()
You can set the following params on Kerground:
- tasks_path - path where the events will be saved; defaults to "./.kergroundtasks";
- pool - wait time in seconds for pending tasks.
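Since events are saved as plain JSON files on disk, the idea behind the store can be sketched in a few lines. The function name, file layout, and fields below are illustrative assumptions, not Kerground's actual internals:

```python
import json
import uuid
from pathlib import Path

def save_event(tasks_path: str, func_name: str, args: list) -> str:
    """Persist one event as a JSON file and return its id (illustrative sketch)."""
    Path(tasks_path).mkdir(parents=True, exist_ok=True)
    event_id = uuid.uuid4().hex
    event = {"id": event_id, "func": func_name, "args": args, "status": "pending"}
    # one file per event keeps the queue inspectable with plain tools
    (Path(tasks_path) / f"{event_id}.json").write_text(json.dumps(event))
    return event_id

event_id = save_event("./.kergroundtasks", "convert_files", ["a.csv", "b.csv"])
print(event_id)
```

A file-per-event layout like this is why no broker such as Redis or RabbitMQ is needed.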
Next, register your background workers like:
#some_module_.py
from app.dependencies import ker
@ker.register(ker.MODE.THREAD, max_retries=3)
def convert_files(event: list[str]):
pass # some heavy duty stuff here
The event
must be JSON serializable!
There are 3 modes available:
- ker.MODE.THREAD - distribute events with threading.Thread if you have URLs to wait on;
- ker.MODE.PROCESS - (default) distribute events with multiprocessing.Process if you have some CPU-intensive tasks;
- ker.MODE.SYNC - distribute events one by one for the func to process.
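The three modes map directly onto the standard library. A toy dispatcher (not Kerground's actual code) showing how one event could be handed off per mode:

```python
import threading
import multiprocessing
from enum import Enum

class MODE(Enum):
    THREAD = "thread"
    PROCESS = "process"
    SYNC = "sync"

def dispatch(mode: MODE, func, event):
    """Hand one event to func using the requested execution mode (sketch)."""
    if mode is MODE.THREAD:
        # good for I/O-bound work, e.g. waiting on URLs
        t = threading.Thread(target=func, args=(event,))
        t.start()
        t.join()
    elif mode is MODE.PROCESS:
        # good for CPU-bound work; sidesteps the GIL
        p = multiprocessing.Process(target=func, args=(event,))
        p.start()
        p.join()
    else:
        # SYNC: run in the caller, one event at a time
        func(event)

results = []
dispatch(MODE.THREAD, results.append, "evt-1")
dispatch(MODE.SYNC, results.append, "evt-2")
print(results)  # ['evt-1', 'evt-2']
```

Note that with MODE.PROCESS the function runs in a separate address space, so results must come back through the event store rather than shared memory.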
By default, max_retries
is 0;
you can increase this number if you need to get data from some URLs and there is a possibility they will fail.
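A retry loop of the kind max_retries implies can be sketched as follows (run_with_retries is a hypothetical stand-in, not part of Kerground's API):

```python
def run_with_retries(func, event, max_retries: int = 0):
    """Call func(event); retry up to max_retries extra times on failure (sketch)."""
    attempts = 0
    while True:
        try:
            return func(event)
        except Exception:
            attempts += 1
            if attempts > max_retries:
                raise  # retries exhausted - surface the original error

calls = []
def flaky(event):
    # fails twice, then succeeds - mimics a transient network error
    calls.append(event)
    if len(calls) < 3:
        raise ConnectionError("transient")
    return "ok"

result = run_with_retries(flaky, "https://example.com", max_retries=3)
print(result)  # succeeds on the 3rd attempt
```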
Now you can send an event to background worker (kerground) like:
#some_other_module_possible_route_handler.py
from typing import List
from fastapi import UploadFile  # assuming FastAPI, where UploadFile comes from
from app.dependencies import ker
def send_files_for_conversion(files: List[UploadFile]):
filepaths = [file.filename for file in files]
msgid = ker.enqueue("convert_files", filepaths)
return f"Files were sent for conversion with id: {msgid}"
Pass to ker.enqueue
the name of the function you want to call in the background, along with its JSON-serializable *args and **kwargs. ker.enqueue
will return an id which you can later inspect for its status with ker.check_status(msgid).
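Assuming each event's status ends up in its JSON file (an assumption about the storage layout, not documented Kerground behavior), a toy version of a status check might look like:

```python
import json
import tempfile
from pathlib import Path

def check_status(tasks_path: str, event_id: str) -> str:
    """Look up the event's JSON file and return its status field (sketch)."""
    event_file = Path(tasks_path) / f"{event_id}.json"
    return json.loads(event_file.read_text())["status"]

# simulate an event the worker has not picked up yet
tasks_path = tempfile.mkdtemp()
Path(tasks_path, "abc123.json").write_text(
    json.dumps({"id": "abc123", "status": "pending"})
)
print(check_status(tasks_path, "abc123"))  # pending
```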
Prepare the worker.py
file:
# ./worker.py
from app.dependencies import ker
if __name__ == "__main__":
ker.listen()
You can check the example
folder, which was used for tests.
Multiple workers?
Just start multiple instances of worker.py. You can use honcho
to make this easier.
# Procfile
web: python main.py
worker1: python worker.py
worker2: python worker.py
etc
Then:
honcho start
Submit any questions/issues you have! Feel free to fork it and improve it!
Roadmap
- Kerground Dashboard - to show running/pending/failed/done tasks, retrigger a failed task + maybe some nice visualizations;
- Benchmarks with Celery;
Hashes for kerground-0.1.0-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | e1a468f65e228bd4801a819a45c1c11a0beba627b2984ed1df9697d6a41b0a01
MD5 | ae32188cff5bc49e2d89487ac98171b5
BLAKE2b-256 | af0425b8feb2a636636ab3775e11f7089c89bf293a81ce4e77240882bef188e5