# Celery Enqueue
[Celery](http://www.celeryproject.org/) is a distributed task queue
for Python that uses [RabbitMQ](https://www.rabbitmq.com/) (or
[Redis](https://redis.io/)) as its message broker.
The usual pattern in Celery is to have task implementations and the
code which enqueues/schedules tasks within the same application:
```python
# in tasks.py
from celery import Celery
app = Celery('tasks')  # broker defaults to a local RabbitMQ

@app.task
def doit(arg):
    ...
```
```python
# in app.py
from tasks import *
result = doit.delay(123)
```
Sometimes it is useful to split these functions across entirely
different hosts/applications, using Celery's broker (e.g. RabbitMQ) to
connect them. Unfortunately, Celery doesn't make it as easy as it
could be to schedule the `doit` task without having the `tasks.py`
file available locally.
The `celery-enqueue` program included with this library makes this
easy.
# Installation
Via `pip`:
```
$ pip install celery-enqueue
```
Via source:
```
$ git clone https://github.com/unchained-capital/celery-enqueue
$ cd celery-enqueue
$ make
```
# Usage
## Command-Line
Assuming you installed via `pip`, the `celery-enqueue` command should
be on your `PATH`. Run it with the `-h` flag to see more details:
```
$ celery-enqueue -h
```
If you have a RabbitMQ server running locally at the default port with
no custom vhosts, users, or security, you can run:
```
$ celery-enqueue my_app.tasks.my_task arg1 arg2
```
to enqueue the task `my_app.tasks.my_task` with arguments `('arg1',
'arg2')` into the local RabbitMQ broker's `celery` queue. This should
be identical to having run `my_app.tasks.my_task.delay("arg1",
"arg2")` from within your application.
This behavior can be configured on the command-line as well as via a
configuration file.
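For example, the `-c` flag (used again in the error-handling example
later in this README) points `celery-enqueue` at a YAML configuration
file. The path below is just a placeholder:
```
$ celery-enqueue -c /etc/celery-enqueue.yml my_app.tasks.my_task arg1 arg2
```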
## Python
Assuming that your `PYTHONPATH` is properly set up (this is handled
for you if you installed using `pip`), and you have a RabbitMQ server
running locally at the default port with no custom vhosts, users, or
security, you can run:
```python
from celery_enqueue import enqueue
enqueue("my_app.tasks.my_task", ["arg1", "arg2"])
```
to enqueue the task `my_app.tasks.my_task` with arguments `('arg1',
'arg2')` into the local RabbitMQ broker's `celery` queue. This should
be identical to having run `my_app.tasks.my_task.delay("arg1",
"arg2")` from within your application.
This behavior can be configured at runtime:
```python
from celery_enqueue import enqueue, set_config
set_config({'host': 'rabbitmq.internal'})
enqueue("my_app.tasks.my_task", ["arg1", "arg2"])
```
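As a fuller sketch, assuming `set_config` accepts any of the keys
listed in the Configuration section below (all values here are
placeholders), a call against a remote broker might look like this:
```python
from celery_enqueue import enqueue, set_config

# Hypothetical values; the key names come from the Configuration
# section of this README.
set_config({
    'host': 'rabbitmq.internal',
    'port': 5672,
    'user': 'celery',
    'password': 'secret',
    'vhost': '/',
    'queue': 'celery',
})
enqueue("my_app.tasks.my_task", ["arg1", "arg2"])
```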
# Configuration
See `example/celery-enqueue.yml` in this repository for an example
configuration file you can copy and modify.
## RabbitMQ
Some configuration is needed to locate your RabbitMQ server and to
ensure tasks are enqueued where your Celery workers will find them.
By default, the scripts will attempt to connect to the vhost `/` on a
local RabbitMQ server on the default port (5672) with no
authentication.
The following configuration settings affect this default behavior:
* `user` -- the name of the RabbitMQ user
* `password` -- the password of the RabbitMQ user
* `host` -- the hostname or IP of the RabbitMQ broker
* `port` -- the port of the RabbitMQ broker
* `vhost` -- the RabbitMQ vhost used by Celery
* `queue` -- the RabbitMQ queue used by Celery
These settings can be provided on the command-line, via a
configuration file, or by calling `set_config`.
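A configuration-file version of these settings might look like the
sketch below (flat keys, following the `error_command` example later
in this README; the values are placeholders, and
`example/celery-enqueue.yml` remains the authoritative reference):
```yaml
# in config.yml -- values are placeholders
user: celery
password: secret
host: rabbitmq.internal
port: 5672
vhost: "/"
queue: celery
```
Such a file can then be passed on the command-line with `-c config.yml`.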
## Error handling
In case of an uncaught exception, the default behavior is for
`celery-enqueue` to print a Python stack trace and exit with a nonzero
return code.
The following configuration settings affect this default behavior:
* `success` -- make `celery-enqueue` always exit successfully with a return code of 0
* `error_command` -- a command to run when an error occurs. The following strings will be interpolated into it:
  * `%e` -- the error message of the exception
  * `%u` -- the (masked) URL of the RabbitMQ broker
  * `%t` -- the name of the task
  * `%a` -- the arguments to the task
(The `error_command` will only run if `success` is also set.)
A simple example, handled via a configuration file:
```yaml
# in config.yml
error_command: |
  echo 'ERROR: Failed to enqueue task %t(%a) at broker %u. (%e)'
```
And invoked like this:
```
$ celery-enqueue -c config.yml my_app.tasks.my_task arg1 arg2
```
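Since `error_command` only runs when `success` is also set, a complete
version of the configuration above would enable both. Assuming
`success` takes a YAML boolean (an assumption; check
`example/celery-enqueue.yml` for the exact form), that might look like:
```yaml
# in config.yml
success: true   # assumed boolean form
error_command: |
  echo 'ERROR: Failed to enqueue task %t(%a) at broker %u. (%e)'
```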