Distributed Task Queue.
Project description
- Version:
4.2.1 (windowlicker)
- Web:
- Download:
- Source:
- Keywords:
task, queue, job, async, rabbitmq, amqp, redis, python, distributed, actors
Donations
This project relies on your generous donations.
If you are using Celery to create a commercial product, please consider becoming our backer or our sponsor to ensure Celery’s future.
Sponsors
What’s a Task Queue?
Task queues are used as a mechanism to distribute work across threads or machines.
A task queue’s input is a unit of work called a task. Dedicated worker processes then constantly monitor the queue for new work to perform.
Celery communicates via messages, usually using a broker to mediate between clients and workers. To initiate a task a client puts a message on the queue, the broker then delivers the message to a worker.
A Celery system can consist of multiple workers and brokers, giving way to high availability and horizontal scaling.
Celery is written in Python, but the protocol can be implemented in any language. In addition to Python there’s node-celery for Node.js, and a PHP client.
Language interoperability can also be achieved by using webhooks in such a way that the client enqueues a URL to be requested by a worker.
What do I need?
Celery version 4.2 runs on:
Python (2.7, 3.4, 3.5, 3.6)
PyPy (6.0)
This is the last version to support Python 2.7, and from the next version (Celery 5.x) Python 3.5 or newer is required.
If you’re running an older version of Python, you need to be running an older version of Celery:
Python 2.6: Celery series 3.1 or earlier.
Python 2.5: Celery series 3.0 or earlier.
Python 2.4: Celery series 2.2 or earlier.
Celery is a project with minimal funding, so we don’t support Microsoft Windows. Please don’t open any issues related to that platform.
Celery is usually used with a message broker to send and receive messages. The RabbitMQ and Redis transports are feature complete, but there’s also experimental support for a myriad of other solutions, including using SQLite for local development.
Celery can run on a single machine, on multiple machines, or even across datacenters.
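For illustration, the broker is chosen simply by the URL you pass to the app. Here is a minimal sketch; the hostnames and the SQLite file name are placeholders, and the SQLAlchemy URL form is shown as an assumption:

from celery import Celery

# RabbitMQ (AMQP) -- a feature-complete transport.
app = Celery('tasks', broker='amqp://guest@localhost//')

# Redis -- also feature complete.
# app = Celery('tasks', broker='redis://localhost:6379/0')

# SQLite via the experimental SQLAlchemy transport, for local development only.
# app = Celery('tasks', broker='sqla+sqlite:///celery-broker.sqlite')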
Get Started
If this is the first time you’re trying to use Celery, or you’re new to Celery 4.2 coming from previous versions, then you should read our getting started tutorials:
- Tutorial teaching you the bare minimum needed to get started with Celery.
- A more complete overview, showing more features.
Celery is…
Simple
Celery is easy to use and maintain, and does not need configuration files.
It has an active, friendly community you can talk to for support, such as our mailing list or the IRC channel.
Here’s one of the simplest applications you can make:
from celery import Celery

app = Celery('hello', broker='amqp://guest@localhost//')

@app.task
def hello():
    return 'hello world'
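Calling it is just as small. A sketch of the client side, assuming the module above is saved as hello.py and a worker is running against the same broker (for example one started with celery -A hello worker --loglevel=info):

# Client side: enqueue the task and inspect the result handle.
from hello import hello

result = hello.delay()   # returns an AsyncResult immediately; a worker does the work
print(result.id)         # the task id assigned by Celery
# result.get() would block for the return value, but it requires a result backend
# configured on the app (e.g. Celery('hello', broker=..., backend='rpc://')).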
Highly Available
Workers and clients will automatically retry in the event of connection loss or failure, and some brokers support HA in the way of Primary/Primary or Primary/Replica replication.
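A sketch of the settings involved on the Celery side; the setting names are standard Celery 4.x options, and the values are only illustrative:

from celery import Celery

app = Celery('tasks', broker='amqp://guest@localhost//')

# Clients retry publishing a task message if the broker connection is lost.
app.conf.task_publish_retry = True

# Workers re-establish the broker connection automatically;
# None means "keep retrying forever".
app.conf.broker_connection_retry = True
app.conf.broker_connection_max_retries = None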
Fast
A single Celery process can process millions of tasks a minute, with sub-millisecond round-trip latency (using RabbitMQ, py-librabbitmq, and optimized settings).
Flexible
Almost every part of Celery can be extended or used on its own: custom pool implementations, serializers, compression schemes, logging, schedulers, consumers, producers, broker transports, and much more.
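As one example of that flexibility, a custom serializer can be registered through kombu and then selected by name. A minimal sketch, where the 'myjson' name and content type are made up for the example:

import json

from celery import Celery
from kombu.serialization import register

# Register an encoder/decoder pair under a custom name.
register(
    'myjson',
    json.dumps,
    json.loads,
    content_type='application/x-myjson',
    content_encoding='utf-8',
)

app = Celery('tasks', broker='amqp://guest@localhost//')
app.conf.task_serializer = 'myjson'
app.conf.accept_content = ['myjson']  # only accept content types we trust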
It supports…
- Message Transports:
RabbitMQ, Redis, Amazon SQS
- Concurrency:
prefork, Eventlet, gevent, solo (single threaded)
- Result Stores:
AMQP, Redis, memcached, SQLAlchemy, Django ORM, Apache Cassandra, IronCache, Elasticsearch
- Serialization:
pickle, json, yaml, msgpack; zlib, bzip2 compression; cryptographic message signing.
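All of these are picked through ordinary configuration. A small sketch choosing a result store, serializer, and compression; the Redis URL is a placeholder, and msgpack assumes the celery[msgpack] extra is installed:

from celery import Celery

app = Celery(
    'tasks',
    broker='amqp://guest@localhost//',
    backend='redis://localhost:6379/1',   # result store
)

app.conf.task_serializer = 'msgpack'
app.conf.result_serializer = 'msgpack'
app.conf.accept_content = ['msgpack', 'json']
app.conf.task_compression = 'zlib'        # or 'bzip2'
# Cryptographic message signing is enabled separately (see the celery[auth] bundle).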
Framework Integration
Celery is easy to integrate with web frameworks, some of which even have integration packages:
The integration packages aren’t strictly necessary, but they can make development easier, and sometimes they add important hooks like closing database connections at fork.
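As an illustration, the Django pattern documented by Celery needs no extra package. A sketch of the usual proj/celery.py, where the project name 'proj' is a placeholder:

import os

from celery import Celery

# Point Celery at the Django settings module before creating the app.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# Read any CELERY_-prefixed settings from Django's settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Discover tasks.py modules in all installed Django apps.
app.autodiscover_tasks()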
Documentation
The latest documentation is hosted at Read The Docs, containing user guides, tutorials, and an API reference.
Installation
You can install Celery either via the Python Package Index (PyPI) or from source.
To install using pip:
$ pip install -U Celery
Bundles
Celery also defines a group of bundles that can be used to install Celery and the dependencies for a given feature.
You can specify these in your requirements or on the pip command-line by using brackets. Multiple bundles can be specified by separating them by commas.
$ pip install "celery[librabbitmq]" $ pip install "celery[librabbitmq,redis,auth,msgpack]"
The following bundles are available:
Serializers
- celery[auth]:
for using the auth security serializer.
- celery[msgpack]:
for using the msgpack serializer.
- celery[yaml]:
for using the yaml serializer.
Concurrency
- celery[eventlet]:
for using the eventlet pool.
- celery[gevent]:
for using the gevent pool.
Transports and Backends
- celery[librabbitmq]:
for using the librabbitmq C library.
- celery[redis]:
for using Redis as a message transport or as a result backend.
- celery[sqs]:
for using Amazon SQS as a message transport.
- celery[tblib]:
for using the task_remote_tracebacks feature.
- celery[memcache]:
for using Memcached as a result backend (using pylibmc).
- celery[pymemcache]:
for using Memcached as a result backend (pure-Python implementation).
- celery[cassandra]:
for using Apache Cassandra as a result backend with DataStax driver.
- celery[azureblockblob]:
for using Azure Storage as a result backend (using azure-storage).
- celery[s3]:
for using S3 Storage as a result backend.
- celery[couchbase]:
for using Couchbase as a result backend.
- celery[elasticsearch]:
for using Elasticsearch as a result backend.
- celery[riak]:
for using Riak as a result backend.
- celery[cosmosdbsql]:
for using Azure Cosmos DB as a result backend (using pydocumentdb).
- celery[zookeeper]:
for using Zookeeper as a message transport.
- celery[sqlalchemy]:
for using SQLAlchemy as a result backend (supported).
- celery[pyro]:
for using the Pyro4 message transport (experimental).
- celery[slmq]:
for using the SoftLayer Message Queue transport (experimental).
- celery[consul]:
for using the Consul.io Key/Value store as a message transport or result backend (experimental).
- celery[django]:
specifies the lowest version possible for Django support.
You should probably not use this in your requirements, it’s here for informational purposes only.
Downloading and installing from source
Download the latest version of Celery from PyPI:
https://pypi.org/project/celery/
You can install it by doing the following:
$ tar xvfz celery-0.0.0.tar.gz
$ cd celery-0.0.0
$ python setup.py build
# python setup.py install
The last command must be executed as a privileged user if you aren’t currently using a virtualenv.
Using the development version
With pip
The Celery development version also requires the development versions of kombu, amqp, billiard, and vine.
You can install the latest snapshot of these using the following pip commands:
$ pip install https://github.com/celery/celery/zipball/master#egg=celery
$ pip install https://github.com/celery/billiard/zipball/master#egg=billiard
$ pip install https://github.com/celery/py-amqp/zipball/master#egg=amqp
$ pip install https://github.com/celery/kombu/zipball/master#egg=kombu
$ pip install https://github.com/celery/vine/zipball/master#egg=vine
With git
Please see the Contributing section.
Getting Help
Mailing list
For discussions about the usage, development, and future of Celery, please join the celery-users mailing list.
IRC
Come chat with us on IRC. The #celery channel is located on the Freenode network.
Bug tracker
If you have any suggestions, bug reports, or annoyances please report them to our issue tracker at https://github.com/celery/celery/issues/
Wiki
Credits
Contributors
This project exists thanks to all the people who contribute. Development of celery happens at GitHub: https://github.com/celery/celery
You’re highly encouraged to participate in the development of celery. If you don’t like GitHub (for some reason) you’re welcome to send regular patches.
Be sure to also read the Contributing to Celery section in the documentation.
Backers
Thank you to all our backers! 🙏 [Become a backer]
Sponsors
Support this project by becoming a sponsor. Your logo will show up here with a link to your website. [Become a sponsor]
License
This software is licensed under the New BSD License. See the LICENSE file in the top distribution directory for the full license text.
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file celery_custom_fix-4.2.0.tar.gz.
File metadata
- Download URL: celery_custom_fix-4.2.0.tar.gz
- Upload date:
- Size: 1.4 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.12.1 pkginfo/1.5.0 requests/2.21.0 setuptools/40.6.3 requests-toolbelt/0.8.0 tqdm/4.29.0 CPython/3.7.1rc1
File hashes
Algorithm | Hash digest
---|---
SHA256 | a27daf936fe407358396fc8a7d24cb24864c690e428f82a7a45be3c6987c1ee5
MD5 | a19c4e656e58f46a0577a673b91400f5
BLAKE2b-256 | 79aef57b8e81b95b566be11c42d5d2873449bacbb4e94ca406e5948ec90bf8a2
File details
Details for the file celery_custom_fix-4.2.0-py2.py3-none-any.whl.
File metadata
- Download URL: celery_custom_fix-4.2.0-py2.py3-none-any.whl
- Upload date:
- Size: 805.1 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.12.1 pkginfo/1.5.0 requests/2.21.0 setuptools/40.6.3 requests-toolbelt/0.8.0 tqdm/4.29.0 CPython/3.7.1rc1
File hashes
Algorithm | Hash digest
---|---
SHA256 | c60f40ea4af017e533ccf0164636e080eb7980f379986321d470e945c8435582
MD5 | 13e6c40d50a89d372240a145647aa342
BLAKE2b-256 | 43d362b01b8618934f6f1886690574f8d9e1711e7cf0a2fbb4d89c9bdc4431fc