Allows files to be uploaded locally and then transferred to a remote location. This is a fork; the original project is no longer online.

Project description


This storage backend enables having a local and a remote storage
backend. It will save any file locally and queue a task to transfer it
somewhere else.

If the file is accessed before it has been transferred, the local copy
is served.

The default tasks use `Celery <>`_ for queueing
transfer tasks, but the backend is agnostic about your choice of queue.
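The save-then-queue flow described above can be sketched in plain Python (illustrative names only; this is *not* the actual ``queued_storage`` implementation):

```python
class DictStorage:
    """Toy in-memory storage standing in for a Django storage backend."""
    def __init__(self):
        self.files = {}

    def save(self, name, content):
        self.files[name] = content

    def open(self, name):
        return self.files[name]


class QueuedStorageSketch:
    """Save locally first, queue a transfer, serve local until transferred."""
    def __init__(self, local, remote):
        self.local = local
        self.remote = remote
        self.queue = []          # stands in for the Celery task queue

    def save(self, name, content):
        self.local.save(name, content)   # always land the file locally
        self.queue.append(name)          # queue the remote transfer

    def open(self, name):
        # prefer the remote copy once it exists, else fall back to local
        if name in self.remote.files:
            return self.remote.open(name)
        return self.local.open(name)

    def run_transfers(self):
        # what the queued task would do asynchronously
        while self.queue:
            name = self.queue.pop(0)
            self.remote.save(name, self.local.open(name))


storage = QueuedStorageSketch(DictStorage(), DictStorage())
storage.save('image.png', b'...bytes...')
assert storage.open('image.png') == b'...bytes...'  # local copy served
storage.run_transfers()
assert 'image.png' in storage.remote.files          # now on the "remote"
```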



pip install django-queued-storage


- Follow the configuration instructions for `django-celery <>`_
- Set up a `caching backend <>`_
- Add ``queued_storage`` to your ``INSTALLED_APPS`` tuple
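Taken together, the steps above might look like this in ``settings.py`` (the broker URL and cache location are placeholders for your own setup):

```python
# settings.py -- illustrative values; adapt broker and cache to your setup.

INSTALLED_APPS = (
    'djcelery',          # django-celery
    'queued_storage',
)

# Celery broker (placeholder URL)
BROKER_URL = 'amqp://guest:guest@localhost:5672//'

# A shared cache backend, used to track which files have been transferred.
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
    }
}
```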


The ``QueuedRemoteStorage`` can be used as a drop-in replacement
wherever a Django storage backend might otherwise be used.

This example uses
`django-storages <>`_ for the
remote backend:


from django.db import models
from queued_storage.backend import QueuedRemoteStorage

class MyModel(models.Model):
    image = models.ImageField(storage = QueuedRemoteStorage(
        local = '',  # dotted path to a local storage backend
        remote = 'storages.backends.s3boto.S3BotoStorage'))


- ``queued_storage.backend.QueuedRemoteStorage``:
  Base class for queued storages. You can use this to specify your own
  combination of local and remote storage backends.

- ``queued_storage.backend.DoubleFilesystemStorage``:
Used for testing, but can be handy if you want uploaded files to be
stored in two places. Example:


image = ImageField(
    storage = DoubleFilesystemStorage(
        local_kwargs = {'location': '/backup'},
        remote_kwargs = {'location': settings.MEDIA_ROOT}))

- ``queued_storage.backend.S3Storage``:
Shortcut for the above example.


image = ImageField(storage = S3Storage())

- ``queued_storage.backend.DelayedStorage``:
  This backend does *not* transfer files to the remote location
  automatically.


image = ImageField(storage = DelayedStorage(
    # local and remote backend arguments as for QueuedRemoteStorage
    ))

>>> m = MyModel(image = File(open('image.png')))
>>> # Save locally:
>>> m.save()
>>> # Transfer to remote location:

Useful if you want to do preprocessing before the file is transferred.


- ``queued_storage.backend.RemoteFileField``:
  Tiny wrapper around any ``QueuedRemoteStorage`` that provides a
  convenient method to transfer files. The above ``DelayedStorage``
  example would look like this:


image = RemoteFileField(storage = DelayedStorage(
    # local and remote backend arguments as for QueuedRemoteStorage
    ))

>>> m = MyModel(image = File(open('image.png')))
>>> # Save locally:
>>> m.save()
>>> # Transfer to remote location:
>>> m.image.transfer()


- ``queued_storage.backend.Transfer``:
The default task. Transfers to a remote location. The actual
transfer is implemented in the remote backend.

- ``queued_storage.backend.TransferAndDelete``:
  Once the file has been transferred, the local copy is deleted.

To create new tasks, do something like this:


from celery.registry import tasks
from queued_storage.backend import Transfer

class TransferAndDelete(Transfer):
    def transfer(self, name, local, remote, **kwargs):
        result = super(TransferAndDelete, self).transfer(name, local, remote, **kwargs)

        if result:
            local.delete(name)  # remove the local copy once transferred

        return result

tasks.register(TransferAndDelete)


The result is ``True`` if the transfer was successful; otherwise it is
``False`` and the task is retried.
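The retry behaviour can be sketched without Celery (``run_with_retries`` and its parameters are illustrative, not part of the ``queued_storage`` API):

```python
import time

def run_with_retries(transfer, name, max_retries=5, retry_delay=0):
    """Call ``transfer(name)`` until it returns True or retries run out."""
    for attempt in range(max_retries):
        if transfer(name):
            return True
        time.sleep(retry_delay)   # Celery would re-queue with a delay
    return False

# A transfer that fails twice before succeeding:
attempts = []
def flaky_transfer(name):
    attempts.append(name)
    return len(attempts) >= 3

assert run_with_retries(flaky_transfer, 'image.png') is True
assert len(attempts) == 3
```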

In case you don't want to use Celery, have a look
`here <>`_.

To use a different task, pass it into the backend:


image = models.ImageField(storage = S3Storage(task = TransferAndDelete))


- Use a different key for caching.
- How many retries should be attempted before aborting.
- The delay between retries.


- Added tests
- Added ``S3Storage`` and ``DelayedStorage``
- Added ``TransferAndDelete`` task
- Classes renamed to be consistent
