Project description

This simple Django app enables users to upload large files to Django REST Framework in multiple chunks, with the ability to resume if the upload is interrupted.

This app is based largely on the work of Julio Malegria, specifically his django-chunked-upload app.

License: MIT-Zero.

Installation

Install via pip:

pip install drf-chunked-upload

And then add it to your Django INSTALLED_APPS:

INSTALLED_APPS = (
    # ...
    'drf_chunked_upload',
)
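To get a working endpoint you also need to route a view. One possible sketch, assuming `ChunkedUploadView` is importable from `drf_chunked_upload.views`; the subclass name, URL path, pattern name, and the field name `my_file` are illustrative choices, not part of the package's documented setup:

```python
# views.py and urls.py in one sketch. MyChunkedUploadView, the URL
# path, and the field name 'my_file' are hypothetical examples.
from django.urls import re_path
from drf_chunked_upload.views import ChunkedUploadView


class MyChunkedUploadView(ChunkedUploadView):
    field_name = 'my_file'  # name of the file field in upload requests


urlpatterns = [
    # The optional <pk> segment lets the same view handle both the initial
    # request and subsequent PUTs to the upload url the server returns.
    re_path(
        r'^api/chunked_upload/(?:(?P<pk>[^/]+)/)?$',
        MyChunkedUploadView.as_view(),
        name='chunked-upload',
    ),
]
```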

Typical usage

  1. An initial PUT request is sent to the url linked to ChunkedUploadView (or any subclass) with the first chunk of the file. The name of the chunk file can be overridden in the view (class attribute field_name). Example:
{"my_file": file}
  2. In return, the server will respond with the url of the upload, the current offset, and when the upload will expire (expires). Example:
{
    "url": "https://your-host/<path_to_view>/5230ec1f59d1485d9d7974b853802e31",
    "offset": 10000,
    "expires": "2013-07-18T17:56:22.186Z"
}
  3. Repeatedly PUT subsequent chunks to the url returned from the server. Example:
# PUT to https://your-host/<path_to_view>/5230ec1f59d1485d9d7974b853802e31

{
    "my_file": file
}
  4. The server will continue responding with the url, current offset, and expiration (expires).
  5. Finally, when the upload is complete, POST a request to the returned url. This request must include the md5 checksum (hex) of the entire file. Example:
# POST to https://your-host/<path_to_view>/5230ec1f59d1485d9d7974b853802e31

{
    "md5": "fc3ff98e8c6a0d3087d515c0473f8677"
}
  6. If everything is OK, the server will respond with status code 200 and the data returned by the method get_response_data (if any).
  7. If you want to upload a file as a single chunk, this is also possible! Simply make the first request a POST and include the md5 for the file. You don’t need to include the Content-Range header if uploading a whole file.
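Put together, the steps above can be sketched as a small client. The chunking and checksum helpers below are self-contained; `upload()` assumes a `requests`-style session, a server url, and the field name `"my_file"` from the earlier examples (all assumptions for illustration):

```python
import hashlib


def iter_chunks(data: bytes, chunk_size: int):
    """Yield (Content-Range header value, chunk) pairs for the file."""
    total = len(data)
    for start in range(0, total, chunk_size):
        chunk = data[start:start + chunk_size]
        yield f"bytes {start}-{start + len(chunk) - 1}/{total}", chunk


def file_md5(data: bytes) -> str:
    """Hex md5 of the whole file, required by the completion POST."""
    return hashlib.md5(data).hexdigest()


def upload(session, base_url: str, data: bytes, chunk_size: int = 10000):
    """PUT each chunk in turn, then POST the md5 to complete the upload.

    `session` is assumed to be a requests.Session (or compatible); the
    server answers each PUT with the upload's url and current offset.
    """
    url = base_url
    for content_range, chunk in iter_chunks(data, chunk_size):
        response = session.put(
            url,
            files={"my_file": chunk},
            headers={"Content-Range": content_range},
        )
        response.raise_for_status()
        url = response.json()["url"]  # subsequent chunks go to this url
    # Completion request: the md5 lets the server verify the whole file.
    response = session.post(url, data={"md5": file_md5(data)})
    response.raise_for_status()
    return response
```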

Possible error responses:

  • Upload has expired. Server responds 410 (Gone).
  • id does not match any upload. Server responds 404 (Not found).
  • No chunk file is found in the indicated key. Server responds 400 (Bad request).
  • Request does not contain Content-Range header. Server responds 400 (Bad request).
  • Size of file exceeds limit (if specified). Server responds 400 (Bad request).
  • Offsets do not match. Server responds 400 (Bad request).
  • md5 checksums do not match. Server responds 400 (Bad request).

Settings

Add any of these variables into your project settings to override them.

DRF_CHUNKED_UPLOAD_EXPIRATION_DELTA

  • How long after creation the upload will expire.
  • Default: datetime.timedelta(days=1)

DRF_CHUNKED_UPLOAD_PATH

  • Path where uploaded files will be stored.
  • Default: 'chunked_uploads/%Y/%m/%d'

DRF_CHUNKED_UPLOAD_COMPLETE_EXT

  • Extension to use for completed uploads. Uploads will be renamed using this extension on completion, unless this extension matches DRF_CHUNKED_UPLOAD_INCOMPLETE_EXT.
  • Default: '.done'

DRF_CHUNKED_UPLOAD_INCOMPLETE_EXT

  • Extension for in progress upload files.
  • Default: '.part'

DRF_CHUNKED_UPLOAD_STORAGE_CLASS

  • Storage system to use for uploaded files (should be a storage class).
  • Default: None (use the default storage system)

DRF_CHUNKED_UPLOAD_USER_RESTRICED

  • Boolean that determines whether only the user who created an upload can view/continue an upload.
  • Default: True

DRF_CHUNKED_UPLOAD_ABSTRACT_MODEL

DRF_CHUNKED_UPLOAD_MAX_BYTES

  • Max amount of data (in bytes) that can be uploaded. None means no limit.
  • Default: None
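As an illustration, overriding a few of these in settings.py might look like the following; the specific values are arbitrary examples, not recommendations:

```python
# settings.py -- example overrides; values are illustrative only.
from datetime import timedelta

# Uploads expire 12 hours after creation instead of the default 1 day.
DRF_CHUNKED_UPLOAD_EXPIRATION_DELTA = timedelta(hours=12)

# Store chunked uploads under a custom date-stamped path.
DRF_CHUNKED_UPLOAD_PATH = 'uploads/chunked/%Y/%m'

# Reject uploads larger than 512 MiB (default is None, i.e. no limit).
DRF_CHUNKED_UPLOAD_MAX_BYTES = 512 * 1024 * 1024
```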

Support

If you find any bug or you want to propose a new feature, please use the issues tracker. Pull requests are also accepted.
