
A tool for copying build folders to S3 with guessed Content-Type and Cache-Control

Project description

s3autocp

s3autocp is a Python script that automates copying local directories to Amazon S3 with appropriate Content-Type headers and optional compression. It is especially useful for deploying static assets: it applies suitable MIME types and keeps transfer and storage efficient with Brotli and Gzip compression.

Features

  • Content-Type Guessing: Automatically determines the Content-Type for files based on their extensions (see the sketch after this list).
  • Compression: Compresses eligible files using Brotli and Gzip for optimized storage and transfer.
  • S3 Upload: Efficiently uploads files to a specified S3 bucket, setting appropriate headers like Cache-Control.
  • Command-Line Interface: Easy-to-use CLI for specifying the source directory and destination S3 URL.
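
As a rough illustration of how Content-Type guessing like this typically works, the sketch below uses Python's standard mimetypes module. It is an assumption for illustration only, not necessarily s3autocp's exact implementation:

import mimetypes

def guess_content_type(path: str) -> str:
    # mimetypes maps well-known extensions (.html, .css, .js, ...) to MIME
    # types; fall back to a generic binary type for unknown extensions.
    content_type, _encoding = mimetypes.guess_type(path)
    return content_type or "application/octet-stream"

print(guess_content_type("index.html"))    # text/html
print(guess_content_type("logo.svg"))      # image/svg+xml
print(guess_content_type("data.unknown"))  # application/octet-stream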

Requirements

  • Python 3.9
  • boto3 library
  • brotli library
  • AWS credentials configured, typically via environment variables or the AWS CLI (see the snippet below)
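
A quick way to confirm that boto3 can resolve your credentials (a minimal check, not part of s3autocp itself):

import boto3

# boto3 resolves credentials automatically, trying environment variables
# (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY), the shared credentials file
# (~/.aws/credentials), and finally instance/role metadata.
session = boto3.Session()
print(session.get_credentials() is not None)  # True if credentials were found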

Installation

Make sure you have Python 3 installed. Then install s3autocp from PyPI:

pip install s3autocp

Development

For development, install the runtime dependencies directly:

pip install boto3 brotli

Usage

s3autocp [-c/--compress] <source_directory> <destination_s3_url>

  • -c/--compress: Enable compression for eligible file types.
  • <source_directory>: The local directory you wish to copy to S3.
  • <destination_s3_url>: The S3 URL where files will be uploaded, in the format s3://bucket-name/path.

Example

s3autocp ./my-local-dir s3://my-bucket/my-path

This command will copy all files from ./my-local-dir to the S3 bucket my-bucket under the my-path directory.
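
To also compress eligible files with Brotli and Gzip, add the -c flag:

s3autocp -c ./my-local-dir s3://my-bucket/my-path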

How It Works

  1. The script scans the source directory recursively for files.
  2. It determines the MIME type of each file.
  3. If the -c/--compress flag is set, it compresses eligible files using Brotli and Gzip.
  4. It uploads the files to the specified S3 bucket with appropriate Content-Type and Cache-Control headers (see the sketch below).
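
Conceptually, the pipeline resembles the sketch below. This is a simplified illustration rather than s3autocp's actual source: the helper name, the one-hour Cache-Control value, the .br/.gz key suffixes, and the set of compressible types are all assumptions made for brevity.

import gzip
import mimetypes
import os

import boto3
import brotli

# Assumed set of MIME types worth compressing; the real tool may differ.
COMPRESSIBLE = {"text/html", "text/css", "application/javascript", "image/svg+xml"}

def upload_dir(src_dir: str, bucket: str, prefix: str, compress: bool = False) -> None:
    s3 = boto3.client("s3")
    for root, _dirs, files in os.walk(src_dir):        # 1. scan recursively
        for name in files:
            path = os.path.join(root, name)
            rel = os.path.relpath(path, src_dir).replace(os.sep, "/")
            key = f"{prefix}/{rel}" if prefix else rel
            ctype, _ = mimetypes.guess_type(path)      # 2. guess the MIME type
            ctype = ctype or "application/octet-stream"
            with open(path, "rb") as f:
                body = f.read()
            extra = {"ContentType": ctype, "CacheControl": "max-age=3600"}
            if compress and ctype in COMPRESSIBLE:     # 3. compress eligible files
                s3.put_object(Bucket=bucket, Key=key + ".br",
                              Body=brotli.compress(body), ContentEncoding="br", **extra)
                s3.put_object(Bucket=bucket, Key=key + ".gz",
                              Body=gzip.compress(body), ContentEncoding="gzip", **extra)
            s3.put_object(Bucket=bucket, Key=key, Body=body, **extra)  # 4. upload

Invoked as upload_dir("./my-local-dir", "my-bucket", "my-path", compress=True), this mirrors the CLI example above.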

Contributing

Contributions, issues, and feature requests are welcome. Feel free to check the issues page if you want to contribute.

License

Distributed under the Apache License. See LICENSE for more information.


Download files

Download the file for your platform.

Source Distribution

s3autocp-0.4.0.tar.gz (10.2 kB)


Built Distribution

s3autocp-0.4.0-py3-none-any.whl (10.5 kB)


File details

Details for the file s3autocp-0.4.0.tar.gz.

File metadata

  • Download URL: s3autocp-0.4.0.tar.gz
  • Upload date:
  • Size: 10.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for s3autocp-0.4.0.tar.gz

  • SHA256: 001c09d2560ba8dd06dd62514108a9b456372e21b41fdc777606b73b21721005
  • MD5: 11740bf1b987f7d5646b1eb94f2ba5ce
  • BLAKE2b-256: 4de1753aa007f0df7cb9b0e723c314190748c572389061474a5a0c234183f250
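
To verify a downloaded archive against the published SHA256 value, a few lines of standard-library Python suffice:

import hashlib

# Compute the SHA256 digest of the downloaded file and compare it
# against the value listed above.
with open("s3autocp-0.4.0.tar.gz", "rb") as f:
    print(hashlib.sha256(f.read()).hexdigest())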


File details

Details for the file s3autocp-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: s3autocp-0.4.0-py3-none-any.whl
  • Upload date:
  • Size: 10.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for s3autocp-0.4.0-py3-none-any.whl

  • SHA256: 2889ab932e398dc4e28e5abb9acfed63e022aca676881c90232d5ea552f11c96
  • MD5: 2e6a8a02209f32d8117bee6774fdf516
  • BLAKE2b-256: 23fc804fd3634728bdfe5ed7d619e0e240d023d446f623414404f893164af3f6

