
Upload directories to AWS S3

Project description



Upload whole directories or distinct files to AWS S3 using s3push in command line. Extensive support for different credential sources.

Project goal

This tool facilitates painless upload of directories to AWS S3. The initial goal was to provide developers with a simple tool that allows them to upload whole directories to S3 with minimal effort. The initial project vision also included an option to create and configure a fresh S3-hosted website from scratch.

However, the tool was written after only a very quick research phase and, as a result, suffered from the "Not invented here" syndrome. After more careful research it was discovered that not only do other directory-uploading tools exist, but there are also far superior instruments that additionally allow configuration of buckets and CloudFront CDN (e.g., s3cmd). See the rough feature set comparison in the docs.

As a post-mortem conclusion: proper research must be performed prior to the start of development. For example, before developing a new library or tool, it can be very useful to make a simple feature set comparison.

The project will, however, continue to exist as a demonstration of a continuous delivery setup for Python package development. The project's environment is designed to support fully automated releases and testing.


Installation

The recommended installation method is via Pipenv:

pipenv install s3push

Alternatively, install via pip:

pip install s3push


Usage

Publish a directory with default credentials

s3push ~/my-website/

Publish a file using default credentials

s3push ~/my-website/index.html

Publish with provided key pair

s3push ~/my-website/ -k XXXXXXXXXXXXXXXXXXXX -s xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

Publish with saved profile by providing its name

s3push ~/my-website/ -p my-deployment-profile

Note: see all possible options for specifying credentials below.
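Conceptually, a directory upload like the examples above boils down to walking the local tree and sending each file to S3 under its relative path. The sketch below is a hypothetical illustration (`iter_upload_pairs` is not part of the actual s3push code) of how local paths map to S3 object keys:

```python
from pathlib import Path

def iter_upload_pairs(local_dir):
    """Yield (local_path, s3_key) pairs for every file under local_dir.

    Hypothetical helustration of a directory upload's key mapping;
    not the actual s3push implementation.
    """
    root = Path(local_dir)
    for path in sorted(root.rglob("*")):
        if path.is_file():
            # S3 keys always use forward slashes, regardless of platform.
            yield path, path.relative_to(root).as_posix()

# With boto3 installed, each pair would then be pushed with something like:
#   boto3.client("s3").upload_file(str(local_path), bucket, key)
```

For `~/my-website/` containing `index.html` and `css/style.css`, this would yield the keys `index.html` and `css/style.css`.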

Priority of credentials providers

  1. Passing credentials as optional arguments: -k AWS_ACCESS_KEY_ID, -s AWS_SECRET_ACCESS_KEY.
  2. Passing profile name of pre-configured credentials as optional argument: -p PROFILE_NAME.
  3. Environment variables, as listed in the boto3 guide.
  4. Default credentials in the shared credential file (~/.aws/credentials).
  5. Default credentials in the AWS config file (~/.aws/config).
  6. Boto2 config file (/etc/boto.cfg and ~/.boto).
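The provider chain above can be sketched in plain Python. The helper below is a hypothetical illustration of the resolution order, not the actual s3push or boto3 code:

```python
import os
from pathlib import Path

def resolve_credentials(key=None, secret=None, profile=None):
    """Illustrative sketch of the credential provider priority above.

    Hypothetical helper; the real resolution is delegated to boto3.
    """
    # 1. Explicit key pair passed on the command line (-k / -s).
    if key and secret:
        return {"source": "cli", "key": key, "secret": secret}
    # 2. Named profile (-p); real code would look it up in the shared files.
    if profile:
        return {"source": "profile", "profile": profile}
    # 3. Environment variables, as boto3 does.
    env_key = os.environ.get("AWS_ACCESS_KEY_ID")
    env_secret = os.environ.get("AWS_SECRET_ACCESS_KEY")
    if env_key and env_secret:
        return {"source": "env", "key": env_key, "secret": env_secret}
    # 4-6. Fall back to the shared credential/config files, in order.
    for candidate in ("~/.aws/credentials", "~/.aws/config",
                      "/etc/boto.cfg", "~/.boto"):
        if Path(candidate).expanduser().exists():
            return {"source": candidate}
    return None
```

Note how an explicit key pair short-circuits everything else, which matches why `-k`/`-s` sit at the top of the list.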

Continuous Delivery

Both the master and dev branches are continuously built and tested on Travis-CI and CircleCI.

Project details

Download files


Files for s3push, version 2019.10.14

| Filename | Size | File type | Python version |
|----------|------|-----------|----------------|
| s3push-2019.10.14-py3-none-any.whl | 6.0 kB | Wheel | py3 |
| s3push-2019.10.14.tar.gz | 5.1 kB | Source | None |
