d3ploy

Easily deploy to S3 with multiple environment support. Version 4 supports Python 3.7+.

Installation & Usage

To install, run pip install d3ploy. To use, run d3ploy. Additional arguments may be specified; run d3ploy --help for more information. If you're using the excellent uv, you can run uvx d3ploy without needing to install d3ploy.
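
For example, a typical first session might look like this:

# Install d3ploy into the current Python environment
pip install d3ploy

# See all available arguments
d3ploy --help

# Deploy the "default" environment defined in .d3ploy.json
d3ploy

# Or, with uv installed, run it without installing d3ploy
uvx d3ploy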

Authentication

Your AWS credentials can be set in a number of ways:

  1. In a ".boto" file in your home folder. See Boto's documentation for how to create this file.
  2. In a ".aws" file in the folder you're running d3ploy in; this follows the same format as ".boto".
  3. In the environment variables "AWS_ACCESS_KEY_ID" and "AWS_SECRET_ACCESS_KEY".
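
For example, to use the environment variable approach (option 3) in a shell session, export both variables before invoking d3ploy; the values below are placeholders:

export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
d3ploy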

Configuration options

When you run d3ploy, it looks in the current directory for a ".d3ploy.json" file that defines the different deploy environments and their options. At a minimum, a "default" environment is required; it is the environment used if you pass no arguments to d3ploy. Additionally, you may pass a different path for your config file with the -c or --config option.

To suppress all output, pass -q or --quiet to the command. Note that there is no way to set the quiet option in the config file(s).

To set the number of separate processes to use, pass -p 10 or --processes 10, where 10 is the number of processes to use. If you do not want to use multiple processes, set this to 0.

You can add as many environments as needed. Deploy to an environment by passing its key, like d3ploy staging. As of version 3.0, environments no longer inherit settings from the default environment. Instead, a separate "defaults" object in the config file can be used to set options across all environments.
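
Putting those flags together, a few representative invocations (the staging environment matches the example config below; the alternate config path is illustrative):

# Deploy the default environment with all output suppressed
d3ploy --quiet

# Deploy the staging environment using 10 upload processes
d3ploy staging -p 10

# Read environments from a config file other than .d3ploy.json
d3ploy -c path/to/other-config.json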

The only required option for any environment is "bucket_name" for the S3 bucket to upload to. Additionally, you may define:

  • "local_path" to upload only the contents of a directory under the current one; defaults to "." (current directory)
  • "bucket_path" to upload to a subfolder in the bucket; defaults to "/" (root)
  • "exclude" to specify patterns to not upload
  • "acl" to specify the canned ACL set on each object. See the AWS docs for more.
  • "delete" to remove files on S3 that are not present in the local directory
  • "charset" to set the charset flag on 'Content-Type' headers of text files
  • "caches" to set the Cache-Control header for various mimetypes. See below for more.
  • "gitignore" to add all entries in a .gitignore file to the exclude patterns
  • "cloudfront_id" to invalidate all paths in the given CloudFront distribution IDs. Can be a string for one distribution or an array for multiple.

Example .d3ploy.json

{
  "environments": {
    "default": {
      "bucket_name": "d3ploy-tests",
      "local_path": "./tests/files",
      "bucket_path": "/default/"
    },
    "staging": {
      "bucket_name": "d3ploy-tests",
      "local_path": "./tests/files",
      "bucket_path": "/staging/"
    }
  },
  "defaults": {
    "caches": {
      "text/javascript": 2592000,
      "image/gif": 22896000,
      "image/jpeg": 22896000,
      "image/png": 22896000,
      "image/webp": 22896000,
      "text/*": 2592000,
      "text/html": 0,
      "text/plain": 0
    }
  }
}
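
For completeness, here is a hypothetical environment that exercises more of the optional keys described above. The bucket name, exclude patterns, and CloudFront distribution ID are illustrative, and the value types are inferred from the option descriptions (for instance, "gitignore" is assumed to take a boolean):

{
  "environments": {
    "production": {
      "bucket_name": "example-bucket",
      "local_path": "./public",
      "bucket_path": "/",
      "exclude": ["*.map", ".DS_Store"],
      "gitignore": true,
      "acl": "public-read",
      "delete": true,
      "charset": "UTF-8",
      "cloudfront_id": ["E1234567890ABC"]
    }
  }
}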

Cache-Control Headers

If you want to set Cache-Control headers on various files, add a "caches" object to your config file like:

"caches": {
  "text/javascript": 2592000,
  "image/gif": 22896000,
  "image/jpeg": 22896000,
  "image/png": 22896000,
  "image/webp": 22896000,
  "text/*": 2592000,
  "text/html": 0,
  "text/plain": 0
}

Each key is the mimetype of the kind of file you want to have cached; the value is the number of seconds to set the max-age flag to. In the above example, CSS, JavaScript, and other text files will be cached for 30 days (2592000 seconds = 30 × 24 × 60 × 60), images for about 265 days (22896000 seconds), and HTML and plain-text files will not be cached at all. For more about Cache-Control, read Leverage Browser Caching. You may use wildcards like image/* to apply to all images. If there's a more specific match for a particular image type, it will override the wildcard. For example:

"caches": {
  "image/png": 300,
  "image/*": 31536000
}

In this case, JPGs, GIFs, and all other images except PNGs will be cached for 1 year (31536000 seconds); PNGs, however, will be cached for only 5 minutes (300 seconds).

Progress Bar

d3ploy uses the tqdm module to display progress output when --quiet is not set.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

d3ploy-4.4.2.tar.gz (5.8 MB)


Built Distribution

d3ploy-4.4.2-py3-none-any.whl (11.3 kB)


File details

Details for the file d3ploy-4.4.2.tar.gz.

File metadata

  • Download URL: d3ploy-4.4.2.tar.gz
  • Upload date:
  • Size: 5.8 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for d3ploy-4.4.2.tar.gz:

  • SHA256: de35da05a77094316accaf96fb9d8687d6d3cf0f81a75e493f5f835f0308cde4
  • MD5: 7c306b221644b9bccfd8a4d79f92b00c
  • BLAKE2b-256: cde55c14366c59145d14181ff697e5514f171b624de935e89386f6815672eb89

See more details on using hashes here.
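
If you want to check a downloaded archive against these digests yourself, one way to do so (assuming a Unix-like system with coreutils) is:

# Print the SHA256 digest of the downloaded file for manual comparison
sha256sum d3ploy-4.4.2.tar.gz

# Or let sha256sum do the comparison against the digest listed above
echo "de35da05a77094316accaf96fb9d8687d6d3cf0f81a75e493f5f835f0308cde4  d3ploy-4.4.2.tar.gz" | sha256sum --check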

Provenance

The following attestation bundles were made for d3ploy-4.4.2.tar.gz:

Publisher: release_new_tags.yml on dryan/d3ploy


File details

Details for the file d3ploy-4.4.2-py3-none-any.whl.

File metadata

  • Download URL: d3ploy-4.4.2-py3-none-any.whl
  • Upload date:
  • Size: 11.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for d3ploy-4.4.2-py3-none-any.whl:

  • SHA256: 33924dce72c6f1f4e68e271cd4797141386934a56607e8c2e93225cdf5773e7c
  • MD5: aeb79cecf4a8f442361ad01c018d1bfd
  • BLAKE2b-256: 58f9582b2fcf5e471d0bd250e451a0e3570bed20ba40b6305a367eddc036d525

See more details on using hashes here.

Provenance

The following attestation bundles were made for d3ploy-4.4.2-py3-none-any.whl:

Publisher: release_new_tags.yml on dryan/d3ploy

