
Duplicity Backup to S3


Duplicity backup to S3 for production servers using simple yaml file.

License

Free software: Apache Software License 2.0

Features

This is a duplicity command-line backup wrapper that backs up to S3, driven by a validated YAML configuration file, using modern and awesome CLI patterns. The commands incr (incremental backup), list, status, verify, cleanup, remove and init are implemented.

The primary use case for building this (yet another) CLI wrapper for duplicity is to be able to deploy the command in production, inject it into cron.daily, and have hands-off automated backups of production servers. We use it at KE-works to automate our production server backups to Amazon S3.

OS Dependencies
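
This tool is a wrapper: it shells out to the duplicity binary, which must be installed separately together with an S3 backend library such as boto3. A minimal sketch for Debian/Ubuntu follows (the package names are an assumption; check your distribution):

```shell
# install duplicity and the boto3 S3 backend on Debian/Ubuntu (package names assumed):
#   sudo apt-get install duplicity python3-boto3

# verify that the duplicity binary the wrapper needs is on the PATH
if command -v duplicity >/dev/null 2>&1; then
    duplicity --version
else
    echo "duplicity not found: install it with your OS package manager" >&2
fi
```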

Installation

You can install this as a system command on any operating system supporting Python 3.6 or later.

To install as global command from PyPI:

sudo python3 -m pip install duplicity-backup-s3

To install for your user only from PyPI, enter the following command:

python3 -m pip install --user duplicity-backup-s3

To install from the git repository (latest master branch):

python3 -m pip install --user git+https://github.com/jberends/duplicity_backup.git#egg=duplicity_backup_s3
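
After any of the above, you can sanity-check the installation. Note that for `--user` installs the entry point lands in `~/.local/bin`, which must be on your PATH (a sketch; the PyPI name is taken from the install command above):

```shell
# show the installed package metadata, or report that it is missing
python3 -m pip show duplicity-backup-s3 || echo "not installed yet"

# for --user installs, ~/.local/bin must be on PATH for the command to resolve
command -v duplicity_backup_s3 || echo 'hint: add ~/.local/bin to your PATH'
```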

First use

Before first use, you need to create a configuration YAML file. You can use the helper command init for that. Use the built-in help function for your enjoyment.

# help is nigh
duplicity_backup_s3 --help

# and to init the configuration YAML file
duplicity_backup_s3 init

It will drop a duplicity_backup_s3.yaml file in your current directory, which may look like this:

aws:
  AWS_ACCESS_KEY_ID: foobar_aws_key_id  # your amazon S3 user that has write right to a backup bucket
  AWS_SECRET_ACCESS_KEY: foobar_aws_access_key  # your amazon S3 user secret
backuproot: /home  # the backup 'root' path. Everything underneath is considered for backup.
excludes:
  - "**"  # a list of exclude paths. May be '**' to exclude everything except what you include
includes:
  - /home/Pictures  # a list of includes, which are full paths
  - /home/Music
remote:
  bucket: '<an_s3_bucket>'  # S3 bucket name
  path: '__test'  # subpath within the bucket
full_if_older_than: 7D  # default is incremental; a full backup is made every 7 days

You can alter the configuration file to your liking. The command will check the configuration for validity and tell you what went wrong and what you need to correct. If you made mistakes, it can be helpful to check out the duplicity man page for more information on that topic. However, we tried to be as verbose as possible to guide you in the right direction.

First backup

To perform your first backup, which is a full one, use the following command:

duplicity_backup_s3 incr --verbose

# or if the config is somewhere else
duplicity_backup_s3 incr --config /path/to/configuration.yaml

That might take a while, depending on the size of the backup. You can watch the volumes being uploaded to your configured S3 bucket in the S3 console.

To check the backup collection, and to list and verify the contents of the backup, you may use:

# collection status
duplicity_backup_s3 status

# list all files
duplicity_backup_s3 list

# verify backup
duplicity_backup_s3 verify

Remove old backups

To remove older backups, duplicity provides several commands; we consolidated them under the remove command.

# to remove backups older than 7D
duplicity_backup_s3 remove --older-than 7D

# to remove older backups, keeping the last 4 full backups
duplicity_backup_s3 remove --all-but-n-full 4

Restore backups

To restore backups, we implemented the restore command.

# to restore backups from yesterday to the current directory
duplicity_backup_s3 restore --time 1D

# to restore specific subdirectory from a specific date/time to a custom directory
duplicity_backup_s3 restore --dir specific_subdir \
    --time 2020-12-08T22:22:00+01:00 --target ~/a_restoredir
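
The --time option accepts duplicity's time formats: an interval such as 1D (one day ago), or an ISO 8601 timestamp as in the example above. With GNU date you can generate such a timestamp (a sketch, assuming GNU coreutils; the -d flag differs on BSD/macOS):

```shell
# generate an ISO 8601 timestamp for "yesterday", suitable for --time
date -d 'yesterday' '+%Y-%m-%dT%H:%M:%S%:z'
```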

Using this as daily backup in a cronjob

To use this in a daily cron job, you can edit the crontab for the user root:

crontab -u root -e

You can alter the crontab in the following way:

# Daily backup and remove older backup
7 4 * * * /bin/duplicity_backup_s3 incr --config=/path/to/conf.yaml && /bin/duplicity_backup_s3 remove --older-than 7D --config=/path/to/conf.yaml
# | | | | +- the command to execute
# | | | +--- day of the week (0-6) Sunday=0 (*=every day)
# | | +----- month of the year (*=every month)
# | +------- day of the month (*=every day)
# +--------- hour of the day
#----------- minute in the hour
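
Since cron.daily is the intended deployment target, an alternative to editing the crontab is a drop-in script (a sketch; the paths and log location are assumptions, the script must be executable, and run-parts ignores files with an extension):

```shell
#!/bin/sh
# /etc/cron.daily/duplicity-backup-s3  (hypothetical drop-in script)
CONFIG=/path/to/conf.yaml
LOG=/var/log/duplicity_backup_s3.log

# incremental backup, then prune anything older than 7 days; append all output to a log
/bin/duplicity_backup_s3 incr --config="$CONFIG" >>"$LOG" 2>&1 \
  && /bin/duplicity_backup_s3 remove --older-than 7D --config="$CONFIG" >>"$LOG" 2>&1
```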

Custom Endpoints

You can configure a custom endpoint and additional arguments in the configuration YAML file. The custom endpoint is set in the remote > uri section, and additional arguments that are passed directly to duplicity can be configured in the extra_args section as a list in the YAML file.

...
remote:
  uri: "s3://ams3.digitaloceanspaces.com/bucketname/subpath"
extra_args:
  - --some
  - --additional
  - --arguments
  - --here=3

TODO

  • implement appdirs for default configuration file placement
  • implement restore for restoring
  • Allow for custom s3 storage endpoints. Included in v1.2.0 with thanks to @denismatveev
  • If requested migrate --s3-european-buckets to configuration file
  • If requested implement GPG/Encryption capabilities. Possibly reusing code of kecpkg-tools to manage certificates. Included in v1.2.0 with thanks to @denismatveev

Credits

  • This package was inspired by the great work done by the duplicity team, back in the day.
  • This package was inspired by the great amount of bash code by the duplicity_backup.sh project.
  • This package is deeply thankful for the great work done by the authors and contributors behind the Click project, packing tons of CLI awesomeness since 2014.
  • This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.

Changelog

v1.2.1 (31JAN23)

  • :+1: Reinstated Python 3.6 compatibility to keep it compatible with older CentOS 7/RHEL installations that ship Python 3.6.

v1.2.0 (23JAN23)

This release is named "The Other S3 As Well". We now support S3 storage providers other than Amazon, so you can use DigitalOcean (tested) or even Dropbox, for that matter, as remote storage for your backups.

  • :star: Added the option extra_args in the configuration yaml. These extra_args may be provided as a list in the yaml and they are passed down to duplicity. See the duplicity documentation for the exact use of them. (#8)
  • :star: Added the option to provide the full remote URI in the YAML section remote > uri. This circumvents the construction of the URI from endpoint, bucket and path, and ensures that, when you know what you are doing, a lot of different storage providers may be used. See the duplicity help documentation on how to provide the remote URI correctly. (#8)
  • :star: Added GPG encryption possibilities. Thanks to @denismatveev. (#5)
  • :star: Added the ability to use custom S3 endpoints. Now it does not default to amazon S3 and you can use S3 compatible storage targets such as digitalocean (tested on Digital Ocean). Thanks to @denismatveev. (#5)
  • :+1: Improved the init command with a --quiet flag for a less chatty initialisation of the config file. (#8)
  • :+1: Thorough spring cleanup. Dropped support for Python releases below 3.7. Black-ened, pyupgrade-ed, isort-ed the codebase. Fixed tests and added tests for newer python releases. Switched to dependabot for dependency management. (#8)

v1.1.0 (8DEC20)

  • added restore command implementation.

v1.0.2 (3APR20)

  • Added appdirs to the setup.py requirements.

v1.0.1 (UNRELEASED)

Not released to the public.

v1.0.0 (16DEC19)

First production release.

  • implemented appdirs, such that the configuration file can be safely placed and located from a known configuration directory on disk.
  • added remove command to remove collections from the backup target after a specified time. Please consult the duplicity_backup_s3 remove --help documentation for guidance.
  • added init command to initialise the configuration in an interactive fashion for users.
  • various development improvements, e.g. GitHub actions for testing and publishing to PyPI; removed all flake8 warnings and pydocstyle errors. Added pre-commit hooks. Code is A++ grade now.

v0.5.0 (5JUN19)

First public release.

  • commands incr, list, status, verify, cleanup implemented.
  • added yaml schema check for the configuration file.

v0.2.0 (3JUN19)

Internal release.

  • migrated to a command structure. Now offers incr and init.

v0.1.0 (3JUN19)

Internal release.

  • First release on PyPI.
