Rsync-like utility for backing up files/folders to AWS Glacier

glacier-rsync backs up files and folders to AWS Glacier. It can compress files before storing them on Glacier, and it records the resulting archive ids in an SQLite database.

You must log in to AWS with the AWS CLI and create a Glacier vault beforehand.
Run params:

$ grsync --help
usage: grsync version 0.3.5 [-h] [--loglevel {CRITICAL,FATAL,ERROR,WARN,WARNING,INFO,DEBUG,NOTSET}]
                            [--db db] --vault vault --region region [--compress COMPRESS]
                            [--part-size PART_SIZE] [--desc desc] src

Rsync like glacier backup util

positional arguments:
  src                   file or folder to generate archive from

optional arguments:
  -h, --help            show this help message and exit
  --loglevel {CRITICAL,FATAL,ERROR,WARN,WARNING,INFO,DEBUG,NOTSET}
                        log level (default: INFO)
  --db db               database file to store sync info (default: glacier.db)
  --vault vault         Glacier vault name (default: None)
  --region region       Glacier region name (default: None)
  --compress COMPRESS   Enable compression. Only zstd is supported (default: False)
  --part-size PART_SIZE
                        Part size for compression (default: 1048576)
  --desc desc           A description for the archive that will be stored in Amazon Glacier (default: None)
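Putting the options above together, a typical run might look like the following. The vault name, region, and paths are example values, not defaults; substitute your own.

```shell
# One-time setup: create the vault ("-" means the current AWS account)
aws glacier create-vault --account-id - --vault-name my-backups

# Back up a folder with zstd compression, keeping sync state in a local db
# (vault name, region, and paths below are placeholders)
grsync --vault my-backups --region eu-west-1 \
       --compress True --db /var/backups/grsync.db \
       --desc "photo archive" /data/photos
```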
If compression is enabled, the file is read and compressed on the fly and uploaded to Glacier as a multipart upload.
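The read-compress-upload flow can be sketched roughly as below. This is not grsync's actual code: the chunking helper is hypothetical, and zlib stands in for zstd so the sketch runs without the third-party zstandard package.

```python
import zlib


def compressed_parts(path, part_size=1048576):
    """Read a file in chunks, compress on the fly, and yield parts of at
    most part_size bytes, without ever holding the whole file in memory."""
    comp = zlib.compressobj()  # stand-in for a zstd streaming compressor
    buf = b""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(part_size)
            if not chunk:
                break
            buf += comp.compress(chunk)
            while len(buf) >= part_size:
                yield buf[:part_size]
                buf = buf[part_size:]
    buf += comp.flush()
    while buf:
        yield buf[:part_size]
        buf = buf[part_size:]
```

In a real uploader, each yielded part would be sent with boto3's Glacier `upload_multipart_part` call, with the byte range tracked across parts.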
SQLite database schema:

CREATE TABLE sync_history (
    id integer primary key,
    path text,         /* full path of the backed up file */
    file_size integer, /* size of the file */
    mtime float,       /* modification time */
    archive_id text,   /* archive id generated by Glacier */
    location text,     /* archive url generated by Glacier */
    checksum text,     /* checksum of the archive generated by Glacier */
    compression text,  /* compression algorithm used; NULL if none */
    timestamp text     /* backup timestamp */
);
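The schema above suggests how the rsync-like decision works: a file can be skipped when a row with the same path, size, and mtime already exists. A minimal sketch of that check (not grsync's actual code; the column names match the schema, the helper functions are assumptions):

```python
import sqlite3

SCHEMA = """CREATE TABLE IF NOT EXISTS sync_history (
    id integer primary key,
    path text, file_size integer, mtime float,
    archive_id text, location text, checksum text,
    compression text, timestamp text)"""


def needs_upload(con, path, size, mtime):
    """Return True if this (path, size, mtime) combination is not yet backed up."""
    row = con.execute(
        "SELECT 1 FROM sync_history WHERE path=? AND file_size=? AND mtime=?",
        (path, size, mtime)).fetchone()
    return row is None


def record_backup(con, path, size, mtime, archive_id):
    """Remember a successful upload so the file is skipped next run."""
    con.execute(
        "INSERT INTO sync_history (path, file_size, mtime, archive_id, timestamp) "
        "VALUES (?, ?, ?, ?, datetime('now'))",
        (path, size, mtime, archive_id))
    con.commit()
```

Note that the check keys on the absolute path, which is why (per Known Issues) moving a file causes a re-upload.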
Do not lose your database!
Currently, there is no way to rebuild it from the AWS inventory.
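Since the database is irreplaceable, it is worth copying it somewhere safe after every run. One way to copy it safely, even while it is open elsewhere, is Python's standard sqlite3 online-backup API (this is a general technique, not a grsync feature):

```python
import sqlite3


def backup_db(src_path, dest_path):
    """Safely copy an SQLite database using SQLite's online backup API,
    which is consistent even if another process has the db open."""
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    with dest:
        src.backup(dest)  # Connection.backup, available since Python 3.7
    src.close()
    dest.close()
```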
Known Issues
- Glacier supports only 1024 bytes of archive description, and I'm currently storing the description in this format:
  grsync|abs_file_path|size|mtime|user_desc
  This is not POSIX-compatible, since POSIX places no limit on filename or full path length. I could put metadata in front of every archive instead, but then the data could be recovered only with this same tool.
- If the absolute file path changes, grsync will treat it as a different file and re-upload it.
- Currently, there is no way to recover the local database, but you can download the inventory with the AWS CLI and download individual files with the help of their descriptions. I may create a tool to re-create the local db from an inventory retrieval, but the first issue has to be addressed before that.
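The description format from the first issue is what makes manual recovery possible: each inventory entry's archive description can be split back into its fields. A sketch of such a parser (a hypothetical helper, not part of grsync), which also shows why a "|" in a file path breaks the format:

```python
def parse_description(desc):
    """Split a grsync archive description of the form
    grsync|abs_file_path|size|mtime|user_desc into a dict.
    Rejects descriptions whose path itself contains '|'."""
    parts = desc.split("|")
    if len(parts) != 5 or parts[0] != "grsync":
        raise ValueError("not a grsync description: %r" % desc)
    _tag, path, size, mtime, user_desc = parts
    return {"path": path, "size": int(size),
            "mtime": float(mtime), "desc": user_desc}


# A path containing '|' splits into six fields and is rejected:
# parse_description("grsync|/odd|name|10|1.0|note")  -> ValueError
```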
File details

Details for the file glacier-rsync-0.3.6.tar.gz

File metadata
- Download URL: glacier-rsync-0.3.6.tar.gz
- Size: 7.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.9.1

File hashes

Algorithm | Hash digest
---|---
SHA256 | dec54b91d82c84256ef0d1c1ee0458a0af2064da928d05487be4184569002b0e
MD5 | daad46d3144f4641b5a027d7e2bb874c
BLAKE2b-256 | d3ab058e0bc55dd7103db92c39ea4edafc3e07affdd23b1f059dfc2d5f985f15
File details

Details for the file glacier_rsync-0.3.6-py3-none-any.whl

File metadata
- Download URL: glacier_rsync-0.3.6-py3-none-any.whl
- Size: 15.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.25.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.9.1

File hashes

Algorithm | Hash digest
---|---
SHA256 | 8cd4ff9abda7dd97fefccf79cb63283b2dcae858fe65c0e4c199511c2900565e
MD5 | 416a810c1e0d3d2dc4ddfd011346a7cb
BLAKE2b-256 | 37f92957b6480e227fb1e1a76a8475eece5aa1fdd972568856ae8e18bcf24887