This recipe backs up Plone data and sends the backups to Amazon S3.
Information
Code repository: https://bitbucket.org/cmscom/c2.recipe.bkups3
Questions and comments to terada@cmscom.jp
Report bugs at https://bitbucket.org/cmscom/c2.recipe.bkups3/issues
Based on http://pypi.python.org/pypi/collective.recipe.backup
Note
If you are not already using repozo, you need to add the following to buildout.cfg, because the backup is done through repozo via the bin/backup script.
buildout.cfg
[repozo]
recipe = zc.recipe.egg
eggs = ZODB3
scripts = repozo
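After running the buildout, this part generates a bin/repozo script. The bkups3 backup script drives repozo for the filestorage backup; the options it passes look roughly like the following (paths depend on your buildout; compare the mock output further below):

bin/repozo --backup -f var/filestorage/Data.fs -r var/backups --gzip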
Simple usage
Modify buildout.cfg
[buildout]
parts =
    ...
    bkups3

[bkups3]
recipe = c2.recipe.bkups3
use_s3 = true
aws_id = xxxxxxxxxxxxx
aws_key = xxxxxxxxxxxxxxxxxxxxxxxxxx
bucket_name = xxxxxxxxxx
blob_store_len = 7
Run the buildout
bin/buildout -N
Now you can run the backup script
bin/bkups3
This backs up the filestorage to var/backups and the blobstorage to var/blobbackups, and sends the backups to your Amazon S3 bucket.
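For the S3 step, the recipe uses the aws_id, aws_key and bucket_name options to upload the backup files. The sketch below shows roughly what such an upload looks like with the boto library; the upload_backups helper is purely illustrative and is not the recipe's actual code:

# Rough sketch of uploading backup files to S3 with boto
# (illustrative only, not the recipe's actual implementation).
import os
from boto.s3.connection import S3Connection

def upload_backups(backup_dir, aws_id, aws_key, bucket_name):
    """Upload every regular file in backup_dir to the given S3 bucket."""
    conn = S3Connection(aws_id, aws_key)
    bucket = conn.get_bucket(bucket_name)  # assumes the bucket already exists
    for name in sorted(os.listdir(backup_dir)):
        path = os.path.join(backup_dir, name)
        if os.path.isfile(path):
            key = bucket.new_key(name)
            key.set_contents_from_filename(path)

# Example: upload the filestorage backups created by bin/bkups3.
upload_backups('var/backups', 'xxxxxxxxxxxx', 'xxxxxxxxxxxxxxxxxxxxx', 'xxxxxxxx')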
Detailed Documentation
Supported options
The recipe supports the following options:
- blob_bk_dir_name
Name of the blob backup directory. Default: blobbackups
- use_s3
true to send backups to Amazon S3, false to keep them local only
- aws_id
Your AWS access key
- aws_key
Your AWS secret key
- bucket_name
Name of the Amazon S3 bucket to use; bucket names must be unique across S3
- blob_store_len
Number of blob backups to keep. Default: 1 (see the sketch after this list)
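For example, blob_store_len = 7 keeps the seven most recent blob backups and removes older ones. The sketch below illustrates that kind of rotation; prune_blob_backups is a hypothetical helper, not the recipe's actual code:

# Minimal sketch of rotating blob backups (illustrative only).
import os
import shutil

def prune_blob_backups(backup_dir, blob_store_len):
    """Keep only the newest blob_store_len entries in backup_dir."""
    entries = sorted(
        (os.path.join(backup_dir, name) for name in os.listdir(backup_dir)),
        key=os.path.getmtime,
        reverse=True,
    )
    for old in entries[blob_store_len:]:
        if os.path.isdir(old):
            shutil.rmtree(old)
        else:
            os.remove(old)

# Example: keep the 7 newest blob backups in var/blobbackups.
prune_blob_backups('var/blobbackups', 7)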
We’ll use all options:
>>> write('buildout.cfg',
... """
... [buildout]
... parts = bkups3
...
... [bkups3]
... recipe = c2.recipe.bkups3
... blob_bk_dir_name = blobbackups
... use_s3 = true
... aws_id = xxxxxxxxxxxx
... aws_key = xxxxxxxxxxxxxxxxxxxxxxxxx
... bucket_name = xxxxxxxx
... """)
>>> print system(buildout) # doctest:+ELLIPSIS
Installing bkups3.
backup: Created /sample-buildout/var/backups/blobstorage
Generated script '/sample-buildout/bin/bkups3'.
Example usage
Just to isolate some test differences, we run an empty buildout once:
>>> ignore = system(buildout)
We’ll start by creating a buildout that uses the recipe:
>>> write('buildout.cfg',
... """
... [buildout]
... parts = bkups3
...
... [bkups3]
... recipe = c2.recipe.bkups3
... use_s3 = true
... """)
Running the buildout adds a bkups3 script to the bin/ directory and, by default, creates the var/blobbackups directory:
>>> print system(buildout) # doctest:+ELLIPSIS
Installing bkups3.
backup: Created /sample-buildout/var/backups/blobstorage
Generated script '/sample-buildout/bin/bkups3'.
<BLANKLINE>
>>> ls('var')
d blobbackups
>>> ls('bin')
- bkups3
- buildout
Backup
Calling bin/bkups3 results in a normal repozo backup of the filestorage, a backup of the blobstorage, and an upload to Amazon S3. For the tests we put mock repozo and backup scripts in place that simply print the options they are passed (and make them executable). It is horridly unix-specific at the moment.
>>> import sys
>>> write('bin', 'repozo',
...     "#!%s\nimport sys\nprint ' '.join(sys.argv[1:])" % sys.executable)
>>> dontcare = system('chmod u+x bin/repozo')
>>> write('bin', 'backup',
...     "#!%s\nimport sys\nprint ' '.join(sys.argv[1:])" % sys.executable)
>>> dontcare = system('chmod u+x bin/backup')
By default, backups are placed in var/backups:
>>> print system('bin/bkups3')
--backup -f /sample-buildout/var/filestorage/Data.fs -r /sample-buildout/var/backups --gzip
INFO: Backing up database file: ...
Contributors
Manabu TERADA (terapyon), Author
Change history
1.0b1 (2011-04-14)
Backup filestorage (using bin/backup from collective.recipe.backup)
Backup blobstorage
Send backups to an Amazon S3 bucket
1.0a1 (Unreleased)
Created recipe with ZopeSkel [Manabu TERADA(@terapyon)]