
Backs up files and databases (MySQL, PostgreSQL, MongoDB) to AWS S3

Project Description


Backs up files and databases (MySQL, PostgreSQL, MongoDB) to AWS S3

Version 0.1.16 (2014-11-15)



- backs up any given files/directories
- uses mysqldump to take a complete SQL dump of all MySQL databases
- uses pg_dump to take a complete SQL dump of all PostgreSQL databases
- uses mongodump to take a complete BSON dump of all MongoDB databases
- keeps daily backups for the last 7 days, and keeps the backup from the
  first of every month indefinitely
- choose between the gzip and bzip2 compression algorithms for the fastest
  run or the smallest possible footprint
- notifies any given email address every time a backup completes
- uses the bash shell for prettier log coloring
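The retention rule above (a rolling 7-day window plus every first-of-month backup kept forever) can be sketched in Python; should_keep is a hypothetical helper for illustration, not the tool's actual code:

```python
from datetime import date, timedelta

def should_keep(backup_date, today):
    """Retention policy sketch: keep backups from the last 7 days,
    plus every first-of-month backup indefinitely."""
    if backup_date.day == 1:
        return True
    return (today - backup_date) <= timedelta(days=7)

# Example: judged on 2014-11-15 (this release's date)
today = date(2014, 11, 15)
print(should_keep(date(2014, 11, 10), today))  # True  (within 7 days)
print(should_keep(date(2014, 11, 5), today))   # False (older than 7 days)
print(should_keep(date(2014, 11, 1), today))   # True  (first of the month)
```

Any backup that fails both tests would be pruned on the next run.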

System requirements

It might work with other versions of the software listed below; however,
this is what I've tested with.

- *nix-like environment
- bzip2 or gzip
- Python >= 2.7.3 (the only Python dependency is python-boto)
- MySQL >= 5.0 (optional)
- PostgreSQL >= 9.3 (optional)
- MongoDB >= 2.0.4 (optional)
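Since the database tools are optional, a quick way to see which dump utilities are on your PATH is a snippet like the following (a hypothetical helper, not part of the package; it needs Python 3.3+ for shutil.which, while the tool itself only needs Python 2.7 and python-boto):

```python
import shutil

def check_tools(tools):
    """Map each tool name to its path on PATH, or None if it is missing."""
    return {tool: shutil.which(tool) for tool in tools}

# The dump tools are optional: each is needed only if you back up that DB.
for tool, path in check_tools(("mysqldump", "pg_dump", "mongodump",
                               "gzip", "bzip2")).items():
    print(tool, "->", path or "not found")
```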

Getting started

- install python-boto if you haven't already (on Debian/Ubuntu:
  $ sudo apt-get install python-boto)
- edit the script with your settings and AWS credentials
- add a job to your crontab, and set it to run as often as you want backups taken
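For the crontab step, an entry like the one below runs the backup nightly; the script path, name, and log location are illustrative, so substitute wherever you placed the script:

```shell
# m h dom mon dow  command
# Run the backup every night at 03:00; path and log file are illustrative.
0 3 * * * /opt/backup/backup-to-s3.sh >> /var/log/backup-to-s3.log 2>&1
```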

