Automated creation of backups for Postgres databases
Backup Postgres Database
Basic Usage
This simple Python package lets you easily create backups of Postgres databases. You can upload them to cloud storage buckets, for example from a scheduled cron job.
from postgres_backup import Backup

# Instantiate the backup object with the Postgres database URI
backup = Backup(database_uri)

# Create the backup file
backup.create()
Note that the URI has the following structure: db:engine:[//[user[:password]@][host][:port]/][dbname].
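As a concrete sketch, such a URI can be assembled from its parts. The credential values below are placeholders, and the common postgresql:// engine scheme is assumed:

```python
# Illustrative only: placeholder connection details for a local database.
user = "postgres"
password = "secret"
host = "localhost"
port = 5432
dbname = "mydb"

database_uri = f"postgresql://{user}:{password}@{host}:{port}/{dbname}"
print(database_uri)  # postgresql://postgres:secret@localhost:5432/mydb
```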
Why?
This package has proven to work well for small- to mid-sized databases.
This way, you make sure you can store your database backups without relying on a single cloud provider or region.
Bucket Storage
The package provides the ability to store those backups in cloud buckets.
Google Cloud Storage
To use this functionality, you need to install the package's optional dependencies:
pip3 install "postgres-backup[gcs]"
This will also install the google package.
Then, after the backup has been created, we continue with:
from postgres_backup.schemas import CloudProviders

# Create the backup
backup.create()

# Upload it to Google Cloud Storage
backup.upload(
    provider=CloudProviders.gcs.value,
    google_cloud_credentials=google_cloud_credentials,
)
Here google_cloud_credentials is a dictionary with the key-value pairs of the client API keys:
google_cloud_credentials = {
    "type": "service_account",
    "project_id": "xxx-saas",
    "private_key_id": "xxxxxxxx",
    "private_key": "-----BEGIN PRIVATE KEY-----\nxxxxxxxxxx\n-----END PRIVATE KEY-----\n",
    "client_email": "xxx@xxx-saas.iam.gserviceaccount.com",
    "client_id": "xxx",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://oauth2.googleapis.com/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/xxx%xxx-saas.iam.gserviceaccount.com"
}
It is recommended to provide each key as an environment variable:
- GOOGLE_CLOUD_TYPE -> type
- GOOGLE_CLOUD_PROJECT_ID -> project_id
- GOOGLE_CLOUD_PRIVATE_KEY_ID -> private_key_id
- GOOGLE_CLOUD_PRIVATE_KEY -> private_key
- GOOGLE_CLOUD_CLIENT_EMAIL -> client_email
- GOOGLE_CLOUD_CLIENT_ID -> client_id
- GOOGLE_CLOUD_AUTH_URI -> auth_uri
- GOOGLE_CLOUD_TOKEN_URI -> token_uri
- GOOGLE_CLOUD_AUTH_PROVIDER_X509_CERT_URL -> auth_provider_x509_cert_url
- GOOGLE_CLOUD_CLIENT_X509_CERT_URL -> client_x509_cert_url
You should also set PROJECT_NAME and BUCKET_NAME for the Google bucket, and finally DATABASE_URL for the Postgres database.
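As a sketch, those environment variables can be collected back into the credentials dictionary at runtime. The helper name below is illustrative, not part of the package; the mapping is the one listed above:

```python
import os

def google_cloud_credentials_from_env() -> dict:
    """Illustrative helper: build the credentials dict from env vars."""
    mapping = {
        "type": "GOOGLE_CLOUD_TYPE",
        "project_id": "GOOGLE_CLOUD_PROJECT_ID",
        "private_key_id": "GOOGLE_CLOUD_PRIVATE_KEY_ID",
        "private_key": "GOOGLE_CLOUD_PRIVATE_KEY",
        "client_email": "GOOGLE_CLOUD_CLIENT_EMAIL",
        "client_id": "GOOGLE_CLOUD_CLIENT_ID",
        "auth_uri": "GOOGLE_CLOUD_AUTH_URI",
        "token_uri": "GOOGLE_CLOUD_TOKEN_URI",
        "auth_provider_x509_cert_url": "GOOGLE_CLOUD_AUTH_PROVIDER_X509_CERT_URL",
        "client_x509_cert_url": "GOOGLE_CLOUD_CLIENT_X509_CERT_URL",
    }
    # Raises KeyError if any variable is missing, which surfaces
    # misconfiguration early instead of failing later at upload time.
    return {key: os.environ[var] for key, var in mapping.items()}
```

Reading credentials from the environment keeps the key material out of the source tree and works well with cron and container deployments.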
If we do not yet have a bucket for storing the backups, we can pass additional parameters to create one:
from postgres_backup.schemas import CloudStorageType, CloudProviders

backup.upload(
    provider=CloudProviders.gcs.value,
    bucket_name=bucket_name,
    create_bucket=True,
    storage_class=CloudStorageType.NEARLINE.value,
)
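Putting the pieces together, the create-and-upload steps can run on a schedule, as mentioned above. A minimal sketch, assuming a script named run_backup.py that performs backup.create() and backup.upload() as shown, with an illustrative path and schedule:

```shell
# Illustrative crontab entry: run the backup every day at 02:00.
# Script path, log path, and schedule are assumptions, not package defaults.
0 2 * * * /usr/bin/python3 /opt/backups/run_backup.py >> /var/log/pg_backup.log 2>&1
```

The environment variables listed above must be available to cron, e.g. by exporting them in the crontab file or sourcing an env file inside the script.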