PostgreSQL backup tool for Cloudflare R2 (S3 Compatible)

Postgres-to-R2 Backup (S3-Compatible)

A lightweight automation service that creates scheduled PostgreSQL backups and securely uploads them to S3-compatible object storage such as Cloudflare R2, AWS S3, Wasabi, Backblaze B2, or MinIO.
Designed specifically as a Railway deployment template, with built-in support for Docker and cron scheduling.


✨ Features

  • 📦 Automated Backups — scheduled daily or hourly PostgreSQL backups
  • 🔐 Optional Encryption — gzip compression or 7z encryption with password
  • ☁️ Cloudflare R2 Integration — seamless S3-compatible storage support
  • 🧹 Retention Policy — automatically delete old backups
  • 🔗 Flexible Database URLs — supports private and public PostgreSQL connection URLs
  • ⚡ Optimized Performance — parallel pg_dump and multipart S3 uploads
  • 🐳 Docker Ready — portable, lightweight container
  • 🚀 Railway Template First — no fork required for normal usage
  • 🪣 S3-Compatible Storage — works with R2, AWS S3, Wasabi, B2, MinIO
  • 💾 Optional Local Retention — keep backups locally for CLI, VPS, or NAS usage
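
As an illustration, the retention feature boils down to keeping only the newest N backups. The sketch below is a hypothetical helper, not the project's actual code, and it assumes backup filenames embed a sortable UTC timestamp:

```python
# Hypothetical retention helper: keep the newest `max_backups` objects
# and return the rest for deletion. Assumes object keys sort
# chronologically, e.g. "backup_2024-01-31_00-00.dump".

def keys_to_delete(keys: list[str], max_backups: int) -> list[str]:
    """Return the keys that fall outside the retention window."""
    if max_backups <= 0:
        return []
    newest_first = sorted(keys, reverse=True)  # timestamped names sort lexically
    return newest_first[max_backups:]
```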

🚀 Deployment on Railway

  1. Click the Deploy on Railway button below
  2. Railway will create a new project using the latest version of this repository
  3. Add the required environment variables in the Railway dashboard
  4. (Optional) Configure a cron job for your desired backup schedule

Railway uses ephemeral storage. Local backup files are deleted by default after upload.

Deploy on Railway


🔧 Environment Variables (S3-Compatible)

DATABASE_URL=           # PostgreSQL database URL (private)
DATABASE_PUBLIC_URL=    # Public PostgreSQL URL (optional)
USE_PUBLIC_URL=false    # Set true to use DATABASE_PUBLIC_URL

DUMP_FORMAT=dump        # sql | plain | dump | custom | tar
FILENAME_PREFIX=backup  # Backup filename prefix
MAX_BACKUPS=7           # Number of backups to retain
KEEP_LOCAL_BACKUP=false # Keep backup file locally after upload (not recommended on PaaS)

R2_ENDPOINT=            # S3 endpoint URL
R2_BUCKET_NAME=         # Bucket name
R2_ACCESS_KEY=          # Access key
R2_SECRET_KEY=          # Secret key
S3_REGION=us-east-1     # Required for AWS S3 (ignored by R2/MinIO)

BACKUP_PASSWORD=        # Optional: enables 7z encryption
BACKUP_TIME=00:00       # Daily backup time (UTC, HH:MM)

Variable names use R2_* for historical reasons, but any S3-compatible provider can be used by changing the endpoint and credentials. For AWS S3 users: ensure S3_REGION matches your bucket’s region.
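
As an illustration of how DUMP_FORMAT might translate to a pg_dump invocation — the aliases below are assumptions for the sketch, not the project's documented mapping:

```python
# pg_dump's real format codes are: p (plain SQL), c (custom), t (tar).
# How this project maps its DUMP_FORMAT aliases onto them is assumed here.

PG_FORMAT = {
    "sql": "p",
    "plain": "p",
    "dump": "c",
    "custom": "c",
    "tar": "t",
}

def build_pg_dump_cmd(database_url: str, dump_format: str, out_path: str) -> list[str]:
    """Build a pg_dump argument list for the given format alias."""
    fmt = PG_FORMAT.get(dump_format, "c")  # fall back to custom format
    return ["pg_dump", "--dbname", database_url, "-F", fmt, "-f", out_path]
```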


☁️ Supported S3-Compatible Providers

This project uses the standard AWS S3 API via boto3, and works with:

  • Cloudflare R2 (recommended)
  • AWS S3
  • Wasabi
  • Backblaze B2 (S3 API)
  • MinIO (self-hosted)

Example Endpoints

| Provider | Endpoint Example |
| --- | --- |
| Cloudflare R2 | https://<accountid>.r2.cloudflarestorage.com |
| AWS S3 | https://s3.amazonaws.com |
| Wasabi | https://s3.wasabisys.com |
| Backblaze B2 | https://s3.us-west-004.backblazeb2.com |
| MinIO | http://localhost:9000 |
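
For example, a Cloudflare R2 configuration could look like this (all values are placeholders; keep your real account ID and keys in environment variables, never in version control):

```
R2_ENDPOINT=https://<accountid>.r2.cloudflarestorage.com
R2_BUCKET_NAME=my-backups
R2_ACCESS_KEY=<access-key-id>
R2_SECRET_KEY=<secret-access-key>
```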

⏰ Railway Cron Jobs

You can configure the backup schedule using Railway Cron Jobs:

  1. Open your Railway project
  2. Go to Deployments → Cron
  3. Add a cron job targeting this service

Common Cron Expressions

| Schedule | Cron Expression | Description |
| --- | --- | --- |
| Hourly | 0 * * * * | Every hour |
| Daily | 0 0 * * * | Once per day (UTC midnight) |
| Twice Daily | 0 */12 * * * | Every 12 hours |
| Weekly | 0 0 * * 0 | Every Sunday |
| Monthly | 0 0 1 * * | First day of the month |

Tips

  • All cron times are UTC
  • Use https://crontab.guru to validate expressions
  • Adjust MAX_BACKUPS to match your schedule

If you use Railway Cron Jobs, the service starts once per execution and the internal scheduler is ignored after startup.
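
The internal scheduler's core job is straightforward: work out how long to sleep until the next BACKUP_TIME occurrence in UTC. A minimal sketch of that calculation (assumed behavior, not the actual implementation):

```python
# Hypothetical helper: seconds until the next BACKUP_TIME (HH:MM, UTC).
from datetime import datetime, timedelta

def seconds_until_next_run(backup_time: str, now: datetime) -> float:
    """Compute how long to sleep before the next scheduled backup."""
    hour, minute = map(int, backup_time.split(":"))
    next_run = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if next_run <= now:
        next_run += timedelta(days=1)  # today's slot already passed; run tomorrow
    return (next_run - now).total_seconds()
```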


🖥️ Running Locally or on Other Platforms

The service can run on any platform that provides:

  • Python 3.9+
  • pg_dump (PostgreSQL client tools)
  • Environment variables
  • Long-running background processes or cron

Docker images use Python 3.12 by default.
Local execution supports Python 3.9+.

Supported Environments

  • Local machine (Linux / macOS / Windows*)
  • VPS (Netcup, Hetzner, DigitalOcean, etc.)
  • Docker containers
  • Other PaaS providers (Heroku, Fly.io, Render, etc.)

Windows is supported when pg_dump is installed and available in PATH.

Local Requirements

  • Python 3.9+
  • PostgreSQL client tools (pg_dump)
  • pip

Run Manually (Local)

pip install -r requirements.txt
python main.py

Run with Docker (Optional)

Build and run the image locally:

docker build -t postgres-to-r2-backup .
docker run --env-file .env postgres-to-r2-backup

Ensure the container is allowed to run continuously when not using an external cron scheduler.

All scheduling uses UTC by default (e.g. Malaysia UTC+8 → set BACKUP_TIME=16:00 for midnight).
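
Converting a desired local backup time to the UTC value BACKUP_TIME expects can be done with the standard library's zoneinfo module. This is a convenience sketch, not part of the tool (for zones with DST, the offset depends on the date):

```python
# Convert a local HH:MM to the UTC HH:MM expected by BACKUP_TIME.
from datetime import datetime
from zoneinfo import ZoneInfo

def backup_time_utc(local_hhmm: str, tz_name: str) -> str:
    """Return the UTC wall-clock time matching a local backup time."""
    hour, minute = map(int, local_hhmm.split(":"))
    local = datetime(2024, 1, 1, hour, minute, tzinfo=ZoneInfo(tz_name))
    return local.astimezone(ZoneInfo("UTC")).strftime("%H:%M")
```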

Run from Prebuilt Docker Image

If you downloaded a prebuilt Docker image archive (.tar or .tar.gz), you can run it without building locally:

# Extract the archive (if compressed)
tar -xzf postgres-to-r2-backup_v1.0.6.tar.gz

# Load the image into Docker
docker load -i postgres-to-r2-backup_v1.0.6.tar

# Run the container
docker run --env-file .env postgres-to-r2-backup:v1.0.6

Prebuilt images are architecture-specific (amd64 / arm64).


🧰 Using the CLI (Global Installation)

In addition to running as a Railway or Docker service, this project can be used as a standalone CLI tool installable via pip.

Install via pip

pip install pg-r2-backup

Requirements

  • Python 3.9+
  • PostgreSQL client tools (pg_dump) installed and available in PATH

Quick Start (CLI)

mkdir backups
cd backups

pg-r2-backup init      # creates .env from .env.example
pg-r2-backup doctor    # checks environment and dependencies
pg-r2-backup run       # runs a backup immediately

CLI Commands

pg-r2-backup run            # Run backup immediately
pg-r2-backup doctor         # Check environment & dependencies
pg-r2-backup config show    # Show current configuration
pg-r2-backup init           # Create .env from .env.example
pg-r2-backup schedule       # Show scheduling examples
pg-r2-backup --version

Environment Variable Resolution (CLI)

When running via the CLI, environment variables are resolved in the following order:

  1. A .env file in the current working directory (or parent directory)
  2. System environment variables

This allows different folders to maintain separate backup configurations.
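
The documented resolution order can be sketched roughly as follows: look for a .env file in the working directory or its parents first, then fall back to the system environment. This is a hypothetical reimplementation for illustration only, not the tool's actual parser:

```python
# Sketch of the documented lookup order: .env (cwd or parents) first,
# then system environment variables. Illustration only.
import os
from pathlib import Path
from typing import Optional

def find_dotenv(start: Path) -> Optional[Path]:
    """Walk from `start` upward and return the first .env file found."""
    for directory in [start, *start.parents]:
        candidate = directory / ".env"
        if candidate.is_file():
            return candidate
    return None

def resolve(key: str, start: Path) -> Optional[str]:
    """Resolve `key` from a .env file, falling back to os.environ."""
    dotenv = find_dotenv(start)
    if dotenv is not None:
        for line in dotenv.read_text().splitlines():
            if "=" in line and not line.lstrip().startswith("#"):
                k, _, v = line.partition("=")
                if k.strip() == key:
                    return v.strip()
    return os.environ.get(key)  # fall back to the system environment
```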

Local Backup Behavior (CLI)

By default, pg-r2-backup deletes the local backup file after a successful upload.

To keep a local copy (recommended for local machines, VPS, or NAS):

KEEP_LOCAL_BACKUP=true

Not recommended on PaaS platforms (Railway, Fly.io, Render, Heroku, etc.) due to ephemeral filesystems.

Scheduling Backups (CLI)

The CLI does not run a background scheduler. Use your operating system or platform scheduler instead.

Linux / macOS (cron)

0 0 * * * pg-r2-backup run

Windows (Task Scheduler)

  • Program: pg-r2-backup
  • Arguments: run
  • Start in: folder containing .env (working directory)

Railway / Docker

Use the platform's built-in scheduler (recommended).

💡 Tip
Run pg-r2-backup schedule at any time to see scheduling examples.


🔐 Security

  • Do not expose PostgreSQL directly to the public internet.
    If your database is not on a private network, use a secure tunnel instead.

  • Recommended: Cloudflare Tunnel
    When using a public database URL, it is strongly recommended to connect via a secure tunnel such as Cloudflare Tunnel rather than opening database ports.

  • Protect credentials
    Store all secrets (database URLs, R2 keys, encryption passwords) using environment variables.
    Never commit .env files to version control.

  • Encrypted backups (optional)
    Set BACKUP_PASSWORD to enable encrypted backups using 7z before uploading to S3-compatible storage.

  • Least privilege access
    Use a PostgreSQL user with read-only access where possible, and restrict R2 credentials to the required bucket only.


🛠 Development & Contributions

Fork this repository only if you plan to:

  • Modify the backup logic
  • Add features or integrations
  • Submit pull requests
  • Run locally for development

❓ FAQ

Why only DATABASE_URL?
This matches how most modern platforms expose PostgreSQL credentials.
Support for separate DB variables may be added if there is demand.

📜 License

This project is open source under the MIT License.

You are free to use, modify, and distribute it with attribution.
