
Collect and show info about various backups

Project description

Backup Reporter

This repository contains the source code for the backup-reporter tool. The tool collects backup information, uploads it to S3 buckets or a host, then gathers those backup information files, merges them into a single CSV file, and uploads the result to a Google spreadsheet. It can also save the information in JSON and Prometheus (.prom) formats.
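The collect-and-merge flow described above can be sketched with stdlib Python. The field names here are illustrative only, not the tool's actual metadata schema:

```python
import csv
import io
import json

# Illustrative backup-metadata reports, as individual reporters might upload them
reports = [
    json.dumps({"customer": "Personal", "backup_name": "db1", "size": "16 MB"}),
    json.dumps({"customer": "Acme", "backup_name": "db2", "size": "20 MB"}),
]

# The collector's job in miniature: gather the JSON files and merge them into one CSV
rows = [json.loads(report) for report in reports]
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["customer", "backup_name", "size"])
writer.writeheader()
writer.writerows(rows)
print(out.getvalue())
```

The real tool additionally uploads the resulting CSV to a Google spreadsheet, which this sketch omits.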

Backup reporter has two working modes: reporter and collector.

Installation

To install backup-reporter on a machine (in either reporter or collector mode), ensure that Python 3.8+ is installed on that machine. If so, simply run pip3 install backup-reporter and wait for the setup process to finish. After the installation completes, run backup-reporter -h for help with further steps.

Installation as user

Beware that a standard Python package installation run by a regular user won't install console scripts to the PATH, so make sure to add them manually - or run the installation as root.
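On Linux, for example, user-level installs typically place console scripts in ~/.local/bin; adding that directory to PATH manually looks like this (the exact path may vary by platform):

```shell
# pip3 install --user backup-reporter puts console scripts in ~/.local/bin,
# which is often missing from PATH for regular users; add it for this session:
export PATH="$HOME/.local/bin:$PATH"

# Verify the directory is now on PATH
echo "$PATH" | grep -q "/.local/bin" && echo "ok"
```

To make the change permanent, append the export line to your shell profile (e.g. ~/.bashrc).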

Configuration

Reporter

Reporter can be configured in two ways: with script arguments or with a configuration file. The available configuration options are listed by backup-reporter -h. To use a config file, pass --config your_config_file.yml as a script argument. All options from the CLI help work the same in the config file. For example, the following command:

  • python3 main.py --destination="{'type': 's3'}" --bucket="{'s3_path': 's3://bucket_name/in_bucket_path/metadata_file_name.txt', 'aws_access_key_id': 'key', 'aws_secret_access_key': 'key', 'aws_region': 'region'}" --docker_postgres

can be written in file:

docker_postgres: true
bucket:
    - s3_path: s3://bucket_name/in_bucket_path/metadata_file_name.txt
      aws_access_key_id: key
      aws_secret_access_key: key
      aws_endpoint_url: url
      aws_region: region
      customer: "Customer name"

More examples can be found at docs/config-examples/reporter-*.conf
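As an aside, the dict-style values in the CLI example above (e.g. the --bucket argument) are Python dict literals passed as single strings. A hypothetical sketch of parsing one with the stdlib (the tool's actual argument handling may differ):

```python
import ast

# A bucket spec as it arrives from the command line: one quoted string
raw = ("{'s3_path': 's3://bucket_name/in_bucket_path/metadata_file_name.txt', "
       "'aws_access_key_id': 'key', 'aws_secret_access_key': 'key', "
       "'aws_region': 'region'}")

# ast.literal_eval safely evaluates literal expressions without running arbitrary code
bucket = ast.literal_eval(raw)
print(bucket["s3_path"])
```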

Collector

Collector can be configured the same way as the reporter - with arguments passed to the executable or with a config file (which, though, has to be passed as an argument too). An example collector config with comments:

# Sheet owner is the email of the user to whom ownership will be transferred
sheet_owner: s@example.com

# Credentials file is a JSON key that should be issued for a service account.
# See Google Cloud's documentation on how to create a service account
google_spreadsheet_credentials_path: ~/Development/personal/backupreporter_key.json

# This is a name for a target spreadsheet
spreadsheet_name: "Backup-Reports"

# Sheet name in a spreadsheet
worksheet_name: Customers

bucket:
    - s3_path: s3://bucket/metadata/metadata.json
      aws_access_key_id: access-key
      aws_secret_access_key: secret-key
      aws_region: ru-1
      aws_endpoint_url: https://s3.ru-1.storage.selcloud.ru
      customer: Personal

Ownership transfer in Google Drive

Transferring ownership of a Google Spreadsheet is a two-step process:

  1. Initiation: The collector marks the spreadsheet with a flag indicating that ownership needs to be transferred.

  2. Acceptance: The intended new owner must manually accept ownership by:

    • Opening Google Drive
    • Searching for pendingowner:me
    • Locating the corresponding spreadsheet
    • Accepting the ownership transfer

More information: Google Docs Help: Transfer ownership of a file


Alternative: Share Spreadsheet with Service Account (No Ownership Transfer)

Instead of transferring ownership, you can create the spreadsheet manually and share it with the service account.

  • Note: Ownership remains with you — the service account is only granted access.
  • Required permissions: Editor
  • Service account email: Found in the JSON credentials file referenced by the google_spreadsheet_credentials_path configuration option.

This method allows the service account to read and update the spreadsheet content, but not to manage sharing settings or transfer ownership.

Development

Install Poetry first, then simply run poetry install in the repository root - and start developing. To run the tool, use poetry run backup-reporter. To publish a new version, bump the version in pyproject.toml and run poetry build && poetry publish.

Authors

Made in cooperation with:

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

backup_reporter-0.5.5.tar.gz (16.1 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

backup_reporter-0.5.5-py3-none-any.whl (17.2 kB)


File details

Details for the file backup_reporter-0.5.5.tar.gz.

File metadata

  • Download URL: backup_reporter-0.5.5.tar.gz
  • Size: 16.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.3.2 CPython/3.14.2 Darwin/25.2.0

File hashes

Hashes for backup_reporter-0.5.5.tar.gz

  • SHA256: 3fa0c80aa4b184072aeb01cdb820b435b7d16c32dd3c23ab3aa5cab3ad0cce36
  • MD5: 1deec8be322a9b72618241c12803964d
  • BLAKE2b-256: c845a714ffddb0190fd5000daf3b45cb512c8119cae6706ee654b8e2639f94e3


File details

Details for the file backup_reporter-0.5.5-py3-none-any.whl.

File metadata

  • Download URL: backup_reporter-0.5.5-py3-none-any.whl
  • Size: 17.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.3.2 CPython/3.14.2 Darwin/25.2.0

File hashes

Hashes for backup_reporter-0.5.5-py3-none-any.whl

  • SHA256: e0683d91c99de944416d13fdeea15fbe060c9821fe35f313e45f5c412bd683f9
  • MD5: ad52da1c520ed4522f1531156d46c921
  • BLAKE2b-256: 97925c429c919d39733c9cdb017f7469fa8a774b8ef43eba2ef6e2e5f073e20c

