
Collect and show info about various backups

Project description

Backup Reporter

This repository contains the source code for the backup-reporter tool. The tool collects backup information and uploads it to S3 buckets or a host; it can then gather the individual backup-information files, merge them into a single CSV file, and upload the result to a Google spreadsheet. It can also save the information in JSON and Prometheus (prom) formats.

Backup reporter has two working modes: reporter and collector.

Installation

To install backup-reporter on a machine (in either reporter or collector mode), make sure Python 3.8+ is installed there. Then simply run pip3 install backup-reporter and wait for the setup process to finish. After installation completes, run backup-reporter -h for help with the next steps.

Installation as user

Beware that a standard Python package installation run by a regular user will not place console scripts on the PATH, so make sure to add them to the PATH manually, or run the installation as root.

Configuration

Reporter

The reporter can be configured in two ways: with script arguments or with a configuration file. The available options are listed by backup-reporter -h. To use a config file, pass --config your_config_file.yml as a script argument. All options from the CLI help carry over to the config file with the same names. For example, the following command:

  • python3 main.py --destination="{'type': 's3'}" --bucket="{'s3_path': 's3://bucket_name/in_bucket_path/metadata_file_name.txt', 'aws_access_key_id': 'key', 'aws_secret_access_key': 'key', 'aws_region': 'region'}" --docker_postgres

can be written in file:

docker_postgres: true
bucket:
    - s3_path: s3://bucket_name/in_bucket_path/metadata_file_name.txt
      aws_access_key_id: key
      aws_secret_access_key: key
      aws_endpoint_url: url
      aws_region: region
      customer: "Customer name"

More examples can be found at docs/config-examples/reporter-*.conf
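For reference, the dict-style values passed on the command line above are written as Python dict literals (as the quoting in the example suggests), so their shape can be sanity-checked locally with the standard library. A small sketch, using the values from the example above:

```python
import ast

# The --bucket value from the command-line example above, as one string:
bucket_arg = ("{'s3_path': 's3://bucket_name/in_bucket_path/metadata_file_name.txt', "
              "'aws_access_key_id': 'key', 'aws_secret_access_key': 'key', "
              "'aws_region': 'region'}")

# ast.literal_eval safely parses the literal without running code (unlike eval)
bucket = ast.literal_eval(bucket_arg)
print(bucket["s3_path"])  # s3://bucket_name/in_bucket_path/metadata_file_name.txt
```

Each key in the parsed dict corresponds to one field of a bucket entry in the YAML config above.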

Collector

The collector is configured the same way as the reporter: with arguments passed to the executable or with a config file (which itself has to be passed as an argument). An example collector config with comments:

# Sheet owner is the email of the user to whom ownership will be transferred
sheet_owner: s@example.com

# Credentials file is a JSON key issued to a service account.
# See Google's documentation on how to create a service account and download its key.
google_spreadsheet_credentials_path: ~/Development/personal/backupreporter_key.json

# This is a name for a target spreadsheet
spreadsheet_name: "Backup-Reports"

# Sheet name in a spreadsheet
worksheet_name: Customers

bucket:
    - s3_path: s3://bucket/metadata/metadata.json
      aws_access_key_id: access-key
      aws_secret_access_key: secret-key
      aws_region: ru-1
      aws_endpoint_url: https://s3.ru-1.storage.selcloud.ru
      customer: Personal

Ownership transfer in Google Drive

Spreadsheet Ownership Transfer

Transferring ownership of a Google Spreadsheet is a two-step process:

  1. Initiation: The collector marks the spreadsheet with a flag indicating that ownership needs to be transferred.

  2. Acceptance: The intended new owner must manually accept ownership by:

    • Opening Google Drive
    • Searching for pendingowner:me
    • Locating the corresponding spreadsheet
    • Accepting the ownership transfer

More information: Google Docs Help: Transfer ownership of a file


Alternative: Share Spreadsheet with Service Account (No Ownership Transfer)

Instead of transferring ownership, you can create the spreadsheet manually and share it with the service account.

  • Note: Ownership remains with you — the service account is only granted access.
  • Required permissions: Editor
  • Service account email: Found in the JSON credentials file referenced by the google_spreadsheet_credentials_path configuration option.

This method allows the service account to read and update the spreadsheet content, but not to manage sharing settings or transfer ownership.
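To find the service account's email for sharing, read the client_email field from the credentials JSON (client_email is the standard field name in Google service-account key files). A minimal sketch; the file path is whatever your google_spreadsheet_credentials_path option points at:

```python
import json

def service_account_email(credentials_path: str) -> str:
    """Read the service account's email address from its JSON key file."""
    with open(credentials_path) as f:
        return json.load(f)["client_email"]

# Example (path is a placeholder):
# print(service_account_email("backupreporter_key.json"))
```

Share the spreadsheet with the printed address, granting Editor permission.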

Development

Install poetry first, then simply run poetry install in the repository root and start developing. To run the tool, use poetry run. To publish a new version, bump the version in pyproject.toml and run poetry build && poetry publish.

Authors

Made in cooperation with:

Download files

Download the file for your platform.

Source Distribution

backup_reporter-0.5.3.tar.gz (16.4 kB)

Uploaded Source

Built Distribution


backup_reporter-0.5.3-py3-none-any.whl (17.1 kB)

Uploaded Python 3

File details

Details for the file backup_reporter-0.5.3.tar.gz.

File metadata

  • Download URL: backup_reporter-0.5.3.tar.gz
  • Upload date:
  • Size: 16.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.9.4 Darwin/25.0.0

File hashes

Hashes for backup_reporter-0.5.3.tar.gz
Algorithm Hash digest
SHA256 d51561ced071a06c44f2052ac7ce167dce7adb9bd2a639d59e094ff850432a68
MD5 8969e4e42ce2231316d04a8addcd0156
BLAKE2b-256 b2f09cd68a9912ea8a37ae7505370d77918cc8ee3c37172b813ff044c5e40593

See more details on using hashes here.

File details

Details for the file backup_reporter-0.5.3-py3-none-any.whl.

File metadata

  • Download URL: backup_reporter-0.5.3-py3-none-any.whl
  • Upload date:
  • Size: 17.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.9.4 Darwin/25.0.0

File hashes

Hashes for backup_reporter-0.5.3-py3-none-any.whl
Algorithm Hash digest
SHA256 9b6ca7e0d19bafa1f7369d21f18b24d12f5aec04216cbc2e402c70e710b75276
MD5 fa0e992a38351c53b22882256f203802
BLAKE2b-256 440eeee026cce255f29c4a9e01ff12c557b6b3338a229657849a45e7bbc27002

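The published digests can be verified locally after downloading a file. A minimal sketch using Python's hashlib; the filename is a placeholder for wherever you saved the download:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest published above, e.g. for the sdist:
expected = "d51561ced071a06c44f2052ac7ce167dce7adb9bd2a639d59e094ff850432a68"
# print(sha256_of("backup_reporter-0.5.3.tar.gz") == expected)
```

Reading in chunks keeps memory use constant regardless of file size.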
