flush
Archive PostgreSQL tables to S3.
CLI tool that flushes (exports and deletes) all rows of a PostgreSQL table to a CSV file in a specified S3 bucket. Useful for archiving data and saving money in small infrastructure environments.
Usage
Ensure your AWS credentials are configured the same way boto3 expects (see https://boto3.amazonaws.com/v1/documentation/api/latest/guide/quickstart.html#configuration).
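For example, boto3 can read credentials from the standard shared credentials file at ~/.aws/credentials; a minimal file looks like the sketch below (the key values are placeholders, not real credentials):

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```

Environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY) work as well, per the boto3 configuration guide linked above.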
Then:
pip install flush
flush postgres://localhost:5432/mydatabase tablename mybucket --truncate
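Conceptually, a flush is: select every row, serialize to CSV, upload the CSV to S3, then delete (or truncate) the table. A minimal sketch of the serialization step, using only the standard library — the function name and column names here are illustrative assumptions, not the tool's actual internals:

```python
import csv
import io

def rows_to_csv(header, rows):
    """Serialize a header and an iterable of row tuples to CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)   # column names first
    writer.writerows(rows)    # then one line per table row
    return buf.getvalue()

# Example: two rows from a hypothetical "events" table.
print(rows_to_csv(["id", "payload"], [(1, "signup"), (2, "login")]))
```

In the real tool, the rows would come from a SELECT against the given connection URL, the CSV text would be uploaded with boto3 (e.g. put_object), and the rows would then be deleted — or the table truncated when --truncate is passed.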
File details
Details for the file flush-1.0.tar.gz.
File metadata
- Download URL: flush-1.0.tar.gz
- Upload date:
- Size: 2.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 202691d249b4673cde0d8708ef8f1b77f6224f0563cda92cb64f0a5f5fd1dbc4
MD5 | 94ff76849d502f71be2e33b9965f40f6
BLAKE2b-256 | 64c775fb9b004341b734fee590e7120490f0910fd2d8c068620fb423bdee0843