Archive PostgreSQL tables to S3
Project description
flush
Archive PostgreSQL tables to S3.
A CLI tool that flushes (exports and then deletes) all rows of a PostgreSQL table into a CSV file in a specified S3 bucket. Useful for archiving data and saving money in small infrastructure environments.
Usage
Ensure your AWS credentials are configured correctly (the same way you would for boto3: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/quickstart.html#configuration).
Then:
```
pip install -e .
flush postgres://localhost:5432/mydatabase tablename mybucket --truncate
```
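Under the hood, a flush like this amounts to three steps: stream the table out as CSV, upload that CSV to the bucket, then remove the rows. A minimal sketch of that flow, assuming psycopg2 and boto3 (the function names, bucket, and key layout here are illustrative, not the tool's actual internals):

```python
import io


def copy_csv_sql(table: str) -> str:
    # COPY ... TO STDOUT streams the entire table as CSV, header included.
    return f"COPY {table} TO STDOUT WITH (FORMAT csv, HEADER true)"


def flush_table(conn, s3_client, table: str, bucket: str, truncate: bool = False) -> str:
    """Export `table` to a CSV object in `bucket`, then delete its rows.

    `conn` is an open psycopg2 connection; `s3_client` is a boto3 S3 client.
    """
    buf = io.StringIO()
    with conn.cursor() as cur:
        # psycopg2's copy_expert runs the COPY and writes the CSV into buf.
        cur.copy_expert(copy_csv_sql(table), buf)
        key = f"{table}.csv"
        s3_client.put_object(
            Bucket=bucket, Key=key, Body=buf.getvalue().encode("utf-8")
        )
        # TRUNCATE is faster for large tables; DELETE removes rows one by one
        # and fires row-level triggers. Both happen in the same transaction as
        # the export, so a failed upload leaves the data intact.
        cur.execute(f"TRUNCATE {table}" if truncate else f"DELETE FROM {table}")
    conn.commit()
    return key
```

Because the export and the delete share one transaction, the rows are only removed after the CSV has been handed to S3; an exception anywhere rolls the whole thing back.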
Project details
Download files
Source Distribution: flush-0.3.tar.gz (2.7 kB)
File details
Details for the file flush-0.3.tar.gz.
File metadata
- Download URL: flush-0.3.tar.gz
- Size: 2.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0fcd29f89b86ad4bf257ddfa71d1219f4fe5b22df9c3d05fd439f094b851e59b
MD5 | 415bdf089b9f5ff5cabaa9b4c08f9173
BLAKE2b-256 | 7c448429023cd10da4eaaf64b50a042ad09d80fd99df4036af556a4fa2d7960c