Line stream s3 files into ~uniform lumps in S3
Project description
Reads files from S3 under a given key prefix and streams their lines into a set of output files in S3, optionally gzip-compressed, with each output file limited by its (pre-compressed) size. The string "{}" in the output key is substituted with the zero-based index of each output file.
s3lncoll: Line stream s3 files into ~uniform lumps in S3

Usage:
    s3lncoll {arguments} {options}

Arguments:
    from [text]   S3 URL prefix to clump
    to   [text]   S3 URL for target clump ('{}' will be the count)

Options:
    -h, --help             Show this help message and exit
    -H, --HELP             Help for all sub-commands
    -D, --debug            Enable debug logging
    -d, --delete           Delete source files/keys
    -j, --json             Validate each line as JSON
    -q, --quiet            Be quiet, be vewy vewy quiet
    -V, --version          Report installed version
    -z, --compress         Compress (gzip) the target(s)
    -b, --blocksize [int]  Maximum size of pre-compressed output files in bytes (default: 1048576)
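The clumping behaviour can be pictured with a short boto3 sketch. This is an illustrative approximation, not the package's actual implementation; the bucket names, prefix, key template, and the `flush` helper below are assumptions made for the example.

```python
# Minimal sketch: stream lines from S3 objects under a prefix and write them
# out as ~uniform chunks, optionally gzip-compressed. Names are hypothetical.
import gzip
import boto3

s3 = boto3.client("s3")

SRC_BUCKET = "my-source-bucket"           # assumption: example source bucket
SRC_PREFIX = "logs/2024/"                 # assumption: example key prefix
DST_BUCKET = "my-target-bucket"           # assumption: example target bucket
DST_TEMPLATE = "clumps/part-{}.jsonl.gz"  # '{}' becomes the zero-based chunk index
BLOCKSIZE = 1048576                       # max pre-compressed bytes per output file
COMPRESS = True                           # gzip the targets, like the -z option

def flush(chunk_lines, index):
    """Write one accumulated chunk to S3, gzip-compressing if requested."""
    body = b"".join(chunk_lines)
    if COMPRESS:
        body = gzip.compress(body)
    s3.put_object(Bucket=DST_BUCKET,
                  Key=DST_TEMPLATE.replace("{}", str(index)),
                  Body=body)

chunk, size, index = [], 0, 0
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=SRC_BUCKET, Prefix=SRC_PREFIX):
    for obj in page.get("Contents", []):
        body = s3.get_object(Bucket=SRC_BUCKET, Key=obj["Key"])["Body"]
        for line in body.iter_lines():    # stream lines; never load a whole file
            line += b"\n"
            # start a new output file once the pre-compressed limit would be exceeded
            if size and size + len(line) > BLOCKSIZE:
                flush(chunk, index)
                chunk, size, index = [], 0, index + 1
            chunk.append(line)
            size += len(line)
if chunk:
    flush(chunk, index)
```

An equivalent invocation of the installed tool would look roughly like `s3lncoll -z -b 1048576 s3://my-source-bucket/logs/2024/ s3://my-target-bucket/clumps/part-{}.jsonl.gz`, again with hypothetical bucket and key names.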
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
s3lncoll-0.1.post11.tar.gz (23.9 kB)
File details
Details for the file s3lncoll-0.1.post11.tar.gz.
File metadata
- Download URL: s3lncoll-0.1.post11.tar.gz
- Upload date:
- Size: 23.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
File hashes
Algorithm | Hash digest
---|---
SHA256 | 89593d71669d28e4284bacb1b06359bb7b7b2561f86a97bc5e2896aa7227c4c5
MD5 | 7371359b269602969ad49350fd09281c
BLAKE2b-256 | 7422dfdeeac756fedb125256024800440678c1e925d3c6c044628c1aeb49ee46