Tar (and compress) files in s3

Project description

s3-tar

Create a tar/tar.gz/tar.bz2 file from many s3 files and stream back into s3.
*Note: directory structure is currently not preserved; all files are placed in the root of the archive.

Install

pip install s3-tar

Usage

Command Line

To see all command line options run:
s3-tar -h

Import

from s3_tar import S3Tar

# Init the job
job = S3Tar(
    'YOUR_BUCKET_NAME',
    'FILE_TO_SAVE_TO.tar',  # Use `tar.gz` or `tar.bz2` to enable compression
    # min_file_size='50MB',  # Min size of each tar file (units: B, KB, MB, GB, TB). If set, the output is split and a number is appended to each file name
    # target_bucket=None,  # Default: source bucket. Can be used to save the archive into a different bucket
    # cache_size=5,  # Default 5, Number of files to hold in memory to be processed
    # save_metadata=False,  # If True, each file's S3 metadata is saved as well
    # session=boto3.session.Session(),  # For custom aws session
)
# Add files, can call multiple times to add files from other directories
job.add_files('FOLDER_IN_S3/')
# Add a single file at a time
job.add_file('some/file_key.json')
# Start the tar'ing job after files have been added
job.tar()
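
To see what this does conceptually, here is a minimal sketch of the underlying idea, building a tar.gz archive entirely in memory from several byte buffers so it can be streamed rather than written to local disk. This uses only the standard library and is not the library's actual implementation; the `tar_in_memory` helper and its file-flattening behavior are illustrative assumptions:

```python
import io
import tarfile

def tar_in_memory(files, compression="gz"):
    """Build a tar archive in memory from {s3_key: bytes} pairs.

    Illustrative only: each "file" is added without touching local
    disk, so the resulting buffer could be streamed back to S3.
    Like s3-tar, paths are flattened into the archive's root.
    """
    buf = io.BytesIO()
    mode = "w:" + compression if compression else "w"
    with tarfile.open(fileobj=buf, mode=mode) as tar:
        for key, data in files.items():
            # Keep only the basename, mirroring s3-tar's flattening
            info = tarfile.TarInfo(name=key.rsplit("/", 1)[-1])
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))
    buf.seek(0)
    return buf

archive = tar_in_memory({
    "logs/a.json": b'{"a": 1}',
    "logs/b.json": b'{"b": 2}',
})
with tarfile.open(fileobj=archive, mode="r:gz") as tar:
    print(sorted(tar.getnames()))  # ['a.json', 'b.json']
```

The real library additionally reads the source objects from S3 in batches (the `cache_size` option) and multipart-uploads the result, but the in-memory tar construction above is the core trick that avoids local storage.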

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

s3-tar-0.1.4.tar.gz (6.1 kB)
