
Tar (and compress) files in s3

Project description

s3-tar


Create a tar/tar.gz/tar.bz2 file from many S3 files and stream it back into S3.
*Note: directory structure is currently not preserved; all files end up in the root of the archive.

Install

pip install s3-tar

Usage

Command Line

$ s3-tar -h
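
The -h flag prints the full list of options. The invocation below is only a rough sketch that mirrors the Python example further down; the flag names shown (--bucket, --folder, --filename) are assumptions and should be checked against the output of s3-tar -h for your installed version:

# Flag names are assumptions -- confirm with `s3-tar -h`
$ s3-tar --bucket YOUR_BUCKET_NAME \
         --folder PATH_TO_FILES_TO_CONCAT \
         --filename FILE_TO_SAVE_TO.tar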

Import

from s3_tar import S3Tar

bucket = 'YOUR_BUCKET_NAME'
path_to_concat = 'PATH_TO_FILES_TO_CONCAT'  # S3 prefix of the files to add to the archive
tared_file = 'FILE_TO_SAVE_TO.tar'  # use `tar.gz` or `tar.bz2` to enable compression
# Setting this to a size will always add a part number at the end of the file name
min_file_size = '50MB'  # ex: FILE_TO_SAVE_TO-1.tar, FILE_TO_SAVE_TO-2.tar, ...
# Setting this to None will create a single tar with all the files
# min_file_size = None

# Init the job
job = S3Tar(bucket, tared_file,
            min_file_size=min_file_size,
            target_bucket=None,  # Can be used to save the archive into a different bucket
            # session=boto3.session.Session(),  # For custom aws session
)
# Add a directory of files; can be called multiple times to add files from other directories
job.add_files(path_to_concat)
# Add a single file at a time
job.add_file('some/file_key.json')
# Start the tar'ing job once all files have been added
job.tar()
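
To confirm the job produced the expected archive, the snippet below is a minimal sketch that downloads the finished tar and lists its members. It assumes boto3 is installed, no target_bucket was set (so the archive lands in the source bucket), and the key is the FILE_TO_SAVE_TO.tar name used above:

import tarfile

import boto3

# Assumption: the archive was written to the source bucket under the same key name
s3 = boto3.client('s3')
s3.download_file('YOUR_BUCKET_NAME', 'FILE_TO_SAVE_TO.tar', 'local_copy.tar')

# All members sit in the root of the archive (directory structure is not preserved)
with tarfile.open('local_copy.tar') as archive:
    print(archive.getnames())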



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

s3-tar-0.1.2.tar.gz (4.9 kB)
