Utilities for AWS S3

Project description

s3-wrapper

s3-wrapper is a thin wrapper around the S3-related functionality of AWS's boto3 package.

Quick Start

First, install the library:

pip install s3-wrapper

Next, set up credentials (e.g. in ~/.aws/credentials):

[default]
aws_access_key_id = YOUR_KEY
aws_secret_access_key = YOUR_SECRET

You should set the following environment variables:

  1. AWS_PROFILE_NAME: AWS profile name
  2. S3_BUCKET_NAME (Optional): Default bucket name

If S3_BUCKET_NAME is not set in the environment, you must set the default bucket before using any utilities:

s3 = S3Utils()
s3.set_default_bucket('test_bucket')

You can use python-dotenv for loading environment variables.
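Whether the variables come from python-dotenv or the shell, the lookup reduces to a plain os.getenv call. A minimal sketch of the fallback logic (resolve_bucket is a hypothetical helper, not part of the library):

```python
import os

def resolve_bucket(default='test_bucket'):
    # Prefer S3_BUCKET_NAME from the environment; otherwise fall back
    # to an explicit default bucket name.
    return os.getenv('S3_BUCKET_NAME', default)

os.environ['S3_BUCKET_NAME'] = 'env_bucket'
print(resolve_bucket())  # env_bucket
```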

Examples

set_default_bucket

Sets the default bucket for s3-related operations. Usage:

s3.set_default_bucket('bucket_name')

move_object

Assigns a new key to an object inside a bucket. The process involves creating a new object, copying the old object to it, and deleting the old object. Usage:

s3.move_object('directory/subdirectory1/file.json', 'directory/subdirectory2/file.json')

To perform this operation on a bucket other than the default, use:

s3.move_object('directory/subdirectory1/file.json', 'directory/subdirectory2/file.json', 'bucket_name')
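Because S3 has no real directories, a move is just a rewrite of the key string followed by the copy-and-delete above. The rewrite itself can be sketched as plain string manipulation (moved_key is a hypothetical helper, not part of the library):

```python
def moved_key(key, src_dir, dst_dir):
    # Hypothetical helper: rewrite the "directory" portion of an object key.
    assert key.startswith(src_dir + '/')
    return dst_dir + key[len(src_dir):]

print(moved_key('directory/subdirectory1/file.json',
                'directory/subdirectory1',
                'directory/subdirectory2'))
# directory/subdirectory2/file.json
```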

copy_object

Copies an object to a new key within a bucket, leaving the original object in place. Usage:

s3.copy_object('new_object_key', 'src_object_key')

To perform this operation on a bucket other than the default, use:

s3.copy_object('new_object_key', 'src_object_key', 'bucket_name')

create_object

Creates a new object inside a bucket and sets its content/body. Usage:

import json
data = {
  'message': 'Hello world',
  'created_at': '2020-06-03 05:36:00'
}
formatted_data = json.dumps(data)
s3.create_object('key', formatted_data)

To perform this operation on a bucket other than the default, use:

s3.create_object('key', formatted_data, 'bucket_name')
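Since create_object stores the body as a string, a JSON payload round-trips cleanly: serialize with json.dumps before writing, and parse with json.loads when you read the object back. A quick local check of that round trip:

```python
import json

data = {
    'message': 'Hello world',
    'created_at': '2020-06-03 05:36:00'
}
# The body stored in S3 is a plain JSON string; parse it back on read.
body = json.dumps(data)
restored = json.loads(body)
print(restored == data)  # True
```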

upload_file

Uploads a file on disk storage as an object on S3. Usage:

import os

file_path = os.path.join('/tmp', 'subdirectory', 'response.json')
s3.upload_file('file_key', file_path)

To perform this operation on a bucket other than the default, use:

import os

file_path = os.path.join('/tmp', 'subdirectory', 'response.json')
s3.upload_file('file_key', file_path, 'bucket_name')

delete_object

Deletes an object from a bucket on S3. Usage:

s3.delete_object('key')

To perform this operation on a bucket other than the default, use:

s3.delete_object('key', 'bucket_name')

delete_objects

Deletes objects matching the supplied keys from a bucket. Usage:

s3.delete_objects(['key1', 'key2', 'key3'])

To perform this operation on a bucket other than the default, use:

s3.delete_objects(['key1', 'key2', 'key3'], 'bucket_name')
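Since delete_objects takes a list of keys, batch deletions are usually driven by building that list programmatically. A sketch with hypothetical key names:

```python
# Hypothetical keys: delete a batch of dated report files under one prefix.
keys = ['reports/2020-06-%02d.json' % day for day in range(1, 4)]
print(keys)
# ['reports/2020-06-01.json', 'reports/2020-06-02.json', 'reports/2020-06-03.json']
```

The resulting list is then passed directly as the first argument to delete_objects.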

find_files_with_prefix

Finds files/objects matching the given prefix. This is helpful if you want to get objects in a specific (hypothetical) directory. Usage:

s3.find_files_with_prefix('/directory/subdirectory/prefix')

To perform this operation on a bucket other than the default, use:

s3.find_files_with_prefix('/directory/subdirectory/prefix', 'bucket_name')
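S3 prefix matching is plain string matching on keys, with no directory semantics. The selection behaviour can be illustrated locally with hypothetical keys:

```python
# Hypothetical object keys; S3 treats '/' as an ordinary character.
keys = ['logs/2020/06/a.json', 'logs/2020/07/b.json', 'img/x.png']
prefix = 'logs/2020/06'
matches = [k for k in keys if k.startswith(prefix)]
print(matches)  # ['logs/2020/06/a.json']
```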

file_exists

Returns True if a file exists in the given bucket. Usage:

exists = s3.file_exists('object_key')

To perform this operation on a bucket other than the default, use:

exists = s3.file_exists('object_key', 'bucket_name')

generate_presigned_url

Generates a presigned URL for an object in a bucket; the URL expires after the given number of seconds. Usage:

url = s3.generate_presigned_url('object_key', 3600)

To perform this operation on a bucket other than the default, use:

url = s3.generate_presigned_url('object_key', 3600, 'bucket_name')
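The second argument is a lifetime in seconds, so 3600 gives a URL valid for one hour from issue time. The expiry arithmetic (with a hypothetical issue time) is just:

```python
from datetime import datetime, timedelta

expires_in = 3600  # seconds
issued_at = datetime(2020, 6, 3, 5, 36, 0)  # hypothetical issue time
print(issued_at + timedelta(seconds=expires_in))  # 2020-06-03 06:36:00
```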

download_file

Downloads a file from a bucket to a given path on disk. Usage:

s3.download_file('object_key', '/home/directory/path.json')

To perform this operation on a bucket other than the default, use:

s3.download_file('object_key', '/home/directory/path.json', 'bucket_name')
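Whether download_file creates missing directories is not stated here, so a safe pattern is to create the destination directory yourself before downloading (the path below is hypothetical):

```python
import os
import tempfile

# Hypothetical destination path; ensure its parent directory exists
# before handing the path to download_file.
dest = os.path.join(tempfile.mkdtemp(), 'directory', 'path.json')
os.makedirs(os.path.dirname(dest), exist_ok=True)
print(os.path.isdir(os.path.dirname(dest)))  # True
```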


Download files

Download the file for your platform.

Source Distribution

s3-wrapper-0.0.6.tar.gz (4.0 kB)


Built Distribution

s3_wrapper-0.0.6-py3-none-any.whl (16.8 kB)


File details

Details for the file s3-wrapper-0.0.6.tar.gz.

File metadata

  • Download URL: s3-wrapper-0.0.6.tar.gz
  • Upload date:
  • Size: 4.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.1.0 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.7.9

File hashes

Hashes for s3-wrapper-0.0.6.tar.gz:

  • SHA256: 74ea9dc3202bbf5bfac042e4363ed0d0b36ef34d50a3c3a0dd49aabeb4d3ece5
  • MD5: 2db4ec77190faad731a508fdda0c49c7
  • BLAKE2b-256: b0fa9cee07989ef810bc8bbf3e062d8d425c28751a9689622b11660a0c611c05

File details

Details for the file s3_wrapper-0.0.6-py3-none-any.whl.

File metadata

  • Download URL: s3_wrapper-0.0.6-py3-none-any.whl
  • Upload date:
  • Size: 16.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.1.0 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.7.9

File hashes

Hashes for s3_wrapper-0.0.6-py3-none-any.whl:

  • SHA256: acc2ca0d28233ef9f876d5bf9f78766cdbfa5bf4796b6e3329bcd5611e7a23eb
  • MD5: 78f64a7e0e9d1fccb9b058f1384f0420
  • BLAKE2b-256: c6d039699c972b85d03b240dbfedda0d4afb0c6c6ce8678de83f037d9f59612f
