
This repository contains code for input and output operations on S3 buckets.

Project description

AWS S3 Operations

This repository contains a package with various I/O functions for S3 buckets.

To install

pip install aws_s3_ops

To uninstall

pip uninstall aws_s3_ops
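
The README does not state how AWS credentials are picked up. Assuming the package uses the standard AWS credential chain (an assumption, not documented here), credentials can be supplied through environment variables before creating S3Operations:

import os

# Assumption: the package reads the standard AWS environment variables.
os.environ["AWS_ACCESS_KEY_ID"] = "your-access-key-id"
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-secret-access-key"
os.environ["AWS_DEFAULT_REGION"] = "your-region"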

Usage

The available functions are:

save_pickle

from aws_s3_ops.aws_s3_ops import S3Operations

obj_s3 = S3Operations()
bucket = "your-bucket-name-here"
key = "your/folder/path/inside/bucket/pickle.pkl"
obj = RandomClassObject()  # placeholder: any picklable object

obj_s3.save_pickle(bucket=bucket, key=key, obj=obj)  # Returns boolean

load_pickle

from aws_s3_ops.aws_s3_ops import S3Operations

obj_s3 = S3Operations()
bucket = "your-bucket-name-here"
key = "your/folder/path/inside/bucket/pickle.pkl"

obj = obj_s3.load_pickle(bucket=bucket, key=key)  # Loads and unpickles the object from S3
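
A minimal round trip built from the two functions above, assuming save_pickle accepts any picklable object and load_pickle returns it unchanged:

from aws_s3_ops.aws_s3_ops import S3Operations

obj_s3 = S3Operations()
bucket = "your-bucket-name-here"
key = "your/folder/path/inside/bucket/pickle.pkl"

data = {"col": [1, 2, 3]}  # any picklable object
if obj_s3.save_pickle(bucket=bucket, key=key, obj=data):
    restored = obj_s3.load_pickle(bucket=bucket, key=key)
    assert restored == data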

save_csv

from aws_s3_ops.aws_s3_ops import S3Operations
import pandas as pd

obj_s3 = S3Operations()
bucket = "your-bucket-name-here"
key = "your/folder/path/inside/bucket/file.csv"
df = pd.DataFrame([['a','b'],['c', 'd']], columns=['col1', 'col2'])

obj_s3.save_csv(bucket=bucket, key=key, df=df, index=False)

key = "your/folder/path/inside/bucket/file.csv.gzip"
obj_s3.save_csv(bucket=bucket, key=key, df=df, compression="gzip", index=False)
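
To read a saved CSV back, get_file_buffer (documented below) can be combined with pandas. This is a sketch that assumes the returned buffer is accepted by pd.read_csv:

key = "your/folder/path/inside/bucket/file.csv"
df_back = pd.read_csv(obj_s3.get_file_buffer(bucket=bucket, key=key))

# For the gzip-compressed key, pass the compression explicitly.
key = "your/folder/path/inside/bucket/file.csv.gzip"
df_back = pd.read_csv(obj_s3.get_file_buffer(bucket=bucket, key=key), compression="gzip")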

save_json

from aws_s3_ops.aws_s3_ops import S3Operations
import pandas as pd

obj_s3 = S3Operations()
bucket = "your-bucket-name-here"
key = "your/folder/path/inside/bucket/file.json"
df = pd.DataFrame([['a','b'],['c', 'd']], columns=['col1', 'col2'])

obj_s3.save_json(bucket=bucket, key=key, df=df)

key = "your/folder/path/inside/bucket/file.json.gzip"
obj_s3.save_json(bucket=bucket, key=key, df=df, compression="gzip")
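
The saved JSON can be read back the same way, assuming get_file_buffer (documented below) returns a buffer that pandas accepts:

key = "your/folder/path/inside/bucket/file.json"
df_back = pd.read_json(obj_s3.get_file_buffer(bucket=bucket, key=key))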

download_file

from aws_s3_ops.aws_s3_ops import S3Operations

obj_s3 = S3Operations()
bucket = "your-bucket-name-here"
key = "your/folder/path/inside/bucket/file_to_download.random"
local_path = "path/for/file/within/local/file_downloaded.random"

obj_s3.download_file(bucket=bucket, key=key, local_path=local_path)

upload_file

from aws_s3_ops.aws_s3_ops import S3Operations

obj_s3 = S3Operations()
bucket = "your-bucket-name-here"
key = "your/folder/path/inside/bucket/file_uploaded.random"
local_path = "path/for/file/within/local/file_to_upload.random"

obj_s3.upload_file(bucket=bucket, key=key, local_path=local_path)

key_exists

from aws_s3_ops.aws_s3_ops import S3Operations

obj_s3 = S3Operations()
bucket = "your-bucket-name-here"
key = "your/folder/path/inside/bucket/file_exists.random"

file_existence_boolean = obj_s3.key_exists(bucket=bucket, key=key)
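
The check is useful as a guard before a read, for example (a sketch using the functions documented here):

key_pkl = "your/folder/path/inside/bucket/pickle.pkl"
if obj_s3.key_exists(bucket=bucket, key=key_pkl):
    obj = obj_s3.load_pickle(bucket=bucket, key=key_pkl)
else:
    print("Key not found: s3://{}/{}".format(bucket, key_pkl))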

delete_data

from aws_s3_ops.aws_s3_ops import S3Operations

obj_s3 = S3Operations()
bucket = "your-bucket-name-here"
key = "your/folder/path/inside/bucket/file_to_delete.random"

obj_s3.delete_data(bucket=bucket, key=key)

key = "your/folder/path/inside/bucket/folder_to_delete"

obj_s3.delete_data(bucket=bucket, key=key)
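
Deletion can be verified with key_exists, assuming delete_data removes the key immediately:

key_file = "your/folder/path/inside/bucket/file_to_delete.random"
obj_s3.delete_data(bucket=bucket, key=key_file)
assert not obj_s3.key_exists(bucket=bucket, key=key_file)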

get_prefix_object

from aws_s3_ops.aws_s3_ops import S3Operations

obj_s3 = S3Operations()
bucket = "your-bucket-name-here"
key = "your/folder/path/inside/bucket/"

# List of all folders and files within the folder
keys = obj_s3.get_prefix_object(bucket=bucket, key=key)

# List of all folders and files within the folder with the given extension
keys = obj_s3.get_prefix_object(bucket=bucket, key=key, file_extension="txt")
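
Assuming the return value is a list of key strings (the README does not state the exact type), the keys can be passed back into the other functions, for example to download every text file under the prefix:

import os

os.makedirs("local_downloads", exist_ok=True)
for txt_key in obj_s3.get_prefix_object(bucket=bucket, key=key, file_extension="txt"):
    local_path = os.path.join("local_downloads", os.path.basename(txt_key))
    obj_s3.download_file(bucket=bucket, key=txt_key, local_path=local_path)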

get_file_buffer

from aws_s3_ops.aws_s3_ops import S3Operations
import pandas as pd

obj_s3 = S3Operations()
bucket = "your-bucket-name-here"
key = "your/folder/path/inside/bucket/file.txt"

# The returned buffer can then be read with pandas or plain Python file operations
buf = obj_s3.get_file_buffer(bucket=bucket, key=key)

df = pd.read_csv(buf)
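
The same buffer can also be consumed with plain Python file operations, assuming it behaves like a file object opened for reading:

buf = obj_s3.get_file_buffer(bucket=bucket, key=key)
contents = buf.read()
print(contents)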



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aws_s3_ops-1.0.0.tar.gz (6.6 kB)


Built Distribution

aws_s3_ops-1.0.0-py2.py3-none-any.whl (6.0 kB)


File details

Details for the file aws_s3_ops-1.0.0.tar.gz.

File metadata

  • Download URL: aws_s3_ops-1.0.0.tar.gz
  • Upload date:
  • Size: 6.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/42.0.2 requests-toolbelt/0.9.1 tqdm/4.40.1 CPython/3.7.1

File hashes

Hashes for aws_s3_ops-1.0.0.tar.gz

  • SHA256: c884d44a57cc000a6858e4e8ccf1e8e017033371d41d2474ecaa61e763788d2d
  • MD5: c3d71d57268c5723e459f6682a7822c4
  • BLAKE2b-256: 47cbbc31283a3bb7354aceaac3076b6de225e592319ea4d211557b259f12b824

These hashes can be used to verify the integrity of a downloaded copy of the file.
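
For example, a downloaded copy of the sdist can be checked against the SHA256 digest above with Python's hashlib (the local file path is a placeholder):

import hashlib

expected = "c884d44a57cc000a6858e4e8ccf1e8e017033371d41d2474ecaa61e763788d2d"
with open("aws_s3_ops-1.0.0.tar.gz", "rb") as f:
    assert hashlib.sha256(f.read()).hexdigest() == expected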

File details

Details for the file aws_s3_ops-1.0.0-py2.py3-none-any.whl.

File metadata

  • Download URL: aws_s3_ops-1.0.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 6.0 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/42.0.2 requests-toolbelt/0.9.1 tqdm/4.40.1 CPython/3.7.1

File hashes

Hashes for aws_s3_ops-1.0.0-py2.py3-none-any.whl

  • SHA256: d5bc08541cbc6d51082a4ca96775131bae9ba3983fbd87c574785cd0bca32ee2
  • MD5: 1e206c18d77df2ff3a936ec4c87f58d0
  • BLAKE2b-256: ebd0f8c92ab02987df17dbe573046b8756249e940188dd4d50e3bb5e32a116d1

