A package for reading/writing certain file types to the AWS S3 service using boto3.

psusannx_s3

A package that allows a connection to the Amazon Simple Storage Service (S3) to be made, provided a valid set of AWS credentials. The connection is set up through a Python class which, once instantiated, provides easy read/write functions for certain file extensions (.csv, .json, .pkl, .h5). It uses the boto3 package to perform these actions. The docs for using boto3 can be found here.
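For context, writing a Python object to S3 as a .pkl file amounts to serializing it with the standard-library pickle module and uploading the resulting bytes via boto3. A minimal local sketch of just the serialize/deserialize step (no AWS calls, and not the package's actual internals):

```python
import pickle

# Serialize a Python object to bytes, as would be uploaded to S3
test_list = [1, 2, 3]
payload = pickle.dumps(test_list)

# Deserialize the bytes back into an object, as after a download
restored = pickle.loads(payload)
print(restored)  # [1, 2, 3]
```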

This package was created to be used as a subpackage in a wider project - PSUSANNX.

Package classes

  • PsusannxS3

Installation

pip install psusannx-s3

Usage

# Import the function from the package
from psusannx_s3 import PsusannxS3

# Get some info about the function
help(PsusannxS3) 

Set up a connection to the S3 service by instantiating the class with valid credentials, and store that connection in a variable called s3.

# Store the AWS credentials as variables
AWS_ACCESS_KEY = "<aws-access-key>"
AWS_SECRET_KEY = "<aws-secret-key>"

# Set up the connection to AWS s3 through the PsusannxS3 class
s3 = PsusannxS3(
    aws_access_key=AWS_ACCESS_KEY, 
    aws_secret_key=AWS_SECRET_KEY
)
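Hard-coding credentials in source files is best avoided. One common alternative (the environment-variable names below are purely illustrative, not something the package requires) is to read them from the environment before instantiating the class:

```python
import os

# Read credentials from environment variables (names are illustrative),
# falling back to placeholders when a variable is absent
AWS_ACCESS_KEY = os.environ.get("AWS_ACCESS_KEY", "<aws-access-key>")
AWS_SECRET_KEY = os.environ.get("AWS_SECRET_KEY", "<aws-secret-key>")
```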

Now that we have created a connection to the S3 service with the credentials provided, we can use the class to read/write files to S3. We will demonstrate this with a pickled list object, but the same steps & syntax apply to each of the other file types. For this step to work, you need an AWS account with a bucket created in S3, and the credentials provided above must have the required permissions to read from and write to the S3 service.

# Create a list that we want to put in S3
test_list = [1, 2, 3]

# Use the s3 instance of the class to put the list in S3
s3.write_pkl_to_s3(
    bucket_name="<s3-bucket-name>",
    object_name="<path/to/file/in/bucket/file_name.pkl>",
    data=test_list
)

We can now read the same file/list from the S3 bucket into the code, and even save the file to the local directory if necessary.

# Read the pickled list from S3 to the test_list variable 
# & also persist the file to the current directory using retain=True
test_list = s3.read_pkl_from_s3(
    bucket_name="<s3-bucket-name>",
    object_name="<path/to/file/in/bucket/file_name.pkl>",
    retain=True
)

# Print the list to screen
print(test_list)
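The retain=True behaviour above (keeping a local copy of the downloaded file) can be pictured as: write the downloaded bytes to a file in the current directory, then unpickle the object from that file. A local sketch of that idea, independent of the package and with no AWS calls:

```python
import pickle
from pathlib import Path

# Stand-in for a pickle payload downloaded from S3
payload = pickle.dumps([1, 2, 3])

# Persist the bytes locally, as retain=True would
local_file = Path("file_name.pkl")
local_file.write_bytes(payload)

# Load the object back from the retained file
test_list = pickle.loads(local_file.read_bytes())
print(test_list)  # [1, 2, 3]

local_file.unlink()  # clean up the demo file
```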

Notes

  • The package is quite restricted in what it can do, but it only needs to cover what the parent project requires, so there won't be much further development.

