
A package for reading/writing certain file types to the AWS S3 service using boto3.

Project description

psusannx_s3

A package that allows a connection to the Amazon Simple Storage Service (S3) to be made, provided a valid set of AWS credentials. The connection is set up through a Python class which, once instantiated, provides easy read/write functions for certain file extensions (.csv, .json, .pkl, .h5). It uses the boto3 package to perform these actions. The docs for using boto3 can be found here.

This package was created to be used as a subpackage in a wider project - PSUSANNX.

Package classes

  • PsusannxS3

Installation

pip install psusannx-s3

Usage

# Import the function from the package
from psusannx_s3 import PsusannxS3

# Get some info about the function
help(PsusannxS3) 

Set up a connection to the S3 service by instantiating the class with valid credentials, and store that connection in a variable named s3.

# Store the AWS credentials as variables
AWS_ACCESS_KEY = "<aws-access-key>"
AWS_SECRET_KEY = "<aws-secret-key>"

# Set up the connection to AWS s3 through the PsusannxS3 class
s3 = PsusannxS3(
    aws_access_key=AWS_ACCESS_KEY, 
    aws_secret_key=AWS_SECRET_KEY
)

Now that we have created a connection to the S3 service with the provided credentials, we can use the class to read/write files to S3. We will demonstrate this with a pickled list object, but the same steps & syntax apply to each of the other file types. For this step to work, you need an AWS account with a bucket created in S3, and the credentials provided above must have the required permissions to read from & write to the S3 service.

# Create a list that we want to put in S3
test_list = [1, 2, 3]

# Use the s3 instance of the class to put the list in S3
s3.write_pkl_to_s3(
    bucket_name="<s3-bucket-name>",
    object_name="<path/to/file/in/bucket/file_name.pkl>",
    data=test_list
)
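Under the hood, a write helper like this presumably pickles the object into bytes before uploading them as the S3 object's body via boto3. A minimal stdlib-only sketch of that serialization step (the helper name serialize_for_s3 is ours for illustration, not part of psusannx_s3, and the upload call is shown only as a comment):

```python
import pickle

def serialize_for_s3(data):
    """Pickle a Python object into the raw bytes that would be
    uploaded as the S3 object's body (illustrative helper only)."""
    return pickle.dumps(data)

payload = serialize_for_s3([1, 2, 3])

# The actual upload would then look roughly like:
# boto3.client("s3").put_object(
#     Bucket="<s3-bucket-name>",
#     Key="path/to/file/in/bucket/file_name.pkl",
#     Body=payload,
# )

# Unpickling the bytes recovers the original object
print(pickle.loads(payload))  # → [1, 2, 3]
```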

We can now read the same file/list from the S3 bucket into the code, and even save the file to the local directory if necessary.

# Read the pickled list from S3 into the test_list variable 
# & also persist the file to the current directory using retain=True
test_list = s3.read_pkl_from_s3(
    bucket_name="<s3-bucket-name>",
    object_name="<path/to/file/in/bucket/file_name.pkl>",
    retain=True
)

# Print the list to screen
print(test_list)
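The same pattern applies to the other supported extensions: each maps to a standard serializer (csv, json, pickle, and HDF5 for .h5). As an illustration, here is a stdlib sketch of the JSON round trip that methods analogous to the pickle ones would presumably perform (the method names write_json_to_s3/read_json_from_s3 are our assumption, inferred from the pickle pair above):

```python
import json

# Serialize a dict into the bytes that would be uploaded to S3
record = {"name": "test", "values": [1, 2, 3]}
body = json.dumps(record).encode("utf-8")

# Downloading & deserializing recovers the original structure
restored = json.loads(body.decode("utf-8"))
print(restored == record)  # → True
```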

Notes

  • The package is quite restricted in what it can do, but it only needs to cover what the parent project requires, so there won't be much further development.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

psusannx_s3-0.0.6.tar.gz (4.8 kB view details)

Uploaded Source

Built Distribution

psusannx_s3-0.0.6-py3-none-any.whl (4.9 kB view details)

Uploaded Python 3

File details

Details for the file psusannx_s3-0.0.6.tar.gz.

File metadata

  • Download URL: psusannx_s3-0.0.6.tar.gz
  • Upload date:
  • Size: 4.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for psusannx_s3-0.0.6.tar.gz
Algorithm Hash digest
SHA256 90a688cac7fbc94529bc716499b693aa2a1d9592fe9952438cff82c105ad0a56
MD5 a42107d280705c690b0e2267801cc0d8
BLAKE2b-256 e2dd1ef8f6d4ca4f7b26d9224c8042106e2ee492d2214618e96a00f089797ba5

See more details on using hashes here.

File details

Details for the file psusannx_s3-0.0.6-py3-none-any.whl.

File metadata

  • Download URL: psusannx_s3-0.0.6-py3-none-any.whl
  • Upload date:
  • Size: 4.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.4

File hashes

Hashes for psusannx_s3-0.0.6-py3-none-any.whl
Algorithm Hash digest
SHA256 02fd27217b902a3ab1509b148ff6533a63719b204a7aed1a5806de72b67847d2
MD5 bad684572a50ea83df34d2f15053f772
BLAKE2b-256 7cb02ff875995b5a770292726d2e92af7f3e79afce3feae9df26ba97f2af24d0

See more details on using hashes here.
