
Data common code for AWS Cloud Services by Equinox

Project description


datacoco-cloud contains interaction classes for S3, Athena, SES, SNS, SQS, ECS, EMR, and CloudWatch Logs.

Installation

datacoco-cloud requires Python 3.6+

python3 -m venv <virtual env name>
source <virtual env name>/bin/activate
pip install datacoco-cloud
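
After installation, a quick import check confirms the package is available; the top-level module name datacoco_cloud is assumed here based on the usage examples below:

python -c "import datacoco_cloud"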

Usage

S3toS3Interaction

Please note that all required AWS IAM permissions and S3 bucket policies must be in place for this utility to work.

Sample Code

# Import the class first
from datacoco_cloud.s3_to_s3_interaction import S3toS3Interaction

# Instantiate with your key pairs
s3toS3 = S3toS3Interaction(<source_aws_key>,
                           <source_aws_secret>,
                           <target_aws_key>,
                           <target_aws_secret>,
                           <source_aws_region>(optional default='us-east-1'),
                           <target_aws_region>(optional default='us-east-1')
                           )

# Copying the files
s3toS3.duplicate_objects(<source_bucket>,
                         <target_bucket>,
                         <source_bucket_prefix>,
                         <target_path>,
                         <source_bucket_suffix>(optional default=''))


# Moving the files
# This deletes the file from the source after copying to the target
s3toS3.move_objects(<source_bucket>,
                    <target_bucket>,
                    <source_bucket_prefix>,
                    <target_path>,
                    <source_bucket_suffix>(optional default=''))

Terms

  • source_aws_key - AWS key ID from source account

  • source_aws_secret - AWS key secret from source account

  • target_aws_key - AWS key ID from target account

  • target_aws_secret - AWS key secret from target account

  • source_aws_region - AWS region of the source S3 bucket

  • target_aws_region - AWS region of the target S3 bucket

  • source_bucket - S3 bucket of the source file

  • target_bucket - S3 bucket where the files are going to be transferred

  • source_bucket_prefix - The prefix of the files to transfer from the source

    Note: Add / at the end to specify a folder e.g (files/)

  • target_path - The path in the target bucket where the files will be transferred

    Note: if the folder does not exist, it will be created automatically

  • source_bucket_suffix - The suffix of the files to transfer from the source (a filled-in example follows this list)
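
As a concrete illustration, here is a minimal sketch of a copy with the placeholders filled in; the credentials, bucket names, prefix, and suffix below are example values only, not real resources:

from datacoco_cloud.s3_to_s3_interaction import S3toS3Interaction

# Example credentials and regions (all placeholders)
s3toS3 = S3toS3Interaction("SOURCE_KEY_ID",
                           "SOURCE_SECRET",
                           "TARGET_KEY_ID",
                           "TARGET_SECRET",
                           "us-east-1",
                           "us-west-2")

# Copy every file under exports/ ending in .csv from the source bucket
# into the backups/ path of the target bucket
s3toS3.duplicate_objects("my-source-bucket",
                         "my-target-bucket",
                         "exports/",
                         "backups/",
                         ".csv")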

Quickstart

python3 -m venv <virtual env name>
source <virtual env name>/bin/activate
pip install --upgrade pip
pip install -r requirements_dev.txt

Development

Getting Started

It is recommended to use the steps below to set up a virtual environment for development:

python3 -m venv <virtual env name>
source <virtual env name>/bin/activate
pip install -r requirements.txt

Testing

pip install -r requirements_dev.txt

To run the testing suite, run tox, or run the tests directly with python -m unittest discover tests
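
To run a single test module rather than the whole suite, unittest can target it directly; the test module name below is hypothetical:

python -m unittest tests.test_s3_interaction -v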

Contributing

Contributions to datacoco_cloud are welcome!

Please refer to the contributing guidelines for help setting up your development environment.
