B.CfnS3LargeDeployment

b-cfn-s3-large-deployment - an AWS CDK custom resource that handles the deployment of large files to an S3 bucket.

Description

This custom resource deploys local files or S3 bucket objects to a destination bucket while retaining their file-system hierarchy.

Two types of deployment sources are available:

  • BucketDeploymentSource - uses one or more objects from another S3 bucket as the source for the deployment to a destination bucket. Only files up to 5 TB are supported, due to the S3 object size limit;
  • AssetDeploymentSource - uses the aws-cdk.aws-s3-assets library to deploy local files as .zip files to the assets bucket, from which the extracted contents are moved to the destination bucket. Asset files larger than 2 GB are not supported.

See "Known limits" sections below for more information on this resource limitations.

This resource's implementation is based on the GitHub pull request https://github.com/aws/aws-cdk/pull/15220.

Remarks

Biomapas aims to modernise the life-science industry by sharing its IT knowledge with other companies and the community.

Related technology

  • Python >= 3.8
  • Amazon Web Services (AWS)

Assumptions

The project assumes that the person working with it has basic knowledge of Python programming.

Useful sources

See the code documentation for any additional sources and references. Also see the aws-cdk.aws-s3-deployment library for more information, as this implementation is based on work done there.

Install

Use the package manager pip to install this package, either directly from source or from the PyPI repository.

pip install .

Or

pip install b-cfn-s3-large-deployment

Usage & Examples

This AWS CloudFormation custom resource is used much like any other CDK resource: initialize it within any valid CDK scope, give it a unique name/id, and provide the source(s) and the destination for the deployment.

The deployment of files depends on AWS Lambda's /tmp directory and its size limits. For large files, the /tmp directory size can be increased via the ephemeral storage setting (DeploymentProps.ephemeral_storage_size) supported by AWS Lambda functions.

Optionally, if files even larger than what AWS Lambda's /tmp directory supports need to be deployed, AWS Elastic File System (EFS) can be enabled by setting the DeploymentProps.use_efs and DeploymentProps.efs_props fields, as sketched below.
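
A minimal sketch of these two options (assuming DeploymentProps accepts an aws_cdk.core.Size for ephemeral_storage_size; the EFS-specific settings are elided here):

from aws_cdk.core import Size

from b_cfn_s3_large_deployment.deployment_props import DeploymentProps

# Raise the handler's /tmp size to fit larger archives.
props = DeploymentProps(
    ephemeral_storage_size=Size.gibibytes(4)
)

# Or, for files beyond what /tmp supports, back the handler with EFS.
props = DeploymentProps(
    use_efs=True,
    efs_props=...  # EFS-specific settings; see the code documentation
)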

A simple example of S3LargeDeploymentResource usage is shown below:

from aws_cdk.core import App, Stack, Construct
from aws_cdk.aws_s3 import Bucket

from b_cfn_s3_large_deployment.resource import S3LargeDeploymentResource
from b_cfn_s3_large_deployment.deployment_props import DeploymentProps
from b_cfn_s3_large_deployment.deployment_source import AssetDeploymentSource, BucketDeploymentSource


class ExampleStack(Stack):
    def __init__(self, scope: Construct, id: str):
        super().__init__(scope, id)

        S3LargeDeploymentResource(
            scope=self,
            name='ExampleLargeDeployment',
            sources=[
                AssetDeploymentSource(path='/path/to/your/local/directory'),
                AssetDeploymentSource(path='/path/to/your/local/zip/file.zip'),
                BucketDeploymentSource(
                    bucket=...,
                    zip_object_key='your-source-bucket-object-key'
                ),
                ...
            ],
            destination_bucket=Bucket(...),
            props=DeploymentProps(...)
        )
        ...


app = App()
ExampleStack(app, 'ExampleStack')

app.synth()

Here, three variants of the supported sources are used:

  1. a whole local directory, given as a path, which is then deployed to the assets bucket as a .zip object:

    AssetDeploymentSource(path='/path/to/your/local/directory')
    
  2. a single .zip file, given as a path, which is then deployed to the assets bucket:

    AssetDeploymentSource(path='/path/to/your/local/zip/file.zip')
    
  3. a single .zip S3 object in the source bucket, given as an object key; no further pre-processing is applied in this case:

    BucketDeploymentSource(
       bucket=...,
       zip_object_key='your-source-bucket-object-key'
    )
    

In all of these cases, the final source .zip objects are extracted inside the S3LargeDeploymentResource handler function's storage, and their contents are then deployed to the configured destination while maintaining the original file structure of the source contents.
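
For instance, given a hypothetical local directory deployed with AssetDeploymentSource:

/path/to/your/local/directory
├── index.html
└── static/
    └── app.js

the destination bucket ends up with the objects index.html and static/app.js.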

Known limits

  • aws_cdk.aws_s3_assets.Asset supports up to 2 GB per asset (limited by the NodeJS implementation).
  • An S3 bucket supports objects up to 5 TB in size.

Contribution

Found a bug? Want to add or suggest a new feature? Contributions of any kind are gladly welcome. Contact us, create a pull request, or open an issue ticket.

Release history

1.2.0

  • Added ephemeral storage support.

1.1.3

  • Fixed a dependency bug between S3LargeDeploymentResource and its handler function.
  • Upgraded testing pipeline to use B.AwsCdkParallel.
