S3 Uploader for CI/CD pipeline
S3 Upload Script
Python script for uploading files to an Amazon S3 bucket. It supports uploading multiple files, specifying an upload prefix, and filtering files by file patterns. The project is aimed primarily at CI/CD pipelines and is published as a PyPI package at https://pypi.org/project/s3uploader-ci-cd/.
Requirements
- Python 3.9 or higher
- boto3 package for AWS S3 communication
- python-dotenv package for loading environment variables from a .env file
Installation
Install the package using pip:
pip install s3uploader-ci-cd
Usage
- Set up your AWS credentials as environment variables:
AWS_ACCESS_KEY_ID=your_access_key_id
AWS_SECRET_ACCESS_KEY=your_secret_access_key
Replace your_access_key_id and your_secret_access_key with your actual AWS access key and secret key.
- Run the script with the required command-line arguments:
python s3upload.py --bucket_name BUCKET_NAME --upload_prefix UPLOAD_PREFIX --source_dir SOURCE_DIR --include INCLUDE_PATTERN
Replace BUCKET_NAME with the name of your S3 bucket, UPLOAD_PREFIX with the desired prefix for the uploaded files, SOURCE_DIR with the relative path of the directory containing the files for upload, and INCLUDE_PATTERN with a comma-separated list of file patterns to include in the upload. The optional --region argument selects the AWS region (default: 'eu-west-1').
python s3upload.py --bucket_name my-bucket --region my_region --upload_prefix my-prefix --source_dir my-files --include "*.txt,*.pdf"
- This command will upload all .txt and .pdf files from the my-files directory to the my-bucket S3 bucket with the my-prefix prefix.
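The include filtering above can be sketched with the standard library alone. The function below is a hypothetical re-implementation of the pattern matching (the name files_matching is not part of the package), shown only to illustrate how a comma-separated pattern list selects files:

```python
import pathlib


def files_matching(source_dir: str, include_patterns: list[str]) -> list[pathlib.Path]:
    """Collect files under source_dir matching any of the glob patterns."""
    source = pathlib.Path(source_dir)
    matches: set[pathlib.Path] = set()
    for pattern in include_patterns:
        # rglob descends into subdirectories, mirroring a recursive upload
        matches.update(p for p in source.rglob(pattern) if p.is_file())
    return sorted(matches)
```

With patterns ["*.txt", "*.pdf"], only .txt and .pdf files under the directory are returned; everything else is skipped.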
GitLab CI/CD pipeline
- .gitlab-ci.yml
publish-to-s3:
  stage: publish
  tags:
    - docker-linux
  image: python:3.11
  before_script:
    # Set the environment variables for AWS credentials
    - export AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID
    - export AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
    - pip install --upgrade s3uploader-ci-cd
  script:
    - python3 -m s3uploader --bucket_name bucket_name --source_dir src --include file.json --upload_prefix test/my_path
Command-line Arguments
- --bucket_name: The name of the S3 bucket
- --region: The AWS region (default: 'eu-west-1')
- --upload_prefix: The S3 object key prefix for the uploaded files
- --upload_prefix_config_file: The path to the output_path config file containing the upload prefix (default: 'output_path.txt')
- --source_dir: The relative path of the directory containing the files for upload (default: 'dist/')
- --include: A comma-separated string of file patterns to include in the upload (default: '*')
- --exclude: A comma-separated string of file patterns to exclude from the upload (default: '')
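A minimal argparse sketch of how these arguments could be declared, using the defaults listed above. This is an illustration, not the package's actual source; in particular, which arguments are required is an assumption:

```python
import argparse


def comma_separated_string(string: str) -> list[str]:
    # "*.txt,*.pdf" -> ["*.txt", "*.pdf"]; empty string -> []
    return [item.strip() for item in string.split(",") if item.strip()]


def parse_args(sys_args):
    parser = argparse.ArgumentParser(description="Upload files to an S3 bucket")
    parser.add_argument("--bucket_name", required=True)  # assumption: required
    parser.add_argument("--region", default="eu-west-1")
    parser.add_argument("--upload_prefix", default="")
    parser.add_argument("--upload_prefix_config_file", default="output_path.txt")
    parser.add_argument("--source_dir", default="dist/")
    parser.add_argument("--include", type=comma_separated_string, default="*")
    parser.add_argument("--exclude", type=comma_separated_string, default="")
    return parser.parse_args(sys_args)
```

Because the defaults are strings, argparse runs them through comma_separated_string as well, so --include defaults to ["*"] and --exclude to an empty list.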
Functions
The script includes the following functions:
- comma_separated_string(string: str): Converts a comma-separated string into a list of strings.
- parse_args(sys_args): Parses command-line arguments for the script.
- upload_file(bucket_name: str, file_path: str, key: str): Uploads a file to an AWS S3 bucket using the regular upload method.
- get_files_to_upload(source_path: pathlib.Path, include_pattern: list[str]): Retrieves a list of files in the source directory that match the include patterns.
- upload_files_to_s3(bucket_name: str, files: list[pathlib.Path], upload_prefix: str, source_path: pathlib.Path): Uploads each file in the given list to an AWS S3 bucket.
- construct_source_path_for_upload(source_dir: str): Constructs the absolute path for the source directory of files to be uploaded.
- construct_upload_prefix(upload_prefix: str, output_path_config: pathlib.Path): Constructs the final upload prefix for the files in the AWS S3 bucket.
- main(bucket_name: str, upload_prefix: str, upload_prefix_config_file: str, source_dir: str, include_pattern: str): Main function that uploads files to an AWS S3 bucket.
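Given the signatures above, the object key for each file is presumably the upload prefix joined with the file's path relative to the source directory. A hypothetical reconstruction of that key-building step (the name object_key and the exact joining rules are assumptions, not the package's code):

```python
import pathlib


def object_key(upload_prefix: str, file_path: pathlib.Path, source_path: pathlib.Path) -> str:
    """Build an S3 object key: upload prefix + path relative to the source dir."""
    # as_posix() ensures forward slashes in keys even on Windows
    relative = file_path.relative_to(source_path).as_posix()
    if upload_prefix:
        return f"{upload_prefix.rstrip('/')}/{relative}"
    return relative
```

For example, with prefix "my-prefix" and file my-files/a/b.txt under source directory my-files, the resulting key would be "my-prefix/a/b.txt".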
Development
Documentation about the development setup for this project is in CONTRIBUTING.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file s3uploader_ci_cd-1.0.4.tar.gz.
File metadata
- Download URL: s3uploader_ci_cd-1.0.4.tar.gz
- Upload date:
- Size: 5.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 3639e551e87e67af5b2cbac774095a2377209163690dd8031b25e3a72c57faf5 |
| MD5 | cbe59d4657ed092963a50b3d6d360142 |
| BLAKE2b-256 | afb908c1bb361e2b82fda8a0accb7d264e3aff25e046902ce20d784dcf1c05e0 |
File details
Details for the file s3uploader_ci_cd-1.0.4-py3-none-any.whl.
File metadata
- Download URL: s3uploader_ci_cd-1.0.4-py3-none-any.whl
- Upload date:
- Size: 6.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e3a4ff42beb06a51dbc0effcf53ffe43b8c8feb6cc1512453dc4a3d8e6233072 |
| MD5 | 06b4bdc56277aabfc0d1161ea00565a6 |
| BLAKE2b-256 | 9dc49971594d027f56072e810b869427fdbd6f26b9d24a4e666c05fbcce53b03 |