
A robust and scalable solution to ingest files from SFTP to AWS S3.

Project description


Welcome to sftp_to_s3 Documentation

sftp_to_s3 is an open source solution to ingest large amounts of data from an SFTP server to AWS S3.

How it Works

This solution assumes that the SFTP server's folder structure follows a partition strategy. For example, the folder structure might look like this:

/data/2022-01-01/many-folder-many-files...
/data/2022-01-02/many-folder-many-files...
/data/2022-01-03/many-folder-many-files...
/data/...
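For a date-partitioned layout like the one above, the partition directories can be enumerated deterministically. A minimal sketch in pure Python (`daily_partitions` is a hypothetical helper, not part of the library's API):

```python
from datetime import date, timedelta

def daily_partitions(root: str, start: date, end: date) -> list[str]:
    """Enumerate date-partitioned directories like /data/2022-01-01, inclusive of both ends."""
    n_days = (end - start).days + 1
    return [f"{root}/{(start + timedelta(days=i)).isoformat()}" for i in range(n_days)]

print(daily_partitions("/data", date(2022, 1, 1), date(2022, 1, 3)))
# ['/data/2022-01-01', '/data/2022-01-02', '/data/2022-01-03']
```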

Once a partition of files is created on the SFTP server, it should not change too frequently. Files in each partition are grouped into batches so that the total number of files and the total file size stay within reasonable bounds. A coordinator detects the delta between the partitions on the SFTP server and those already on AWS S3, then groups the new files into batches. The batch data is then sent to a cluster of workers that download files in parallel, up to what the SFTP server's network bandwidth allows.
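The batching step described above can be sketched as follows; `max_files` and `max_size` are hypothetical knobs for illustration, not the library's actual parameter names:

```python
def group_into_batches(files, max_files=1000, max_size=100 * 1024**2):
    """Group (path, size) pairs into batches bounded by file count and total byte size."""
    batches, current, current_size = [], [], 0
    for path, size in files:
        # start a new batch when adding this file would exceed either bound
        if current and (len(current) >= max_files or current_size + size > max_size):
            batches.append(current)
            current, current_size = [], 0
        current.append(path)
        current_size += size
    if current:
        batches.append(current)
    return batches

files = [("/data/a.txt", 60), ("/data/b.txt", 50), ("/data/c.txt", 10)]
print(group_into_batches(files, max_size=100))
# [['/data/a.txt'], ['/data/b.txt', '/data/c.txt']]
```

Note that a single file larger than `max_size` still gets a batch of its own, so no file is ever dropped.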

The coordinator and workers can be deployed as either AWS Lambda functions or AWS Batch jobs. The solution also uses DynamoDB to track the status of each partition and each batch to ensure data integrity.

The following examples illustrate the tracker data used by this solution:

Partition Tracker Data:

{
    "n_batches": 2, # this partition has 2 batches to download
    "n_succeeded_batches": 2, # number of succeeded batches
    "sftp_dir": "/home/username/part1", # SFTP directory of this partition
    "s3_dir": "s3://bucket/projects/sftp_to_s3/download/part1", # where the downloads are stored
    "n_files": 6, # total number of files in this partition
    "total_size": 5430500 # total size of files in this partition, in bytes
}
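A partition is complete exactly when every one of its batches has succeeded. A hypothetical helper that derives this from the counters above:

```python
def partition_is_complete(tracker: dict) -> bool:
    """True when every batch in the partition has been downloaded successfully."""
    return tracker["n_succeeded_batches"] == tracker["n_batches"]

print(partition_is_complete({"n_batches": 2, "n_succeeded_batches": 2}))  # True
print(partition_is_complete({"n_batches": 2, "n_succeeded_batches": 1}))  # False
```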

Request Tracker Data:

# this is the first batch
{
    "partition_key": "part1", # partition key
    "request_id": "request1", # request id / batch id
    "sftp_dir": "/home/username/part1", # SFTP directory of this partition
    "s3_dir": "s3://bucket/projects/sftp_to_s3/download/part1", # where you store the downloads
    "s3uri_batch": "s3://bucket/projects/sftp_to_s3/requests/part1/request1.json", # where you store the batch data
    "n_files": 3, # total number of files in this batch
    "total_size": 3442228 # total size of files in this batch, in bytes (~3.4 MB)
}
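An item like this can be built mechanically from the partition's tracker and one batch of files. A sketch under assumed names (`make_request_item` and `requests_root` are illustrative, not part of the library):

```python
def make_request_item(partition_key, request_id, part, batch, requests_root):
    """Build a Request Tracker item for one batch; `batch` is a list of (path, size) pairs."""
    return {
        "partition_key": partition_key,
        "request_id": request_id,
        "sftp_dir": part["sftp_dir"],
        "s3_dir": part["s3_dir"],
        "s3uri_batch": f"{requests_root}/{partition_key}/{request_id}.json",
        "n_files": len(batch),
        "total_size": sum(size for _, size in batch),
    }

item = make_request_item(
    "part1",
    "request1",
    {"sftp_dir": "/home/username/part1",
     "s3_dir": "s3://bucket/projects/sftp_to_s3/download/part1"},
    [("/home/username/part1/file0001.txt", 1000)],
    "s3://bucket/projects/sftp_to_s3/requests",
)
print(item["s3uri_batch"])
# s3://bucket/projects/sftp_to_s3/requests/part1/request1.json
```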

Batch data JSON file:

# this is the sample content of the batch data
# it is the "s3uri_batch" field in Request Tracker Data
{
    "dir_root": "/home/username/part1", # SFTP directory of this partition
    "s3uri_root": "s3://bucket/projects/sftp_to_s3/download/part1", # where you store the downloads
    "files": [ # list of files to download
        "/home/username/part1/file0001.txt",
        "/home/username/part1/file0002.txt",
        "/home/username/part1/file0003.txt"
    ],
    "n_files": 3, # total number of files in this batch
    "total_size": 3442228 # total size of files in this batch, in bytes (~3.4 MB)
}
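A worker can turn this file into a download plan by mirroring each SFTP path under `s3uri_root`. The planning step is pure Python and sketched below; the actual transfer would use an SFTP client such as paramiko plus boto3, and those calls are omitted here:

```python
def download_plan(batch: dict) -> list[tuple[str, str]]:
    """Map each SFTP file in a batch to its destination S3 URI."""
    root = batch["dir_root"].rstrip("/")
    s3root = batch["s3uri_root"].rstrip("/")
    # preserve each file's path relative to the partition root
    return [(p, f"{s3root}/{p[len(root):].lstrip('/')}") for p in batch["files"]]

batch = {
    "dir_root": "/home/username/part1",
    "s3uri_root": "s3://bucket/projects/sftp_to_s3/download/part1",
    "files": ["/home/username/part1/file0001.txt"],
}
print(download_plan(batch))
# [('/home/username/part1/file0001.txt',
#   's3://bucket/projects/sftp_to_s3/download/part1/file0001.txt')]
```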

Install

sftp_to_s3 is released on PyPI, so all you need is:

$ pip install sftp_to_s3

To upgrade to the latest version:

$ pip install --upgrade sftp_to_s3

Project details


Download files

Download the file for your platform.

Source Distribution

sftp_to_s3-0.1.1.tar.gz (20.0 kB)


Built Distribution

sftp_to_s3-0.1.1-py2.py3-none-any.whl (20.6 kB)


File details

Details for the file sftp_to_s3-0.1.1.tar.gz.

File metadata

  • Download URL: sftp_to_s3-0.1.1.tar.gz
  • Size: 20.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.13

File hashes

Hashes for sftp_to_s3-0.1.1.tar.gz:

  • SHA256: 3ed194d209a7f63960fb50a08231f0da7cd691db930fe9cc20702b3a0161eba3
  • MD5: fd2821ed47da48d1399b03bffa29c655
  • BLAKE2b-256: a9a2a1634282ba8a48d2be031338f67ecab21c68f9dbc8dea03fdc5b6a39fe69
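After downloading, the archive can be verified against the SHA256 digest listed above using nothing but the standard library:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a file, reading it in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# sha256_of("sftp_to_s3-0.1.1.tar.gz") should equal the SHA256 value shown above
```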


File details

Details for the file sftp_to_s3-0.1.1-py2.py3-none-any.whl.

File metadata

  • Download URL: sftp_to_s3-0.1.1-py2.py3-none-any.whl
  • Size: 20.6 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.13

File hashes

Hashes for sftp_to_s3-0.1.1-py2.py3-none-any.whl:

  • SHA256: f715711ee7d1dd59aab40738bf80eeb982bf8d5774d788ba8741a029b4099eb8
  • MD5: 5893c63721842b4c510c6f69beef46d1
  • BLAKE2b-256: 2b3ea05710da6b59d4faa89d971f7fdb8d70c01ec5b8113d9f7eeb53acf8bf80

