
Addons for opentaskpy, giving it the ability to push/pull via AWS S3, and pull variables from AWS SSM Parameter Store.

Project description


This repository contains addons that allow integration with AWS services via Open Task Framework (OTF).

Open Task Framework (OTF) is a Python based framework to make it easy to run predefined file transfers and scripts/commands on remote machines.

These addons provide several additional features:

  • A new plugin for SSM Parameter Store to pull dynamic variables
  • A new remote handler to push/pull files via AWS S3
  • A new remote handler to trigger AWS Lambda functions

AWS Credentials

This package uses boto3 to communicate with AWS.

Credentials can be set via config, using equivalently named variables alongside the protocol definition, e.g.:

"protocol": {
    "name": "opentaskpy.addons.aws.remotehandlers.s3.S3Transfer",
    "access_key_id": "some_key",
    "secret_access_key": "some_secret_key",
    "assume_role_arn": "arn:aws:iam::000000000000:role/some_role",
    "region_name": "eu-west-1"
}

If credentials are not set in the config, the standard AWS environment variables will be used if they are set. Otherwise, if running on AWS, the IAM role of the machine running OTF will be used.
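
For example, a protocol definition with no credentials specified; boto3 will then resolve them from the environment variables or the instance's IAM role instead:

"protocol": {
    "name": "opentaskpy.addons.aws.remotehandlers.s3.S3Transfer",
    "region_name": "eu-west-1"
}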

Transfers

Transfers are defined in the same way as a normal SSH-based transfer.

As part of an upload, the bucket-owner-full-control ACL flag is applied to all files. This can be disabled by setting disableBucketOwnerControlACL to true in the protocol definition.
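
For example, a protocol definition with the ACL disabled:

"protocol": {
    "name": "opentaskpy.addons.aws.remotehandlers.s3.S3Transfer",
    "disableBucketOwnerControlACL": true
}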

Supported features

  • Plain file watch
  • File watch/transfer with file size and age constraints (see the sketch after this list)
  • move, rename & delete post copy actions
  • Touching empty files after transfer, e.g. .fin files used as completion flags
  • Touching empty files as an execution
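
A rough sketch of a source definition combining a file watch with size/age constraints and a post copy action. The conditionals and postCopyAction attributes come from the core OTF transfer schema rather than this addon, so treat the exact attribute names here as assumptions and check the OTF documentation:

"source": {
  "bucket": "test-bucket",
  "directory": "src",
  "fileRegex": ".*\\.txt",
  "fileWatch": {
    "timeout": 15,
    "directory": "src",
    "fileRegex": ".*\\.txt"
  },
  "conditionals": {
    "size": { "gt": 10 },
    "age": { "gt": 60 }
  },
  "postCopyAction": {
    "action": "move",
    "destination": "src/archive/"
  },
  "protocol": {
    "name": "opentaskpy.addons.aws.remotehandlers.s3.S3Transfer"
  }
}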

Limitations

  • No support for log watch

Configuration

JSON configs for transfers can be defined as follows:

Example File Watch Only

"source": {
  "bucket": "test-bucket",
  "fileWatch": {
    "timeout": 15,
    "directory": "src",
    "fileRegex": ".*\\.txt"
  },
  "protocol": {
    "name": "opentaskpy.addons.aws.remotehandlers.s3.S3Transfer"
  }
}

Example S3 Download

"source": {
  "bucket": "some-bucket",
  "directory": "src",
  "fileRegex": ".*\\.txt",
  "protocol": {
    "name": "opentaskpy.addons.aws.remotehandlers.s3.S3Transfer"
  }
}

Example S3 Upload

"destination": [
    {
        "bucket": "some-bucket",
        "directory": "dest",
        "protocol": {
          "name": "opentaskpy.addons.aws.remotehandlers.s3.S3Transfer"
        }
    }
]

Example S3 upload with flag files

"destination": [
    {
        "bucket": "some-bucket",
        "directory": "dest",
        "flag": {
          "fullPath": "dest/some_fin.flg"
        },
        "protocol": {
          "name": "opentaskpy.addons.aws.remotehandlers.s3.S3Transfer"
        }
    }
]

Executions

The Lambda remote handler allows AWS Lambda functions to be called. When provided with just a functionArn, the function will be called with no payload. If there's a payload to pass in, use the payload attribute in the execution definition to specify a JSON object to pass into the function.

Asynchronous vs Synchronous Execution

Lambda functions can be called with either an invocationType of Event (default if not specified) or RequestResponse.

Event is asynchronous: it triggers the Lambda function but does not check that it ran successfully. This means it's up to you to make sure that you have appropriate monitoring of your Lambda functions.

RequestResponse will block until the Lambda function either completes or times out. Boto3 has a timeout of 60 seconds, so this cannot be used for long-running functions (over 1 minute). This also causes issues when used in conjunction with batches and timeouts: since the request blocks, the thread cannot be killed by the batch thread, meaning it will block any further execution until 60 seconds after triggering the Lambda function.
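
For a synchronous call, the execution definition looks the same as the asynchronous example further down, just with invocationType set to RequestResponse:

{
  "type": "execution",
  "functionArn": "arn:aws:lambda:eu-west-1:000000000000:function:my-function",
  "invocationType": "RequestResponse",
  "protocol": {
    "name": "opentaskpy.addons.aws.remotehandlers.lambda.LambdaExecution"
  }
}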

Example S3 execution touching a flag file

{
  "type": "execution",
  "bucket": "test-bucket",
  "key": "test_key.flg",
  "protocol": {
    "name": "opentaskpy.addons.aws.remotehandlers.s3.S3Transfer"
  }
}

Example Lambda function call

{
  "type": "execution",
  "functionArn": "arn:aws:lambda:eu-west-1:000000000000:function:my-function",
  "invocationType": "Event",
  "payload": {
    "file-name": "some_file.txt"
  },
  "protocol": {
    "name": "opentaskpy.addons.aws.remotehandlers.lambda.LambdaExecution"
  }
}

Download files

Download the file for your platform.

Source Distribution

otf_addons_aws-24.44.0.tar.gz (49.5 kB)


File details

Details for the file otf_addons_aws-24.44.0.tar.gz.

File metadata

  • Download URL: otf_addons_aws-24.44.0.tar.gz
  • Size: 49.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.4

File hashes

Hashes for otf_addons_aws-24.44.0.tar.gz:

  • SHA256: ea702c33b2f4c582948e92836d36a67a2c9803941c4df735c53ea53f70876d0c
  • MD5: bc88595d51c5f56a4545e0968de0c4bc
  • BLAKE2b-256: 23833182fdda38fad489b3af6b9076766fdac03ac2a15271ee4f4fe3f3454cf4

