
pypyr task runner AWS plugin. Automate services like ECS, S3, Beanstalk & EC2 with automation pipelines without having to write code.

Project description

pypyr aws plug-in

pypyr task runner for automation pipelines

Run anything on aws. No really, anything! If the aws api supports it, the pypyr aws plug-in supports it.

It's a pretty easy way of invoking the aws api as a step in a series of steps without having to write code.

pypyr is a cli & api to run pipelines defined in yaml.

All documentation for the pypyr aws plugin is here: https://pypyr.io/docs/plugins/aws/

Why use this when you could just use the aws-cli instead? The aws cli is all kinds of awesome, but more often than not it's not just one or two ad hoc aws cli or aws api calls you have to execute. Especially when automating and scripting, you actually need to run a sequence of commands, where the output of a previous command influences what you pass to the next command.

Sure, you can bash it up, and I do that too, but running it as a pipeline via pypyr has made my life quite a bit easier, because I don't have to hand-roll conditionals, error traps and input validation.
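For example, here is a minimal sketch of that kind of chaining: the first step lists the objects in a bucket, and the second step uses a field from the first step's response. It assumes the client response lands in awsClientOut in context, as per the plugin docs:

context_parser: pypyr.parser.keyvaluepairs
steps:
  - name: pypyraws.steps.client
    description: list objects so the next step can use the response
    in:
      awsClientIn:
        serviceName: s3
        methodName: list_objects_v2
        methodArgs:
          Bucket: '{bucket}'
  - name: pypyr.steps.echo
    description: consume the previous step's output
    in:
      # assumption: pypyraws.steps.client saves the boto3 response to awsClientOut
      echoMe: 'first key in {bucket} is {awsClientOut[Contents][0][Key]}'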


installation

$ pip install --upgrade pypyraws

pypyraws depends on the pypyr core. The pip command above will install pypyr for you if you don't have it already.

usage

Here is sample yaml for a pipeline that uses the pypyr aws plug-in to upload a file to s3:

context_parser: pypyr.parser.keyvaluepairs
steps:
  - name: pypyraws.steps.client
    description: upload a file to s3
    in:
      awsClientIn:
        serviceName: s3
        methodName: upload_file
        methodArgs:
          Filename: ./testfiles/arb.txt
          Bucket: '{bucket}'
          Key: arb.txt

If you saved this yaml as ./pipelines/go-go-s3.yaml, you can run it like this to upload arb.txt to your specified bucket:

$ pypyr go-go-s3 bucket=myuniquebucketname
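Anything in methodArgs passes straight through to the underlying boto3 method call, so you can set any argument the method accepts. Here is a sketch of the same awsClientIn, assuming boto3 upload_file's optional ExtraArgs parameter to switch on server-side encryption:

awsClientIn:
  serviceName: s3
  methodName: upload_file
  methodArgs:
    Filename: ./testfiles/arb.txt
    Bucket: '{bucket}'
    Key: arb.txt
    # ExtraArgs is a boto3 upload_file parameter, not something pypyraws adds
    ExtraArgs:
      ServerSideEncryption: AES256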

custom waiters

But wait, there's more! You can make a custom waiter for any aws client operation and wait for a specified field in the response to become the value you want it to be.

This is especially handy for things like Beanstalk, because Elastic Beanstalk does not have Waiters for environment creation.

The input context looks like this:

awsWaitFor:
  awsClientIn: # required. awsClientIn allows the same arguments as pypyraws.steps.client.
    serviceName: elasticbeanstalk
    methodName: describe_environments
    methodArgs:
      ApplicationName: my wonderful beanstalk default application
      EnvironmentNames:
        - my-wonderful-environment
      VersionLabel: v0.1
  waitForField: '{Environments[0][Status]}' # required. format expression for field name to check in awsClient response
  toBe: Ready # required. Stop waiting when waitForField equals this value
  pollInterval: 30 # optional. Seconds to wait between polling attempts. Defaults to 30 if not specified.
  maxAttempts: 10 # optional. Defaults to 10 if not specified.
  errorOnWaitTimeout: True # optional. Defaults to True if not specified. Stop processing if maxAttempts exhausted without reaching toBe value.
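
To run a custom waiter in a pipeline, wrap awsWaitFor in a step's in. A minimal sketch, assuming the custom waiter step name pypyraws.steps.wait_for from the plugin docs and a made-up environment name:

steps:
  - name: pypyraws.steps.wait_for
    description: wait for the beanstalk environment to reach Ready
    in:
      awsWaitFor:
        awsClientIn:
          serviceName: elasticbeanstalk
          methodName: describe_environments
          methodArgs:
            EnvironmentNames:
              - my-wonderful-environment # made-up name
        waitForField: '{Environments[0][Status]}'
        toBe: Ready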

Help!

Don't Panic! Check the pypyraws technical docs to begin. For help, community & talk, check pypyr twitter, or join the chat at the pypyr community discussion forum!

contribute

developers

For information on how to help with pypyr, and on running tests and coverage, do check out the pypyr contribution guide.

bugs

Well, you know. No one's perfect. Feel free to create an issue.

License

pypyr is free & open-source software distributed under the Apache 2.0 License.

Please see LICENSE in the root of the repo.

Copyright 2017 the pypyr contributors.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pypyraws-1.3.0.tar.gz (18.6 kB)

Built Distribution

pypyraws-1.3.0-py3-none-any.whl (20.4 kB)

File details

Details for the file pypyraws-1.3.0.tar.gz.

File metadata

  • Download URL: pypyraws-1.3.0.tar.gz
  • Upload date:
  • Size: 18.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7

File hashes

Hashes for pypyraws-1.3.0.tar.gz
  • SHA256: 63647c48379835095cba9acf1e7b5eacbb260f2cb6f31dbfa62d42dade56aac5
  • MD5: caf4ee7b3f98c2e43877ab5416fe7342
  • BLAKE2b-256: 55d7fbbe1de3472ebe8f7b44d4449c410700b0e4081c2b68b568c8a56a68fa69


File details

Details for the file pypyraws-1.3.0-py3-none-any.whl.

File metadata

  • Download URL: pypyraws-1.3.0-py3-none-any.whl
  • Upload date:
  • Size: 20.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7

File hashes

Hashes for pypyraws-1.3.0-py3-none-any.whl
  • SHA256: 02ca66c4f4d63b08c7f2227717166876393269c38129cb2c5911ff3ca06983f7
  • MD5: 8a3527b067bf08746f381f186d72572f
  • BLAKE2b-256: bcce86bdcd36574eae3bde9457b19ecb2ac546757813f65e50db8484bbfa5be0

