
Async boto3 wrapper

Project description

Async AWS SDK for Python


This package is mostly just a wrapper combining the great work of boto3 and aiobotocore.

aiobotocore lets you use nearly all of the boto3 client commands in an async manner simply by prefixing the command with await.

With aioboto3 you can now use the higher-level APIs provided by boto3 in an asynchronous manner. I developed this mainly because I wanted to use the boto3 DynamoDB Table object in some async microservices.

While all boto3 resources should work, I haven't tested them all, so if what you're after is not in the table below, try it out; if it works, drop me an issue with a simple test case and I'll add it to the table.

Service                      Status
DynamoDB Service Resource    Tested and working
DynamoDB Table               Tested and working
S3                           Working
Kinesis                      Working
SSM Parameter Store          Working
Athena                       Working

Example

A simple example of using aioboto3 to put items into a DynamoDB table:

import asyncio
import aioboto3
from boto3.dynamodb.conditions import Key


async def main():
    async with aioboto3.resource('dynamodb', region_name='eu-central-1') as dynamo_resource:
        table = dynamo_resource.Table('test_table')

        await table.put_item(
            Item={'pk': 'test1', 'col1': 'some_data'}
        )

        result = await table.query(
            KeyConditionExpression=Key('pk').eq('test1')
        )
        print(result['Items'])

        # Example batch write
        more_items = [{'pk': 't2', 'col1': 'c1'},
                      {'pk': 't3', 'col1': 'c3'}]
        async with table.batch_writer() as batch:
            for item_ in more_items:
                await batch.put_item(Item=item_)

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

# Outputs:
#  [{'col1': 'some_data', 'pk': 'test1'}]

Things that either don't work or have been patched

As this library literally wraps boto3, it's inevitable that some things won't magically be async.

Fixed:

  • s3_client.download_file* This is performed by the s3transfer module. – Patched with get_object

  • s3_client.upload_file* This is performed by the s3transfer module. – Patched with custom multipart upload

  • s3_client.copy This is performed by the s3transfer module. – Patched to use get_object -> upload_fileobj

  • dynamodb_resource.Table.batch_writer This now returns an async context manager which performs the same function

  • Resource waiters - You can now await waiters which are part of resource objects, not just client waiters, e.g. await dynamodbtable.wait_until_exists()

  • Resource object properties are normally autoloaded; now they are all coroutines, and the metadata they come from is loaded on first await and cached thereafter.

  • S3 Bucket.objects object now works and has been asyncified. Examples here - https://aioboto3.readthedocs.io/en/latest/usage.html#s3-resource-objects

Amazon S3 Client-Side Encryption

Boto3 doesn’t support AWS client-side encryption, so until it does I’ve added basic support for it. Docs are here: CSE

CSE requires the Python cryptography library, so installing with pip install aioboto3[s3cse] will also pull in cryptography.

This library currently supports client-side encryption using KMS-managed master keys performing envelope encryption, with either AES/CBC/PKCS5Padding or, preferably, AES/GCM/NoPadding. The files generated are compatible with the Java Encryption SDK, so I assume they are compatible with the Ruby, PHP, Go and C++ libraries as well.
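Envelope encryption means each object is encrypted with its own random data key, and only that data key is encrypted with the KMS master key. A minimal sketch of the AES/GCM/NoPadding step using the cryptography library; the KMS wrapping of the data key is omitted, and the key is generated locally purely for illustration:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Envelope encryption: a fresh random data key per object. In the real CSE
# flow this key would be generated and wrapped by KMS, not created locally.
data_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)  # GCM nonce, stored alongside the ciphertext

message = b'object contents'
ciphertext = AESGCM(data_key).encrypt(nonce, message, None)

# Decryption unwraps the data key (via KMS) and reverses the AEAD step
decrypted = AESGCM(data_key).decrypt(nonce, ciphertext, None)
assert decrypted == message
```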

Non-KMS-managed keys are not yet supported, but if you have a use for them, raise an issue and I’ll look into it.

Documentation

Docs are here - https://aioboto3.readthedocs.io/en/latest/

Examples here - https://aioboto3.readthedocs.io/en/latest/usage.html

Features

  • Closely mimics the usage of boto3.

Todo

  • More examples

  • Set up docs

  • Look into monkey-patching the AWS X-Ray SDK to be more async, if it needs to be.

Credits

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template. It also makes use of the aiobotocore and boto3 libraries. All the credit goes to them; this is mainly a wrapper with some examples.

History

6.4.0 (2019-06-20)

  • Updated `upload_fileobj` to upload multiple parts concurrently to make best use of the available bandwidth

6.2.0 (2019-05-07)

  • @inadarei Added batch writing example

  • Added waiter support in resources

  • Made resource object properties coroutines and lazy load data when called

6.2.0 (2019-02-27)

  • Added S3 Client side encryption functionality

6.1.0 (2019-02-13)

  • @nvllsvm cleaned up the packaging, requirements, Travis, Sphinx…

  • Unvendored aiobotocore

6.0.1 (2018-11-22)

  • Fixed dependencies

6.0.0 (2018-11-21)

  • Fixed readthedocs

  • Vendored aiobotocore for later botocore version

5.0.0 (2018-10-12)

  • Updated lots of dependencies

  • Changed s3.upload_fileobj from using put_object to doing a multipart upload

  • Created s3.copy shim that runs get_object then does multipart upload, could do with a better implementation though.

4.1.2 (2018-08-28)

  • updated pypi credentials

4.1.0 (2018-08-28)

  • aiobotocore dependency bump

4.0.2 (2018-08-03)

  • Dependency bump

4.0.0 (2018-05-09)

  • Dependency bump

  • Now using aiobotocore 0.8.0

  • Dropped < py3.5 support

  • Now using async def / await syntax

  • Fixed boto3 dependency so it only uses a boto3 version supported by aiobotocore’s max botocore dependency

  • Important: `__call__` in `AIOServiceAction` tried to yield from a coroutine in a non-coroutine. This code shouldn’t be hit anymore, but I can’t guarantee that, so instead `__call__` was duplicated and awaited properly, so it “should” be fine. Credit goes to Arnulfo Solis for the PR.

3.0.0 (2018-03-29)

  • Dependency bump

  • Asyncified dynamodb Table Batch Writer + Tests

  • Added batch writer examples

  • Now using aiobotocore 0.6.0

2.2.0 (2018-01-24)

  • Dependency bump

2.1.0 (2018-01-23)

  • Dependency bump

  • Fixed a bug where extras weren’t packaged

2.0.0 (2017-12-30)

  • Patched most s3transfer functions

1.1.2 (2017-11-29)

  • Fixup of lingering GPL license texts

0.1.0 (2017-09-25)

  • First release on PyPI.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aioboto3-6.5.0.tar.gz (47.5 kB view details)

Uploaded Source

Built Distribution

aioboto3-6.5.0-py2.py3-none-any.whl (28.8 kB view details)

Uploaded Python 2 Python 3

File details

Details for the file aioboto3-6.5.0.tar.gz.

File metadata

  • Download URL: aioboto3-6.5.0.tar.gz
  • Upload date:
  • Size: 47.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.21.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.7.1

File hashes

Hashes for aioboto3-6.5.0.tar.gz
Algorithm Hash digest
SHA256 327075031f053c6712e2fe875265a0acabf96604f68f3f09e1ce8d65456cc80e
MD5 94ad828ac8c26d016162d20fd31d9927
BLAKE2b-256 eaac62b6edea75daa044e8633f622d7c0c43a1913aa54c1b78bdc2b7ae73c6ca


File details

Details for the file aioboto3-6.5.0-py2.py3-none-any.whl.

File metadata

  • Download URL: aioboto3-6.5.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 28.8 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.21.0 setuptools/45.2.0 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.7.1

File hashes

Hashes for aioboto3-6.5.0-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 3ca2b325b1f28906bd0b9311bcdd64b03e2bcc919fa97042be09cbb6e686c471
MD5 ac9575611e47c3e3ebc94fd3d30fc82e
BLAKE2b-256 3556d007b0459f448aa2f593dc9bbb6fe34593d7d7b1a449d207ef234f1fe5cb

