
BigSQS

An SQS client capable of storing oversize message payloads on S3.

Overview

AWS SQS is a super useful message queue, but sometimes we need to transmit messages larger than its 256 KiB size limit. An official SQS extended client library is available for Java, but not for Python. Similar libraries implementing the protocol used by the original Java library are available for Python, but this library has a few additional features:

  • Fully transparent response structure - MD5 hashes (MD5OfBody) and the content-length header are recomputed client-side to be correct for the message after resolution of S3 pointers.
  • Unopinionated configuration - The library can use your default (environment) AWS creds (useful for deployment in Lambda functions), take your AWS creds as parameters, and even supports using two different credential sets for SQS and S3, even if these belong to different AWS accounts.
  • Leaves boto3 untouched - This library does not attempt to reconfigure/decorate boto3 with additional functionality.
  • Fully documented - The library is fully documented with docstrings, making for an enjoyable development experience.
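The pointer mechanism behind the first two bullets can be sketched in plain Python. Note this is an illustrative sketch only: the pointer JSON shape (`s3_bucket_name`/`s3_key`) and the helper names are assumptions, not the library's actual wire format, and the S3 upload itself is omitted:

```python
import hashlib
import json

S3_THRESHOLD = 1024  # bytes; payloads larger than this are offloaded to S3


def wrap_payload(payload: str, bucket: str, key: str) -> str:
    """Return the payload unchanged if it fits, else a small S3 pointer.

    In a real extended client the oversize payload would be uploaded to
    S3 before the pointer is sent to SQS; that step is omitted here.
    """
    if len(payload.encode("utf-8")) <= S3_THRESHOLD:
        return payload
    return json.dumps({"s3_bucket_name": bucket, "s3_key": key})


def md5_of_body(body: str) -> str:
    """Recompute MD5OfBody client-side, as done after pointer resolution."""
    return hashlib.md5(body.encode("utf-8")).hexdigest()
```

Because the MD5 is recomputed over the resolved body rather than the pointer, the response a caller sees stays internally consistent.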

Installing

Installing the project is very straightforward via pip:

pip install big-sqs

You can then import the library into your project:

from big_sqs import BigSqsClient

Building

Building the library is only necessary if you're doing development work on it, and is not required if you're just importing it to use in your project. To build the library, you'll need to install the Poetry dependency management system for Python. Build the project like so:

poetry build

Usage

Use the library like so:

from big_sqs import BigSqsClient

# Initialize client.
sqs = BigSqsClient.from_default_aws_creds(
    '<my_queue_url>',
    '<my_s3_bucket_name>',
    1024, # For any messages bigger than 1KiB, use S3.
)

# Create 2KiB message.
PAYLOAD_SIZE = 2048
payload = '0' * PAYLOAD_SIZE

# Send message.
sqs.send_message(payload)

# Receive that same message.
dequeued = sqs.receive_messages(1)

# Print the response we got back (the resolved payload is in the message body).
print(dequeued)

# Delete messages (S3 objects will also be cleaned up).
for message in dequeued['Messages']:
    sqs.delete_message(message['ReceiptHandle'])

Configuration

You can configure the library with your AWS credentials in three ways:

Using Default (Environment) Creds

To use the default AWS credentials configured for your environment (if any) you can use the from_default_aws_creds static factory method:

from big_sqs import BigSqsClient

# Initialize client.
sqs = BigSqsClient.from_default_aws_creds(
    '<my_queue_url>',
    '<my_s3_bucket_name>',
    1024, # For any messages bigger than 1KiB, use S3.
)

User-Specified Creds

To make use of user-specified AWS credentials, there's a different factory method for you to use:

from big_sqs import BigSqsClient

# Initialize client.
sqs = BigSqsClient.from_aws_creds(
    'us-west-2',
    '<my_aws_access_key_id>',
    '<my_aws_secret_access_key>',
    '<my_queue_url>',
    '<my_s3_bucket_name>',
    1024, # For any messages bigger than 1KiB, use S3.
)

User-Specified Clients

To use a different set of credentials for SQS and S3, or to use different AWS accounts for each, you can supply boto3 clients directly to the BigSqsClient constructor.

import boto3

from big_sqs import BigSqsClient

# Initialize client.
sqs = BigSqsClient(
    boto3.client(
        'sqs',
        region_name='us-west-2',
        aws_access_key_id='<my_us_aws_access_key_id>',
        aws_secret_access_key='<my_us_aws_secret_access_key>',
    ),
    boto3.client(
        's3',
        region_name='eu-west-2',
        aws_access_key_id='<my_eu_aws_access_key_id>',
        aws_secret_access_key='<my_eu_aws_secret_access_key>',
    ),
    '<my_queue_url>',
    '<my_s3_bucket_name>',
    1024, # For any messages bigger than 1KiB, use S3.
)

Acknowledgements

The authors acknowledge the contribution of the following projects to this library.

Contributors

The main contributors to this project so far are as follows:
