AWS SQS queue consumer/publisher

Project description

py-sqs-queue

Simple Python AWS SQS queue consumer and publisher

Installation

pip install sqs_queue

Examples

from sqs_queue import Queue

my_queue = Queue('YOUR_QUEUE_NAME')
for message in my_queue:
    your_process_fn(message)

Or, if you'd like to leave unprocessable messages in the queue to be retried again later:

import logging

logger = logging.getLogger(__name__)

for message in my_queue:
    try:
        your_process_fn(message)
    except YourRetryableError:
        message.defer()  # leave the message in the queue to be retried later
    except Exception as e:
        logger.warning(e)

And, you can publish to the queue as well:

my_queue.publish({'MessageId': 123, 'Message': '{"foo": "bar"}'})

If you already have a boto3 queue resource, pass it instead of a name:

import boto3
from sqs_queue import Queue

queue_resource = boto3.resource('sqs').Queue('YOUR_QUEUE_NAME')

my_queue = Queue(queue=queue_resource)

Configuration

You can put your AWS credentials in environment variables or any of the other places boto3 looks.

Other parameters can be passed to the Queue() initializer, or set with environment variables prefixed with SQS_QUEUE_, e.g. SQS_QUEUE_POLL_WAIT.
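For example (illustrative; this assumes the environment variable is read when the Queue is initialized, so it must be set beforehand):

```python
import os

# Equivalent to passing Queue(..., poll_wait=10) — set before creating the Queue
os.environ['SQS_QUEUE_POLL_WAIT'] = '10'
```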

Parameters

poll_wait and poll_sleep

Behind the scenes, the generator polls SQS for new messages. When the queue is empty, each poll waits up to 20 seconds for new messages; if it times out before any arrive, the consumer sleeps for 40 seconds before polling again. Both intervals are configurable:

queue = Queue('YOUR_QUEUE_NAME', poll_wait=20, poll_sleep=40)

drain

Normally, once the queue is empty, the generator waits for more messages. If you just want to process all existing messages and quit, you can pass this boolean parameter:

queue = Queue('YOUR_QUEUE_NAME', drain=True)

For example, if your queue is long and your consumers are falling behind, you can start a bunch of consumers with drain=True and they'll quit when you've caught up.
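The interaction of poll_wait, poll_sleep, and drain can be pictured as a loop like the following. This is an illustrative sketch, not the library's actual implementation; receive and handle are hypothetical stand-ins for the SQS long-poll call and your processing function:

```python
import time

def consume(receive, handle, poll_wait=20, poll_sleep=40, drain=False):
    """Sketch of the consumer loop: long-poll, back off when empty,
    or quit immediately on empty if draining."""
    while True:
        messages = receive(wait_seconds=poll_wait)  # long poll up to poll_wait seconds
        if not messages:
            if drain:
                return  # queue is empty and drain=True: stop consuming
            time.sleep(poll_sleep)  # wait before polling again
            continue
        for message in messages:
            handle(message)

# Usage with an in-memory stand-in for SQS:
pending = [['m1', 'm2'], ['m3']]

def receive(wait_seconds):
    return pending.pop(0) if pending else []

handled = []
consume(receive, handled.append, drain=True)
# handled == ['m1', 'm2', 'm3']
```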

sns

If your SQS queue is fed from an SNS topic, you can pass this boolean parameter and your messages will contain just the SNS notification data, so you don't have to fish it out of the SQS message and decode it:

queue = Queue('YOUR_QUEUE_NAME', sns=True)

When you use this option, an sns_message_id field is added to the notification data; you can use it to ensure each message is processed only once.
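A minimal in-memory deduplication sketch, keyed on that field (assumes each message is a dict carrying sns_message_id, as with sns=True; plain dicts stand in for real queue messages here):

```python
def process_once(messages, handler):
    """Process each SNS-fed message at most once, keyed by sns_message_id."""
    seen = set()
    for message in messages:
        msg_id = message['sns_message_id']
        if msg_id in seen:
            continue  # duplicate delivery; skip it
        seen.add(msg_id)
        handler(message)

# Simulated notifications, including a duplicate delivery of 'a'
notifications = [
    {'sns_message_id': 'a', 'body': 'first'},
    {'sns_message_id': 'a', 'body': 'first'},
    {'sns_message_id': 'b', 'body': 'second'},
]
handled = []
process_once(notifications, handled.append)
# handled holds one message per unique sns_message_id
```

In production you would want persistent storage for the seen-IDs set, since an in-memory set is lost on restart.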

create

If you pass create=True and no SQS queue with the given name exists, a queue with that name will be created.

bulk_queue

You can pass another Queue through this option; it will be checked only when the primary "priority" queue is empty. For example:

from sqs_queue import Queue

bulk = Queue(
    queue_name='bulk',
    create=True,
    poll_wait=2
)

primary = Queue(
    queue_name='primary',
    bulk_queue=bulk,
    drain=True,
    create=True,
    poll_wait=2
)

primary.publish('{"type": "priority", "id": 1}')
bulk.publish('{"type": "bulk", "id": 1}')
bulk.publish('{"type": "bulk", "id": 2}')

for msg in primary:
    print(msg)

# Output:
# {'type': 'priority', 'id': 1}
# {'type': 'bulk', 'id': 1}
# {'type': 'bulk', 'id': 2}

bulk_queue_check_pct

When using bulk_queue, the bulk queue is normally only checked when the primary queue is empty. With bulk_queue_check_pct, you can also randomly check the bulk queue after a percentage of non-empty primary queue polls:

primary = Queue(
    queue_name='primary',
    bulk_queue=bulk,
    bulk_queue_check_pct=25
)

This will check the bulk queue after approximately 25% of the primary-queue polls that return messages, which keeps bulk messages from being starved while the primary queue stays busy.
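The decision can be pictured as a simple random draw per poll (an illustrative sketch, not the library's actual code; should_check_bulk is a hypothetical name):

```python
import random

def should_check_bulk(check_pct):
    """After a non-empty primary poll, decide randomly whether to also
    poll the bulk queue: True roughly check_pct percent of the time."""
    return random.random() * 100 < check_pct
```

Over many polls, roughly check_pct percent of the draws come back True, so bulk messages keep flowing even when the primary queue never empties.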
