
Scrapy extension for outputting scraped items to an Amazon SQS instance

Project description



This is a Scrapy extension that exports scraped items to an Amazon SQS queue.


After installing the package, the two classes defined in the library need to be added to the relevant sections of the Scrapy settings file:

  FEED_EXPORTERS = {
      'sqs': 'sqsfeedexport.SQSExporter'
  }

  FEED_STORAGES = {
      'sqs': 'sqsfeedexport.SQSFeedStorage'
  }

The FEED_STORAGES entry uses a URI prefixed with the sqs:// scheme to differentiate it from other URI-based storage options.
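To illustrate the scheme-based dispatch, here is a minimal standard-library sketch (the helper name is hypothetical, not part of the library) of how a feed URI such as sqs://bar splits into the scheme Scrapy matches against FEED_STORAGES and the queue name:

```python
from urllib.parse import urlparse

def parse_feed_uri(feed_uri):
    """Split a feed URI like 'sqs://bar' into (scheme, queue_name).

    Scrapy selects the storage backend from FEED_STORAGES by matching
    the URI scheme, so 'sqs://...' URIs route to SQSFeedStorage.
    """
    parsed = urlparse(feed_uri)
    return parsed.scheme, parsed.netloc

scheme, queue = parse_feed_uri('sqs://bar')
print(scheme, queue)  # sqs bar
```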

Some keys also need to be defined in the environment:

  AWS_ACCESS_KEY_ID
  AWS_SECRET_ACCESS_KEY
  AWS_DEFAULT_REGION

AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are the AWS credentials to be used, and AWS_DEFAULT_REGION is the default region for the SQS queue. FEED_URI is the name of the SQS queue in the AWS_DEFAULT_REGION region. For example, with AWS_DEFAULT_REGION set to us-east-1:

  FEED_URI = 'sqs://bar'

would refer to a queue named bar in the us-east-1 region.
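As a sketch of how these environment variables might be consumed (the function name and the us-east-1 fallback are illustrative assumptions, not the library's actual code):

```python
import os

def aws_config_from_env():
    """Read the AWS settings the extension relies on from the environment.

    AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are required credentials;
    AWS_DEFAULT_REGION falls back to us-east-1 here when unset (an
    illustrative default, not necessarily the library's behaviour).
    """
    return {
        'access_key': os.environ['AWS_ACCESS_KEY_ID'],
        'secret_key': os.environ['AWS_SECRET_ACCESS_KEY'],
        'region': os.environ.get('AWS_DEFAULT_REGION', 'us-east-1'),
    }

# Placeholder values for demonstration only -- not real credentials.
os.environ.setdefault('AWS_ACCESS_KEY_ID', 'AKIA_EXAMPLE')
os.environ.setdefault('AWS_SECRET_ACCESS_KEY', 'example-secret')
config = aws_config_from_env()
print(config['region'])
```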

Finally, setting the FEED_FORMAT option to 'sqs' makes Scrapy spiders use the SQSExporter class.
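SQS message bodies are plain strings, so an exporter such as SQSExporter presumably serializes each scraped item before sending it. A minimal sketch of that serialization step, assuming JSON encoding (the function is illustrative, not the library's actual implementation):

```python
import json

def item_to_message_body(item):
    """Serialize a scraped item (a dict) into an SQS message body.

    SQS message bodies must be text, so the item is encoded as JSON;
    sort_keys keeps the output deterministic across runs.
    """
    return json.dumps(item, sort_keys=True)

body = item_to_message_body({'title': 'Example', 'price': 9.99})
print(body)  # {"price": 9.99, "title": "Example"}
```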

Project details

Download files


Files for scrapy-sqs-exporter, version 1.1.0

  Filename                                        Size    File type  Python version
  scrapy_sqs_exporter-1.1.0-py2.py3-none-any.whl  3.4 kB  Wheel      py2.py3
  scrapy-sqs-exporter-1.1.0.tar.gz                3.3 kB  Source     None
