
Scrapy HTTP POST items pipeline


scrapy-http-pipeline


Just a simple Scrapy HTTP pipeline to POST your items to your server.

Usage

Install

pip install scrapy-http-pipeline

Configure your Scrapy project's settings.py

ITEM_PIPELINES = {
    'scrapyhttppipeline.scrapyhttppipeline.HttpPostPipeline': 500
}

# URL of your server, which accepts POST requests (see the example endpoint below)
HTTP_POST_PIPELINE_URL = 'http://localhost:8080/items'

# Any custom headers you want to add, e.g. authentication
HTTP_POST_PIPELINE_HEADERS = {
    'X-Authorization': 'xxx'
}

# Buffer items and send them in batches to reduce the number of incoming HTTP POST requests.
# If True, items are sent as [{key1:val1},{key1:val1}] instead of one request per item {key1:val1}
HTTP_POST_PIPELINE_BUFFERED = False
HTTP_POST_PIPELINE_BUFFER_SIZE = 100
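
To try this end-to-end you need something listening at HTTP_POST_PIPELINE_URL. Below is a minimal sketch of such an endpoint using only Python's standard library. It assumes the pipeline serializes items as JSON, either one object per request or a list of objects when HTTP_POST_PIPELINE_BUFFERED is enabled (as the buffered example above suggests); the host, port and /items path are taken from the example URL and are not part of this package.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ItemHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != '/items':
            self.send_error(404)
            return
        # Custom headers configured via HTTP_POST_PIPELINE_HEADERS arrive here,
        # e.g. self.headers.get('X-Authorization') for a simple auth check.
        length = int(self.headers.get('Content-Length', 0))
        payload = json.loads(self.rfile.read(length))
        # Buffered mode delivers a list of items, unbuffered mode a single item.
        items = payload if isinstance(payload, list) else [payload]
        for item in items:
            print(item)  # replace with real storage logic
        self.send_response(200)
        self.end_headers()

if __name__ == '__main__':
    HTTPServer(('localhost', 8080), ItemHandler).serve_forever()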

Developing

Package requirements are handled using pip. To install them, run

pip install -r requirements.txt

Tests

Testing is set up using pytest and coverage is handled with the pytest-cov plugin.

Run your tests with py.test in the root directory.

Coverage is run by default and is configured in the pytest.ini file. To see an HTML coverage report, open htmlcov/index.html after running the tests.
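
Since the coverage settings live in pytest.ini, plain py.test is enough, but if you prefer to pass the pytest-cov options explicitly the equivalent invocation looks roughly like this (the package name is assumed from the ITEM_PIPELINES path above):

py.test --cov=scrapyhttppipeline --cov-report=html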

Travis CI

There is a .travis.yml file set up to run your tests on Python 2.7 and Python 3.3-3.7, should you choose to use it.

