
Helpers and examples to build Scrapy Crawlers in a test driven way.


Motivation / Why should I develop Scrapy Crawlers using TDD?

  1. The develop/test cycle shrinks to a few seconds, so you get a properly working scraper up much faster

  2. When bugs are discovered in “the wild” with real data, you can turn the offending pages into new example files, write a test, and verify the fix much faster

  3. It allows fast refactoring without breaking anything, which results in much cleaner scraper code

  4. It just feels right when you are used to doing TDD

What’s the difference from Scrapy’s Spiders Contracts?

Scrapy has its own built-in testing feature, named Spiders Contracts.

I tried to use them for some time, but decided to build real unit tests in a unit test framework like py.test because of these shortcomings:

  • its philosophy is geared towards testing against contracts (hence the name), which are by nature broad, unspecific checks. Testing for exact field contents in items is possible, but difficult and fragile

  • its documentation and feature set are a bit thin

  • it mixes contract descriptions into the implementation code, which only stays manageable with a few simple contracts
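For comparison, a Spiders Contract is written as directives in the callback’s docstring. The sketch below shows the directive syntax on a plain class so it runs without Scrapy installed; the spider name, URL, and field names are made up:

```python
# Illustration of Scrapy's contract directive syntax. In a real project
# this method would live on a scrapy.Spider subclass; a plain class is
# used here so the snippet runs without Scrapy installed. The spider
# name, URL, and field names are hypothetical.
class QuotesSpider:
    name = "quotes"

    def parse(self, response):
        """Parse a listing page.

        @url http://example.com/quotes
        @returns items 1 10
        @scrapes text author
        """
```

Running `scrapy check` executes such directives, but there is no natural place in them to assert exact field values, which is the limitation mentioned above.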


Installation

pip install scrapy_tdd

Quick Start Examples

    def describe_fancy_spider():

        to_test = MySpider().from_crawler(get_crawler())

        def describe_parse_suggested_terms():

            resp = response_from("Result_JSON_Widget.txt")
            results = to_test.parse(resp)

            def should_get_item():
                item = results
                assert item[0]["lorem"] == "ipsum"
                assert item[0]["iterem"] == "ipsem"

Full Documentation

… coming soon …

Missing / next steps

  • Mocking Request-Response pairs

How to contribute

… coming soon …

