Scrapy spider for TW Rental House

TW Rental House Utility for Scrapy

This package is built for crawling Taiwanese rental-house websites using Scrapy. Since crawlers differ in goal, scale, and pipeline, this package provides only a minimum feature set, which allows developers to list and decode rental house web pages into structured data without knowing much about the detailed HTML and API structure of each website. In addition, this package is designed for extensibility, allowing developers to insert customized callbacks, manipulate data, and integrate with an existing crawler structure.

Although this package provides the ability to crawl rental house websites, it is the developer's responsibility to ensure that the crawling mechanism and the usage of the data are appropriate. Please be friendly to the target website; for example, consider using DOWNLOAD_DELAY or the AutoThrottle extension to avoid bulk requesting.
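As a concrete sketch, a polite configuration could go in the project's settings.py; DOWNLOAD_DELAY and the AUTOTHROTTLE_* options are standard Scrapy settings, and the values below are only illustrative:

```python
# settings.py -- throttle requests to be friendly to the target site.
DOWNLOAD_DELAY = 2            # minimum seconds between requests to the same site
AUTOTHROTTLE_ENABLED = True   # let Scrapy adapt the delay to server latency
AUTOTHROTTLE_START_DELAY = 1  # initial download delay, in seconds
AUTOTHROTTLE_MAX_DELAY = 10   # upper bound when the server is slow
```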

Requirement

  1. Python 3.10+

Installation

poetry add scrapy-tw-rental-house

Basic Usage

This package currently supports 591. Each rental house website corresponds to a Scrapy Spider class. You can either crawl the entire website using the default settings, which will take a couple of days, or customize the behaviour based on your needs.

The most basic usage is to create a new Spider class that inherits from Rental591Spider:

from scrapy_twrh.spiders.rental591 import Rental591Spider

class MyAwesomeSpider(Rental591Spider):
    name = 'awesome'

And then start crawling with:

scrapy crawl awesome

Please see the example for detailed usage.

Items

All spiders populate two types of Scrapy items: GenericHouseItem and RawHouseItem.

GenericHouseItem contains normalized data fields; spiders for different websites decode their data and fit it into this schema on a best-effort basis.

RawHouseItem contains unnormalized data fields, which keep the original and structured data on a best-effort basis.

Note that both items are supersets of their schema. It is the developer's responsibility to check which fields are provided when receiving an item. For example, in Rental591Spider, for a single rental house, Scrapy will get:

  1. 1x RawHouseItem + 1x GenericHouseItem while listing all houses, which provides only the minimum data fields for GenericHouseItem
  2. 1x RawHouseItem + 1x GenericHouseItem while retrieving house details.
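Since only some fields are populated at each stage, downstream code should test for a field before reading it. Scrapy items support dict-style access, so the sketch below uses a plain dict to stay self-contained; the field names are hypothetical placeholders, not the package's actual schema:

```python
def summarize(item):
    """Return only the fields the spider actually populated.

    `item` may be a Scrapy item or any dict-like object; absent keys
    mean the spider did not provide that field at this stage.
    """
    wanted = ('vendor_house_id', 'monthly_price')  # hypothetical field names
    return {key: item[key] for key in wanted if key in item}

# A listing-stage item typically carries fewer fields than a detail-stage one.
listing_item = {'vendor_house_id': '12345'}
print(summarize(listing_item))  # → {'vendor_house_id': '12345'}
```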

Handlers

All spiders in this package provide the following handlers:

  1. start_list, similar to start_requests in Scrapy, controls how the crawler issues search/list requests to find all rental houses.
  2. parse_list, similar to parse in Scrapy, controls how the crawler handles responses from start_list and generates requests for detail house info pages.
  3. parse_detail, controls how the crawler parses the detail page.

All spiders implement their own default handlers, namely default_start_list, default_parse_list, and default_parse_detail, which can be overridden during __init__. Please see the example for how to control spider behavior using handlers.
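One plausible pattern, assuming the handlers can be replaced via __init__ keyword arguments named after the handlers themselves (check the example project for the actual interface), is to wrap a default handler with a custom one:

```python
from scrapy_twrh.spiders.rental591 import Rental591Spider

class MyCustomSpider(Rental591Spider):
    name = 'awesome-custom'

    def __init__(self, **kwargs):
        # Assumption: the handler is injected as an __init__ keyword
        # argument named after the handler itself.
        super().__init__(parse_detail=self.my_parse_detail, **kwargs)

    def my_parse_detail(self, response):
        # Delegate to the built-in handler, then post-process
        # whatever items or requests it yields.
        for item in self.default_parse_detail(response):
            yield item
```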
