
Scrapy download handler that routes requests through rnet for browser-grade TLS/HTTP2 fingerprinting

Project description

scrapy-rnet

A Scrapy download handler that routes all HTTP/HTTPS requests through rnet, giving your spiders browser-grade TLS and HTTP/2 fingerprints via BoringSSL impersonation.

Without this, Scrapy sends requests through its default Twisted-based HTTP stack, whose TLS/HTTP fingerprint is trivially identifiable as a bot. With scrapy-rnet, requests look indistinguishable from a real Chrome, Firefox, or Safari browser at the TLS and HTTP/2 layer.

Requirements

  • Python 3.13+
  • Scrapy 2.14+
  • rnet 2.4+

Installation

uv add scrapy-rnet
# or
pip install scrapy-rnet

Setup

Add to your Scrapy settings.py:

TWISTED_REACTOR = "twisted.internet.asyncioreactor.AsyncioSelectorReactor"

DOWNLOAD_HANDLERS = {
    "http":  "scrapy_rnet.RnetDownloadHandler",
    "https": "scrapy_rnet.RnetDownloadHandler",
}

That's it. All requests will now go through rnet, impersonating Chrome 131 by default.

Configuration

Setting                 Type                Default    Description
RNET_IMPERSONATE        rnet.Impersonate    Chrome131  Browser profile to impersonate
RNET_IMPERSONATE_OS     rnet.ImpersonateOS  None       OS to pair with the browser profile
RNET_TIMEOUT            int                 30         Request timeout in seconds
RNET_FOLLOW_REDIRECTS   bool                False      Let rnet follow redirects (disables Scrapy's RedirectMiddleware for these requests)
RNET_VERIFY_SSL         bool                True       Verify TLS certificates
RNET_PROXIES            list[rnet.Proxy]    None       Proxy list; takes precedence over Scrapy's proxy settings
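Putting the options together, a settings.py might look like the sketch below. The values are illustrative choices, not recommendations; the enum members used are the ones shown elsewhere in this README.

```python
# settings.py -- example scrapy-rnet configuration (illustrative values)
import rnet

TWISTED_REACTOR = "twisted.internet.asyncioreactor.AsyncioSelectorReactor"

DOWNLOAD_HANDLERS = {
    "http":  "scrapy_rnet.RnetDownloadHandler",
    "https": "scrapy_rnet.RnetDownloadHandler",
}

RNET_IMPERSONATE = rnet.Impersonate.Firefox133    # browser profile
RNET_IMPERSONATE_OS = rnet.ImpersonateOS.Windows  # pair with an OS fingerprint
RNET_TIMEOUT = 15                                 # seconds
RNET_FOLLOW_REDIRECTS = False                     # keep Scrapy's RedirectMiddleware in charge
RNET_VERIFY_SSL = True
```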

Choosing a browser profile

import rnet

# Chrome (default)
RNET_IMPERSONATE = rnet.Impersonate.Chrome131

# Firefox
RNET_IMPERSONATE = rnet.Impersonate.Firefox133

# Safari
RNET_IMPERSONATE = rnet.Impersonate.Safari18

# Pair with a specific OS fingerprint
RNET_IMPERSONATE    = rnet.Impersonate.Chrome131
RNET_IMPERSONATE_OS = rnet.ImpersonateOS.Windows

Full list of available profiles: rnet docs.

Proxies

import rnet

RNET_PROXIES = [rnet.Proxy.http("http://user:pass@proxy.example.com:8080")]
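Since RNET_PROXIES is a list, multiple proxies can be supplied. The sketch below assumes rnet also exposes a Proxy.https constructor scoped to HTTPS traffic, by analogy with Proxy.http; verify the exact constructor names against the rnet docs.

```python
import rnet

RNET_PROXIES = [
    # Route plain-HTTP traffic through one proxy...
    rnet.Proxy.http("http://user:pass@proxy.example.com:8080"),
    # ...and HTTPS traffic through another (Proxy.https is an assumption
    # here -- check the rnet docs for the supported constructors).
    rnet.Proxy.https("http://user:pass@proxy2.example.com:8080"),
]
```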

How it works

RnetDownloadHandler implements Scrapy's download handler interface (download_request / close). When Scrapy resolves a request for an http or https URL, it calls download_request, which:

  1. Translates the Scrapy Request (method, URL, headers, body) into an rnet call
  2. Sends the request through rnet's BoringSSL-backed async client
  3. Converts the rnet response back into the appropriate Scrapy response subclass (HtmlResponse, TextResponse, etc.), preserving status, headers, body, IP address, and HTTP protocol version

A single rnet.Client instance is shared for the spider's lifetime, so connection pooling works as normal.
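The class-selection part of step 3 is handled by Scrapy's own machinery in practice; as a rough, self-contained illustration of the idea (not scrapy-rnet's actual code), a handler might map the response's Content-Type header to a response class like this:

```python
# Illustrative sketch only: picking a Scrapy response subclass from the
# Content-Type header. The real handler delegates this to Scrapy; this
# stdlib-only version just shows the shape of the mapping.

CONTENT_TYPE_TO_RESPONSE = {
    "text/html": "HtmlResponse",
    "application/xhtml+xml": "HtmlResponse",
    "application/xml": "XmlResponse",
    "text/xml": "XmlResponse",
    "application/json": "TextResponse",
    "text/plain": "TextResponse",
}

def response_class_for(content_type: str) -> str:
    """Return the Scrapy response class name for a Content-Type value.

    Parameters such as "; charset=utf-8" are stripped before the lookup;
    unknown types fall back to the generic Response.
    """
    mime = content_type.split(";")[0].strip().lower()
    return CONTENT_TYPE_TO_RESPONSE.get(mime, "Response")
```

For example, response_class_for("text/html; charset=utf-8") yields "HtmlResponse", while an unrecognized type such as application/octet-stream falls back to "Response".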

Testing

# Unit tests only (no network required)
uv run pytest tests/ -m "not integration"

# All tests including real network calls
uv run pytest tests/

The integration suite includes a fingerprint verification test that hits tls.peet.ws and asserts the TLS/HTTP2 fingerprint matches Chrome 131.

Download files

Download the file for your platform.

Source Distribution

scrapy_impersonate_rnet-0.1.1.tar.gz (41.6 kB)


Built Distribution


scrapy_impersonate_rnet-0.1.1-py3-none-any.whl (5.2 kB)


File details

Details for the file scrapy_impersonate_rnet-0.1.1.tar.gz.

File metadata

File hashes

Hashes for scrapy_impersonate_rnet-0.1.1.tar.gz
Algorithm Hash digest
SHA256 82cce788c0a12b79ad83f64ee04cd36dcbbe19bf5abf8979595bf6c6066e595b
MD5 9da062c42e14706ff749bef904457715
BLAKE2b-256 2f01bf385dcfad49c0efc66ad464416396897286db217ca599ec62ee7be238df


File details

Details for the file scrapy_impersonate_rnet-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for scrapy_impersonate_rnet-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 b0d318d5ed726932a32bf8dfd4c1569cba7157748fe6d239609355dfa1e3d115
MD5 96dcfe3340f4cf5bae84c7074c4299e8
BLAKE2b-256 278ae5021f63ba97d34ecc145161de9da5b972f7ef66cca2fac7a6d3888fcac7

