
scrapy-rnet

A Scrapy download handler that routes all HTTP/HTTPS requests through rnet, giving your spiders browser-grade TLS and HTTP/2 fingerprints via BoringSSL impersonation.

Without it, Scrapy's stock Twisted-based HTTP stack produces a TLS fingerprint that is trivially identifiable as a bot. With scrapy-rnet, requests are indistinguishable from a real Chrome, Firefox, or Safari browser at the TLS and HTTP/2 layers.

Requirements

  • Python 3.13+
  • Scrapy 2.14+
  • rnet 2.4+

Installation

uv add scrapy-rnet
# or
pip install scrapy-rnet

Setup

Add to your Scrapy settings.py:

TWISTED_REACTOR = "twisted.internet.asyncioreactor.AsyncioSelectorReactor"

DOWNLOAD_HANDLERS = {
    "http":  "scrapy_rnet.RnetDownloadHandler",
    "https": "scrapy_rnet.RnetDownloadHandler",
}

That's it. All requests will now go through rnet, impersonating Chrome 131 by default.

Configuration

Each setting is read from Scrapy's settings at spider start:

  • RNET_IMPERSONATE (rnet.Impersonate, default Chrome131): Browser profile to impersonate
  • RNET_IMPERSONATE_OS (rnet.ImpersonateOS, default None): OS to pair with the browser profile
  • RNET_TIMEOUT (int, default 30): Request timeout in seconds
  • RNET_FOLLOW_REDIRECTS (bool, default False): Let rnet follow redirects (disables Scrapy's RedirectMiddleware for these requests)
  • RNET_VERIFY_SSL (bool, default True): Verify TLS certificates
  • RNET_PROXIES (list[rnet.Proxy], default None): Proxy list; takes precedence over Scrapy's proxy settings
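Putting several of these together, a settings.py might look like the following (the values are illustrative, not recommendations):

```python
# settings.py -- illustrative combination of the options above
import rnet

TWISTED_REACTOR = "twisted.internet.asyncioreactor.AsyncioSelectorReactor"

DOWNLOAD_HANDLERS = {
    "http":  "scrapy_rnet.RnetDownloadHandler",
    "https": "scrapy_rnet.RnetDownloadHandler",
}

# Impersonate Chrome on Windows instead of the bare Chrome 131 default
RNET_IMPERSONATE = rnet.Impersonate.Chrome131
RNET_IMPERSONATE_OS = rnet.ImpersonateOS.Windows

RNET_TIMEOUT = 60        # seconds
RNET_VERIFY_SSL = True   # the default; shown for completeness
```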

Choosing a browser profile

import rnet

# Chrome (default)
RNET_IMPERSONATE = rnet.Impersonate.Chrome131

# Firefox
RNET_IMPERSONATE = rnet.Impersonate.Firefox133

# Safari
RNET_IMPERSONATE = rnet.Impersonate.Safari18

# Pair with a specific OS fingerprint
RNET_IMPERSONATE    = rnet.Impersonate.Chrome131
RNET_IMPERSONATE_OS = rnet.ImpersonateOS.Windows

Full list of available profiles: rnet docs.

Proxies

import rnet

RNET_PROXIES = [rnet.Proxy.http("http://user:pass@proxy.example.com:8080")]
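Since RNET_PROXIES is a list, multiple proxies can be supplied. How each entry is applied follows rnet's Proxy semantics; the constructors below mirror the scheme-scoped proxies of reqwest-style clients and should be treated as an illustrative sketch:

```python
import rnet

# Illustrative: scope proxies by scheme of the outgoing request.
# Proxy.http covers plain-HTTP targets, Proxy.https covers HTTPS targets.
RNET_PROXIES = [
    rnet.Proxy.http("http://user:pass@proxy.example.com:8080"),
    rnet.Proxy.https("http://user:pass@proxy.example.com:8443"),
]
```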

How it works

RnetDownloadHandler implements Scrapy's download handler interface (download_request / close). When Scrapy resolves a request for an http or https URL, it calls download_request, which:

  1. Translates the Scrapy Request (method, URL, headers, body) into an rnet call
  2. Sends the request through rnet's BoringSSL-backed async client
  3. Converts the rnet response back into the appropriate Scrapy response subclass (HtmlResponse, TextResponse, etc.), preserving status, headers, body, IP address, and HTTP protocol version

A single rnet.Client instance is shared for the spider's lifetime, so connection pooling works as normal.
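Step 3 hinges on the response's Content-Type header: Scrapy ships a responsetypes registry for exactly this dispatch. The standalone sketch below mimics the idea with plain strings so it runs without Scrapy installed; the function name and mapping are illustrative, not scrapy-rnet's actual code:

```python
# Illustrative sketch of step 3: picking a Scrapy-style response class
# from the Content-Type header of the rnet response.
def pick_response_class(content_type: str) -> str:
    """Map a Content-Type header value to a response class name."""
    # Drop parameters such as "; charset=utf-8" and normalize case
    mime = content_type.split(";")[0].strip().lower()
    if mime in ("text/html", "application/xhtml+xml"):
        return "HtmlResponse"
    if mime in ("text/xml", "application/xml"):
        return "XmlResponse"
    if mime.startswith("text/") or mime == "application/json":
        return "TextResponse"
    return "Response"  # binary or unknown payloads

print(pick_response_class("text/html; charset=utf-8"))  # HtmlResponse
print(pick_response_class("application/json"))          # TextResponse
print(pick_response_class("image/png"))                 # Response
```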

Testing

# Unit tests only (no network required)
uv run pytest tests/ -m "not integration"

# All tests including real network calls
uv run pytest tests/

The integration suite includes a fingerprint verification test that hits tls.peet.ws and asserts the TLS/HTTP2 fingerprint matches Chrome 131.
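The -m "not integration" filter assumes an integration marker is registered with pytest. If it is not already declared, a typical registration in pyproject.toml looks like this (illustrative; the project may register it elsewhere, e.g. in a conftest.py):

```toml
[tool.pytest.ini_options]
markers = [
    "integration: tests that make real network calls",
]
```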

