
Distributed web crawling with browsers


“browser” | “crawler” = “brozzler”

Brozzler is a distributed web crawler that uses a real browser (Chrome or Chromium) to fetch pages and embedded URLs and to extract links. It employs yt-dlp (formerly youtube-dl) to enhance media capture and RethinkDB to manage crawl state.

Brozzler is designed to work in conjunction with warcprox for web archiving.

Requirements

  • Python 3.9 or later

  • RethinkDB deployment

  • Chromium or Google Chrome >= version 64

Note: The browser requires a graphical environment to run. When brozzler is run on a server, this may require deploying some additional infrastructure, typically X11. Xvnc4 and Xvfb are X11 variants that are suitable for use on a server, because they don’t display anything to a physical screen. The vagrant configuration in the brozzler repository has an example setup using Xvnc4. (When last tested, chromium on Xvfb did not support screenshots, so Xvnc4 is preferred at this time.)

Getting Started

The simplest way to get started with Brozzler is to use the brozzle-page command-line utility to crawl a single URL. For a more complex crawl, you can define a new job in a YAML file (see job-conf.rst) and start a local Brozzler worker.

Mac instructions:

# install and start rethinkdb
brew install rethinkdb
# no brew? try rethinkdb's installer: https://www.rethinkdb.com/docs/install/osx/
rethinkdb &>>rethinkdb.log &

# optional: create a virtualenv
python -m venv .venv

# install brozzler with rethinkdb extra
pip install brozzler[rethinkdb]

# crawl a single page
brozzle-page https://example.org

# or enqueue a job and start brozzler-worker
brozzler-new-job job1.yml
brozzler-worker

At this point Brozzler will start archiving your site.
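
The job1.yml referenced above can be quite minimal. Here is an illustrative sketch (only the id and seeds keys shown are drawn from the Job Configuration section below; see job-conf.rst for the full schema):

```yaml
# job1.yml -- a minimal brozzler job definition (illustrative)
id: job1
seeds:
  - url: https://example.org/
```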

Running Brozzler locally in this manner demonstrates the full Brozzler archival crawling workflow, but does not take advantage of Brozzler’s distributed nature.

Installation and Usage

To install brozzler only:

pip install brozzler  # in a virtualenv if desired

Launch one or more workers:

brozzler-worker --warcprox-auto

Submit jobs:

brozzler-new-job myjob.yaml

Submit sites not tied to a job:

brozzler-new-site --time-limit=600 https://example.org/

Job Configuration

Brozzler jobs are defined using YAML files. Options may be specified either at the top level or on individual seeds. At least one seed URL must be specified; everything else is optional. For details, see job-conf.rst.

id: myjob
time_limit: 60 # seconds
ignore_robots: false
warcprox_meta: null
metadata: {}
seeds:
  - url: https://one.example.org/
  - url: https://two.example.org/
    time_limit: 30
  - url: https://three.example.org/
    time_limit: 10
    ignore_robots: true
    scope:
      surt: https://(org,example,
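
To illustrate how seed-level options override top-level job options, here is a small Python sketch of the fallback rule. The `resolve` helper is hypothetical, written here only for illustration; it is not part of the brozzler API.

```python
# Illustrative sketch of brozzler-style option resolution: a setting on a
# seed overrides the job-level default. `resolve` is a hypothetical helper,
# not brozzler API.

job = {
    "id": "myjob",
    "time_limit": 60,        # job-level default, in seconds
    "ignore_robots": False,
    "seeds": [
        {"url": "https://one.example.org/"},
        {"url": "https://two.example.org/", "time_limit": 30},
        {"url": "https://three.example.org/", "time_limit": 10,
         "ignore_robots": True},
    ],
}

def resolve(seed, job, key, default=None):
    """Return the seed-level value if present, else the job-level value."""
    if key in seed:
        return seed[key]
    return job.get(key, default)

for seed in job["seeds"]:
    print(seed["url"],
          resolve(seed, job, "time_limit"),
          resolve(seed, job, "ignore_robots"))
```

With the job above, the first seed inherits the 60-second job-level limit, while the second and third use their own 30- and 10-second limits.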

Brozzler Dashboard

Brozzler comes with a rudimentary web application for viewing crawl job status. To install brozzler with the dependencies required to run this app, run

pip install brozzler[dashboard]

To start the app, run

brozzler-dashboard

At this point Brozzler Dashboard will be accessible at http://localhost:8000/.

[Screenshot: Brozzler Dashboard]

See brozzler-dashboard --help for configuration options.

License

Copyright 2015-2025 Internet Archive

Licensed under the Apache License, Version 2.0 (the “License”); you may not use this software except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an “AS IS” BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
