
Project description


ScrapydWeb: A full-featured web UI for Scrapyd cluster management, with Scrapy log analysis & visualization supported.



[Diagram: how Scrapyd, ScrapydWeb, and LogParser fit together]

Recommended Reading

How to efficiently manage your distributed web scraping projects


  • Scrapyd Cluster Management

    • All Scrapyd JSON API Supported
    • Group, filter and select any number of nodes
    • Run a command on multiple nodes with just a few clicks
  • Scrapy Log Analysis

    • Stats collection
    • Progress visualization
    • Logs categorization
  • Enhancements

    • Auto eggify your projects
    • Integrated with LogParser
    • Email notice
    • Mobile UI
    • Basic auth for web UI
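The multinode commands above are built on Scrapyd's JSON API, which every node exposes over HTTP. As a minimal sketch (not ScrapydWeb's actual implementation), the same endpoint can be queried across a list of nodes; the node addresses below are placeholders:

```python
# Query one Scrapyd JSON API endpoint on several nodes, in the spirit of
# ScrapydWeb's multinode commands. daemonstatus.json is a real Scrapyd
# endpoint; the node addresses are placeholders.
import json
from urllib.request import urlopen

def api_url(node, endpoint):
    """Build the API URL for a node given as 'host:port'."""
    return "http://%s/%s" % (node, endpoint)

def check_nodes(nodes, endpoint="daemonstatus.json", timeout=5):
    """Return {node: parsed JSON response, or an error string}."""
    results = {}
    for node in nodes:
        try:
            with urlopen(api_url(node, endpoint), timeout=timeout) as resp:
                results[node] = json.load(resp)
        except OSError as exc:  # URLError subclasses OSError
            results[node] = "unreachable: %s" % exc
    return results
```

A UI like ScrapydWeb essentially fans such requests out to the selected group of nodes and aggregates the responses.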
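To illustrate what stats collection over a Scrapy log involves (LogParser does this in full), here is a minimal sketch that counts crawled pages per HTTP status and scraped items; the log lines follow Scrapy's default log format:

```python
# Minimal sketch of Scrapy log stats collection: count crawled pages by
# HTTP status and count scraped items from a log's text.
import re

CRAWLED = re.compile(r"\bCrawled \((\d{3})\)")
SCRAPED = re.compile(r"\bScraped from <")

def collect_stats(log_text):
    """Return page counts per HTTP status and the total scraped items."""
    pages = {}
    for status in CRAWLED.findall(log_text):
        pages[status] = pages.get(status, 0) + 1
    return {"pages": pages, "items": len(SCRAPED.findall(log_text))}

sample = """\
2019-01-01 00:00:01 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://example.com> (referer: None)
2019-01-01 00:00:02 [scrapy.core.scraper] DEBUG: Scraped from <200 http://example.com>
2019-01-01 00:00:03 [scrapy.core.engine] DEBUG: Crawled (404) <GET http://example.com/missing> (referer: None)
"""
print(collect_stats(sample))  # {'pages': {'200': 1, '404': 1}, 'items': 1}
```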


Getting Started



Make sure that Scrapyd has been installed and started on all of your hosts.

Note that for remote access, you have to set bind_address in the Scrapyd configuration file so that it listens on an externally visible address, then restart Scrapyd.
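For reference, Scrapyd reads an INI-style configuration file; a minimal fragment with the usual remote-access settings from Scrapyd's documentation looks like this (the file location varies by install):

```ini
[scrapyd]
# Default is 127.0.0.1, which only accepts local connections;
# 0.0.0.0 listens on all interfaces so ScrapydWeb can reach it remotely.
bind_address = 0.0.0.0
http_port    = 6800
```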


  • Use pip:
pip install scrapydweb
  • Use git:
git clone <repository URL>
cd scrapydweb
python setup.py install


  1. Start ScrapydWeb via the command scrapydweb. (A config file for customizing settings is generated at first startup.)
  2. Visit the web UI in your browser. (Google Chrome is recommended for a better experience.)
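As an illustration, your Scrapyd nodes are declared in the generated settings file via the SCRAPYD_SERVERS option (the generated file documents all options in its comments; the values below are placeholders):

```python
# Illustrative fragment of a ScrapydWeb settings file.
SCRAPYD_SERVERS = [
    '127.0.0.1:6800',                           # plain host:port
    # 'username:password@localhost:6801#group', # with basic auth and a group
]
```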

Browser Support

The latest version of Google Chrome, Firefox, and Safari.

Running the tests

$ git clone
$ cd scrapydweb

# To create isolated Python environments
$ pip install virtualenv
$ virtualenv venv/scrapydweb
# Or specify your Python interpreter: $ virtualenv -p /usr/local/bin/python3.7 venv/scrapydweb
$ source venv/scrapydweb/bin/activate

# Install dependent libraries
(scrapydweb) $ python setup.py install
(scrapydweb) $ pip install pytest
(scrapydweb) $ pip install coverage

# Make sure Scrapyd has been installed and started, then update the custom_settings item in tests/
(scrapydweb) $ vi tests/
(scrapydweb) $ curl

(scrapydweb) $ coverage run --source=scrapydweb -m pytest tests -s -vv
(scrapydweb) $ coverage report
# To create an HTML report, check out htmlcov/index.html
(scrapydweb) $ coverage html

Built With



Detailed changes for each release are documented in the release history.






This project is licensed under the GNU General Public License v3.0 - see the LICENSE file for details.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Filename                            Size      File type  Python version
scrapydweb-1.1.0-py3-none-any.whl   624.7 kB  Wheel      py3
scrapydweb-1.1.0.tar.gz             575.0 kB  Source     None
