ScrapydWeb: Full-featured web UI for monitoring and controlling Scrapyd servers
Feature Support
- Multinode Scrapyd servers
  - Group, filter and select any number of nodes
  - Execute commands on multiple nodes with one click
- Scrapy log analysis
  - Collect statistics
  - Show crawling progress with charts
  - Extract key logs
- All Scrapyd API supported
  - Deploy project, run spider, stop job
  - List projects/versions/spiders/running jobs
  - Delete version/project
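ScrapydWeb drives these operations through Scrapyd's plain HTTP JSON API. As a rough sketch of what the buttons above translate to, the helper below builds the corresponding request URLs (the endpoint names come from Scrapyd's API documentation; the host/port and project name are placeholders):

```python
# Sketch of the Scrapyd HTTP API that ScrapydWeb wraps. The host/port and
# project name below are placeholders, not values from this document.
from urllib.parse import urlencode

SCRAPYD = "http://127.0.0.1:6800"

def api_url(endpoint, **params):
    """Build the URL for a GET-style Scrapyd API call."""
    query = "?" + urlencode(params) if params else ""
    return f"{SCRAPYD}/{endpoint}{query}"

# "Manage Projects" boils down to calls like these:
print(api_url("listprojects.json"))
print(api_url("listspiders.json", project="myproject"))
print(api_url("listjobs.json", project="myproject"))
# "Run Spider" POSTs the same kind of parameters to /schedule.json,
# and "Stop job" POSTs project/job to /cancel.json.
```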
Maintainer
Installation
To install ScrapydWeb, simply use pip:
$ pip install scrapydweb
Start Up
Run "scrapydweb -h" for help. On first run, a config file named "scrapydweb_settings.py" is copied to the working directory; edit it to customize your settings.
$ scrapydweb
Visit http://127.0.0.1:5000
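A minimal config might look like the sketch below. The setting names shown (SCRAPYD_SERVERS, SCRAPYDWEB_BIND, SCRAPYDWEB_PORT) are assumptions based on typical ScrapydWeb configs; check the generated scrapydweb_settings.py for the authoritative names in your version.

```python
# scrapydweb_settings.py -- a minimal sketch; setting names are assumptions,
# verify them against the file ScrapydWeb copies into your working directory.

# The Scrapyd servers to monitor and control, as host:port strings.
SCRAPYD_SERVERS = [
    '127.0.0.1:6800',
]

# Address and port the ScrapydWeb UI itself listens on
# (matches the default http://127.0.0.1:5000 above).
SCRAPYDWEB_BIND = '0.0.0.0'
SCRAPYDWEB_PORT = 5000
```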
Screenshot
- Overview
- Dashboard
- Log Analysis
  - Statistics
  - Crawling progress
  - Key Logs
- Deploy Project
- Run Spider
- Manage Projects
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution
scrapydweb-0.9.2.tar.gz (573.6 kB)

Built Distribution
scrapydweb-0.9.2-py3-none-any.whl (596.9 kB)
Hashes for scrapydweb-0.9.2-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 6d43f63cd0ba3001dbb51a96855e6216d14ac3ec6ffd3bbc6b70ae32b1783ebe
MD5 | c996e3d6334c98546d857611047438b4
BLAKE2b-256 | eee3c1a344c85e5299562be9ab171b3f9e0204e7f26231bbd72287f185ecbbbf