ScrapydWeb: Full-featured web UI for Scrapyd cluster management, Scrapy log analysis & visualization
==========================
[PyPI](https://pypi.org/project/scrapydweb/)
[Coverage](https://coveralls.io/github/my8100/scrapydweb?branch=master)
[License](https://github.com/my8100/scrapydweb/blob/master/LICENSE)
[Tweet](https://twitter.com/intent/tweet?text=ScrapydWeb:%20Full-featured%20web%20UI%20for%20Scrapyd%20cluster%20management,%20Scrapy%20log%20analysis%20%26%20visualization%20%23python%20%20%23scrapy%20%23scrapyd%20%23webscraping%20%23scrapydweb%20@my8100_%20&url=https%3A%2F%2Fgithub.com%2Fmy8100%2Fscrapydweb)
Features
---------------
- Scrapyd Cluster Management
  - Group, filter and select any number of nodes
  - Execute commands on multiple nodes with a single click
- Scrapy Log Analysis
  - Stats collection
  - Progress visualization
  - Logs categorization
- All Scrapyd API Supported (see the sketch below)
  - Deploy project, run spider, stop job
  - List projects/versions/spiders/running_jobs
  - Delete version/project
- Others
  - Auto eggifying
  - Basic auth for the web UI
  - Access Scrapyd servers protected by basic auth
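ScrapydWeb drives the standard Scrapyd JSON API (`schedule.json`, `listjobs.json`, `cancel.json`, and so on) on every selected node. The sketch below only illustrates those underlying endpoints with `requests` against a single assumed node; the project and spider names are placeholders, and this is not ScrapydWeb's own code.

```
# Sketch of the Scrapyd JSON API that ScrapydWeb wraps (illustrative only).
# Assumes a Scrapyd node at 127.0.0.1:6800 with a deployed project
# "myproject" containing a spider "myspider" (placeholder names).
import requests

SCRAPYD = "http://127.0.0.1:6800"

# Schedule a run and keep its job id.
resp = requests.post(f"{SCRAPYD}/schedule.json",
                     data={"project": "myproject", "spider": "myspider"})
job_id = resp.json()["jobid"]

# List pending/running/finished jobs of the project.
jobs = requests.get(f"{SCRAPYD}/listjobs.json",
                    params={"project": "myproject"}).json()
print(jobs["running"])

# Stop the job gracefully.
requests.post(f"{SCRAPYD}/cancel.json",
              data={"project": "myproject", "job": job_id})
```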
Developers
---------------
- [my8100](https://github.com/my8100)
- [simplety](https://github.com/simplety) (Front-End)
Installation
------------
To install ScrapydWeb, simply use pip:
```
pip install scrapydweb
```
Start Up
------------
1. Run `scrapydweb -h` for help. On first run, a config file named **scrapydweb_settings_vN.py** (N is a number) is copied to the current working directory; customize your settings there (see the sketch after this list).
2. Run `scrapydweb`.
3. Visit [http://127.0.0.1:5000](http://127.0.0.1:5000). **(Google Chrome is recommended for a better experience.)**
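What goes into that settings file depends on your cluster. As a rough, non-authoritative sketch (the `SCRAPYD_SERVERS` name and the accepted value formats follow the project's default_settings.py; the hosts, credentials and group label here are made up):

```
# scrapydweb_settings_vN.py -- illustrative values only.
# A plain "host:port" string, a "username:password@host:port#group" string,
# and a 5-element tuple are all accepted ways to declare a Scrapyd node.
SCRAPYD_SERVERS = [
    '127.0.0.1:6800',
    'username:password@192.168.0.101:6800#group1',
    ('username', 'password', '192.168.0.102', '6800', 'group1'),
]
```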
Settings
---------------
[default_settings.py](https://github.com/my8100/scrapydweb/blob/master/scrapydweb/default_settings.py)
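For instance, the options that put the web UI itself behind basic auth look roughly like the snippet below (setting names as in default_settings.py; the credentials are placeholders you should change):

```
# Enable HTTP basic auth for the ScrapydWeb UI (placeholder credentials).
ENABLE_AUTH = True
USERNAME = 'admin'
PASSWORD = 'change-me'
```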
Screenshots
------------
- Overview
- Dashboard
- Log Analysis
  - Stats collection
  - Progress visualization
  - Logs categorization
- Deploy a Project
- Run a Spider
- Manage Projects