Ecoindex-Cli
`ecoindex-cli` is a CLI tool that lets you run Ecoindex tests on given web pages.
This tool provides an easy way to analyze websites with Ecoindex from your local computer. You can:
- Make the analysis on multiple pages
- Define multiple screen resolutions
- Make a recursive analysis from a given website
This CLI is built on top of ecoindex-python with Typer.
The output is always a CSV file with the results of the analysis.
Current limitation: this does not work well with single-page applications (SPAs).
Requirements
- Python ^3.8
- pip
Setup
➜ pip install --user -U ecoindex-cli
Use case
The CLI provides two commands, `analyze` and `report`, which can be used separately:
➜ ecoindex-cli --help Usage: ecoindex-cli [OPTIONS] COMMAND [ARGS]...
Ecoindex cli to make analysis of webpages
Options:
--install-completion [bash|zsh|fish|powershell|pwsh]
Install completion for the specified shell.
--show-completion [bash|zsh|fish|powershell|pwsh]
Show completion for the specified shell, to
copy it or customize the installation.
--help Show this message and exit.
Commands:
analyze Make an ecoindex analysis of given webpages or website.
report If you already performed an ecoindex analysis and have your...
Make a simple analysis
You just give one web URL:
➜ ecoindex-cli analyze --url http://www.ecoindex.fr
There are 1 url(s), do you want to process? [Y/n]:
1 urls for 1 window size
Processing [####################################] 100%
🙌️ File /tmp/ecoindex-cli/output/www.ecoindex.fr/2021-04-20 16:44:33.468755/results.csv written !
This runs an analysis with a default screen resolution of 1920x1080px.
Set the output file
You can define the CSV output file:
➜ ecoindex-cli analyze --url http://www.ecoindex.fr --output-file ~/ecoindex-results/ecoindex.csv
📁️ Urls recorded in file `input/www.ecoindex.fr.csv`
There are 1 url(s), do you want to process? [Y/n]:
1 urls for 1 window size
Processing [####################################] 100%
🙌️ File /home/vvatelot/ecoindex-results/ecoindex.csv written !
Multiple url analysis
➜ ecoindex-cli analyze --url http://www.ecoindex.fr --url https://www.greenit.fr/
There are 2 url(s), do you want to process? [Y/n]:
2 urls for 1 window size
Processing [####################################] 100%
🙌️ File /tmp/ecoindex-cli/output/www.ecoindex.fr/2021-04-20 16:45:24.458052/results.csv written !
Provide urls from a file
You can use a file with the URLs that you want to analyze, one URL per line. This is helpful if you want to replay the same scenario regularly.
➜ ecoindex-cli analyze --urls-file input/ecoindex.csv
There are 2 url(s), do you want to process? [Y/n]:
2 urls for 1 window size
Processing [####################################] 100%
🙌️ File /tmp/ecoindex-cli/output/www.ecoindex.fr/2021-04-20 16:45:24.458052/results.csv written !
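For reference, such a urls file is just plain text with one URL per line. A minimal sketch, reusing the file path and URLs from the examples above:

```shell
# Create a sample urls file: one URL per line (sketch)
mkdir -p input
cat > input/ecoindex.csv <<'EOF'
http://www.ecoindex.fr
https://www.greenit.fr/
EOF
```

You would then pass it with `--urls-file input/ecoindex.csv` as shown above.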
Make a recursive analysis
You can make a recursive analysis of a given website. This means that the app will try to find all the pages of your website and launch an analysis on each of them. ⚠️ This can run for a very long time! Use it at your own risk!
➜ ecoindex-cli analyze --url http://www.ecoindex.fr --recursive
⏲️ Crawling root url http://www.ecoindex.fr -> Wait a minute !
📁️ Urls recorded in file `/tmp/ecoindex-cli/input/www.ecoindex.fr.csv`
There are 3 url(s), do you want to process? [Y/n]:
3 urls for 1 window size
Processing [####################################] 100%
🙌️ File /tmp/ecoindex-cli/output/www.ecoindex.fr/2021-04-20 16:47:29.072472/results.csv written !
Set other screen resolutions
You can provide other screen resolutions. By default, the screen resolution is 1920x1080px, but you can provide other resolutions, for example if you want to test Ecoindex on mobile.
➜ ecoindex-cli analyze --url http://www.ecoindex.fr --window-size 1920,1080 --window-size 386,540
There are 1 url(s), do you want to process? [Y/n]:
1 urls for 2 window size
Processing [####################################] 100%
🙌️ File /tmp/ecoindex-cli/output/www.ecoindex.fr/2021-04-21 21:22:44.309077/results.csv written !
Generate an HTML report
You can easily generate an HTML report at the end of the analysis. You just have to add the option `--html-report`.
➜ ecoindex-cli analyze --url http://www.ecoindex.fr --recursive --html-report
⏲️ Crawling root url http://www.ecoindex.fr -> Wait a minute !
📁️ Urls recorded in file `input/www.ecoindex.fr.csv`
There are 3 url(s), do you want to process? [Y/n]:
3 urls for 1 window size
Processing [####################################] 100%
🙌️ File output/www.ecoindex.fr/2021-04-21 21:21:27.629691/results.csv written !
🦄️ Amazing! A report has been generated to `/tmp/ecoindex-cli/output/www.ecoindex.fr/2021-04-21 21:21:27.629691/report.html`
Only generate a report from existing result file
If you already performed an analysis and, for example, forgot to generate the HTML report, you do not need to re-run a full analysis; you can simply generate a report from your result file:
➜ ecoindex-cli report "/tmp/ecoindex-cli/output/www.ecoindex.fr/2021-05-06 19:13:55.735935/results.csv" "www.synchrone.fr"
🦄️ Amazing! A report has been generated to `/tmp/ecoindex-cli/output/www.ecoindex.fr/2021-05-06 19:13:55.735935/report.html`
Results example
The result of the analysis is a CSV file which can be easily used for further analysis:
size,nodes,requests,grade,score,ges,water,url,date,resolution,page_type
119.095,45,8,A,89,1.22,1.83,http://www.ecoindex.fr,2021-04-20 16:45:28.570179,"1920,1080",
769.252,730,94,D,41,2.18,3.27,https://www.greenit.fr/,2021-04-20 16:45:32.199242,"1920,1080",website
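Since the output is plain CSV, it can be post-processed with standard command-line tools. A minimal sketch, reusing the sample row above (note that `awk -F','` is only safe here for fields that come before the quoted `resolution` column):

```shell
# Recreate the sample results file from above (sketch)
printf '%s\n' \
  'size,nodes,requests,grade,score,ges,water,url,date,resolution,page_type' \
  '119.095,45,8,A,89,1.22,1.83,http://www.ecoindex.fr,2021-04-20 16:45:28.570179,"1920,1080",' \
  > results.csv

# Print the url (field 8) and score (field 5) of each analysed page
awk -F',' 'NR>1 {print $8, $5}' results.csv
# -> http://www.ecoindex.fr 89
```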
Where:
- `size` is the size of the page and of the downloaded elements of the page, in KB
- `nodes` is the number of DOM elements in the page
- `requests` is the number of external requests made by the page
- `grade` is the corresponding Ecoindex grade of the page (from A to G)
- `score` is the corresponding Ecoindex score of the page (0 to 100)
- `ges` is the equivalent greenhouse gas emission of the page (in gCO2e)
- `water` is the equivalent water consumption of the page (in cl)
- `url` is the analysed page URL
- `date` is the datetime of the page analysis
- `resolution` is the screen resolution used for the page analysis (`width,height`)
- `page_type` is the type of the page, based on the OpenGraph type tag
Testing
In order to develop or test, you have to use Poetry: install the dependencies and open a poetry shell:
poetry install
poetry shell
We use Pytest to run the unit tests for this project. The test suite is in the `tests` folder. Just execute:
pytest --cov-report term-missing:skip-covered --cov=. --cov-config=.coveragerc tests
This runs pytest and also generates a coverage report (terminal and HTML).
Contributing
Code of conduct