Ecoindex-Cli

`ecoindex-cli` is a CLI tool that lets you run ecoindex tests on given pages.

This tool provides an easy way to analyze websites with Ecoindex from your local computer. You can:

  • Run the analysis on multiple pages
  • Define multiple screen resolutions
  • Run a recursive analysis from a given website

This CLI is built on top of ecoindex-python.

The output is always a CSV file with the results of the analysis.
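
These options can be combined in a single run. For example (all the flags used here are documented in the sections below, so this should work as a one-shot scenario):

 ecoindex-cli analyze --urls-file input/ecoindex.csv --window-size 1920,1080 --window-size 386,540 --html-report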

Requirements

  • Python ^3.8
  • Poetry
  • Google Chrome installed on your computer

Setup

 git clone git@github.com:cnumr/ecoindex_cli.git    # Clone source
 cd ecoindex_cli                                    # Go to source folder
 poetry install                                     # Install dependencies
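
Once the dependencies are installed, you should be able to invoke the CLI inside the Poetry virtual environment (standard Poetry usage, not specific to this project):

 poetry run ecoindex-cli --help                     # Run the CLI through Poetry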

Use case

The CLI provides two commands, analyze and report, which can be used separately:

 ecoindex-cli --help
 Usage: ecoindex-cli [OPTIONS] COMMAND [ARGS]...

  Ecoindex cli to make analysis of webpages

Options:
  --install-completion [bash|zsh|fish|powershell|pwsh]
                                  Install completion for the specified shell.
  --show-completion [bash|zsh|fish|powershell|pwsh]
                                  Show completion for the specified shell, to
                                  copy it or customize the installation.

  --help                          Show this message and exit.

Commands:
  analyze  Make an ecoindex analysis of given webpages or website.
  report   If you already performed an ecoindex analysis and have your...
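
For instance, you can install shell completion with the option listed above (zsh shown here; pick your own shell from the supported list):

 ecoindex-cli --install-completion zsh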

Make a simple analysis

You just give one web URL:

 ecoindex-cli analyze --url http://www.ecoindex.fr
There are 1 url(s), do you want to process? [Y/n]:
1 urls for 1 window size
Processing  [####################################]  100%
🙌️ File output/www.ecoindex.fr/2021-04-20 16:44:33.468755/results.csv written !

This runs an analysis with the default screen resolution of 1920x1080px.
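
This should be equivalent to passing the default resolution explicitly with the --window-size option described below:

 ecoindex-cli analyze --url http://www.ecoindex.fr --window-size 1920,1080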

Set the output file

You can define the CSV output file:

 ecoindex-cli analyze --url http://www.ecoindex.fr --output-file ~/ecoindex-results/ecoindex.csv
📁️ Urls recorded in file `input/www.ecoindex.fr.csv`
There are 1 url(s), do you want to process? [Y/n]: 
1 urls for 1 window size
Processing  [####################################]  100%
🙌️ File /home/vvatelot/ecoindex-results/ecoindex.csv written !

Multiple URL analysis

 ecoindex-cli analyze --url http://www.ecoindex.fr --url https://www.greenit.fr/
There are 2 url(s), do you want to process? [Y/n]:
2 urls for 1 window size
Processing  [####################################]  100%
🙌️ File output/www.ecoindex.fr/2021-04-20 16:45:24.458052/results.csv written !

Provide URLs from a file

You can use a file containing the URLs you want to analyze, one URL per line. This is helpful if you want to replay the same scenario regularly.
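
For example, a minimal input/ecoindex.csv matching the run below could look like this:

 http://www.ecoindex.fr
 https://www.greenit.fr/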

 ecoindex-cli analyze --urls-file input/ecoindex.csv
There are 2 url(s), do you want to process? [Y/n]:
2 urls for 1 window size
Processing  [####################################]  100%
🙌️ File output/www.ecoindex.fr/2021-04-20 16:45:24.458052/results.csv written !

Make a recursive analysis

You can run a recursive analysis of a given website. This means that the app will try to find all the pages of your website and launch an analysis on all those web pages. ⚠️ This can run for a very long time! Use it at your own risk!

 ecoindex-cli analyze --url http://www.ecoindex.fr --recursive
⏲️ Crawling root url http://www.ecoindex.fr -> Wait a minute !
📁️ Urls recorded in file `input/www.ecoindex.fr.csv`
There are 3 url(s), do you want to process? [Y/n]:
3 urls for 1 window size
Processing  [####################################]  100%
🙌️ File output/www.ecoindex.fr/2021-04-20 16:47:29.072472/results.csv written !

Set other screen resolutions

You can provide other screen resolutions. By default, the screen resolution is 1920x1080px, but you can provide other resolutions, for example if you want to test the ecoindex of a mobile layout.

 ecoindex-cli analyze --url http://www.ecoindex.fr --window-size 1920,1080 --window-size 386,540
There are 1 url(s), do you want to process? [Y/n]:
1 urls for 2 window size
Processing  [####################################]  100%
🙌️ File output/www.ecoindex.fr/2021-04-21 21:22:44.309077/results.csv written !

Generate an HTML report

You can easily generate an HTML report at the end of the analysis: just add the --html-report option.

 ecoindex-cli analyze --url http://www.ecoindex.fr --recursive --html-report
⏲️ Crawling root url http://www.ecoindex.fr -> Wait a minute !
📁️ Urls recorded in file `input/www.ecoindex.fr.csv`
There are 3 url(s), do you want to process? [Y/n]:
3 urls for 1 window size
Processing  [####################################]  100%
🙌️ File output/www.ecoindex.fr/2021-04-21 21:21:27.629691/results.csv written !
🦄️ Amazing! A report has been generated to `/home/vvatelot/Devel/ecoindex_cli/output/www.ecoindex.fr/2021-04-21 21:21:27.629691/report.html`

Here is a sample result: Sample report

Only generate a report from an existing result file

If you already performed an analysis and, for example, forgot to generate the HTML report, you do not need to re-run a full analysis; you can simply generate a report from your result file:

 ecoindex-cli report "/home/vvatelot/Devel/ecoindex_cli/output/www.ecoindex.fr/2021-05-06 19:13:55.735935/results.csv" "www.synchrone.fr"
🦄️ Amazing! A report has been generated to `/home/vvatelot/Devel/ecoindex_cli/output/www.ecoindex.fr/2021-05-06 19:13:55.735935/report.html`

Results example

The result of the analysis is a CSV file which can be easily used for further analysis:

size,nodes,requests,grade,score,ges,water,url,date,resolution,page_type
119.095,45,8,A,89,1.22,1.83,http://www.ecoindex.fr,2021-04-20 16:45:28.570179,"1920,1080",
769.252,730,94,D,41,2.18,3.27,https://www.greenit.fr/,2021-04-20 16:45:32.199242,"1920,1080",website

Where:

  • size is the size of the page and of the downloaded elements of the page in KB
  • nodes is the number of DOM elements in the page
  • requests is the number of external requests made by the page
  • grade is the corresponding ecoindex grade of the page (from A to G)
  • score is the corresponding ecoindex score of the page (0 to 100)
  • ges is the equivalent greenhouse gas emission (in gCO2e) of the page
  • water is the equivalent water consumption (in cl) of the page
  • url is the analyzed page URL
  • date is the datetime of the page analysis
  • resolution is the screen resolution used for the page analysis (width,height)
  • page_type is the type of the page, based on the opengraph type tag
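
As a quick way to eyeball the results in a terminal, you can align the columns with standard Unix tools (this is just a convenience, not part of ecoindex-cli; note that quoted fields containing a comma, such as resolution, will also be split by this naive approach):

 column -s, -t < results.csv | less -S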

Testing

We use pytest to run unit tests for this project. The test suite is in the tests folder. Just execute:

pytest --cov-report term-missing:skip-covered --cov=. --cov-config=.coveragerc tests

This runs pytest and also generates a coverage report (terminal and HTML).
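
If you want to force the HTML report from the command line as well, pytest-cov accepts repeated --cov-report options (the HTML report lands in htmlcov/ by default):

 pytest --cov-report term-missing:skip-covered --cov-report html --cov=. --cov-config=.coveragerc tests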
