Selenium-based web scraper to extract data from the E-REDES website and load it into database storage.

E-REDES Scraper

Description

This is a web scraper that collects data from the E-REDES website and stores it in a database. Since E-REDES exposes no API for this data, scraping the website is the only available way to collect it. A high-level overview of the process is:

  1. The scraper collects the data from the E-REDES website.
  2. A file with the energy consumption readings is downloaded.
  3. The file is parsed and the data is compared to the data in the database to determine if there are new readings.
  4. If there are new readings, they are stored in the database.
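
Steps 3 and 4 reduce to a timestamp cut-off: keep only the readings newer than the latest one already stored. The sketch below is a hypothetical illustration of that comparison; the function name and the tuple layout are assumptions, not the package's actual implementation:

```python
from datetime import datetime

def new_readings(parsed, last_stored):
    """Return only the readings newer than the latest timestamp in the database.

    parsed:      list of (timestamp, kwh) tuples taken from the downloaded file
    last_stored: most recent timestamp already stored, or None if the DB is empty
    """
    if last_stored is None:
        return list(parsed)
    return [(ts, kwh) for ts, kwh in parsed if ts > last_stored]

readings = [
    (datetime(2023, 10, 1, 0, 0), 0.42),
    (datetime(2023, 10, 1, 0, 15), 0.38),
    (datetime(2023, 10, 1, 0, 30), 0.51),
]

# only the 00:30 reading is newer than the stored cut-off
fresh = new_readings(readings, datetime(2023, 10, 1, 0, 15))
```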

This package targets the E-REDES website as it was available at the time of writing (23/10/2023). The entrypoint for the scraper is the page https://balcaodigital.e-redes.pt/login.

Installation

The package can be installed using pip:

pip install eredesscraper

Configuration

Usage is driven by a YAML configuration file. A config.yml specifies the credentials for the E-REDES website and, optionally, the database connection. Currently, only InfluxDB is supported as a database sink.

Template config.yml:

eredes:
  # eredes credentials
  nif: <my-eredes-nif>
  pwd: <my-eredes-password>
  # CPE to monitor. e.g. PT00############04TW (where # is a digit). CPE can be found in your bill details
  cpe: PT00############04TW


influxdb:
  # URL of the InfluxDB instance, e.g. http://localhost or https://influxdb.my-domain.com
  host: http://localhost
  # default port is 8086
  port: 8086
  bucket: <my-influx-bucket>
  org: <my-influx-org>
  # access token with write access
  token: <token>
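
A quick way to catch typos before a run is to sanity-check the parsed configuration. The helper below is not part of the package; it is a minimal sketch that assumes the config has already been loaded into a dict (e.g. with `yaml.safe_load`) and mirrors the template above:

```python
def validate_config(cfg):
    """Return a list of missing dotted keys; an empty list means the config is complete.

    The required layout mirrors the config.yml template above; the `influxdb`
    section only matters when a database sink is actually used.
    """
    required = {
        "eredes": ["nif", "pwd", "cpe"],
        "influxdb": ["host", "port", "bucket", "org", "token"],
    }
    missing = []
    for section, keys in required.items():
        block = cfg.get(section) or {}
        for key in keys:
            if key not in block:
                missing.append(f"{section}.{key}")
    return missing

# illustrative values only
sample = {
    "eredes": {"nif": "123456789", "pwd": "secret", "cpe": "PT0000000000000004TW"},
    "influxdb": {"host": "http://localhost", "port": 8086,
                 "bucket": "energy", "org": "home", "token": "s3cr3t"},
}
```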

Usage

Python script:

from eredesscraper.workflows import switchboard
from pathlib import Path

switchboard(name="current_month",
            db="influxdb",
            config_path=Path("./config.yml"))

CLI:

ers config load "/path/to/config.yml"

ers run

Limitations

Available workflows:

  • current_month: Collects the current month consumption.
  • previous_month: Collects the previous month consumption data.
  • select_month: Collects the consumption data for an arbitrary month specified by the user.

Available databases:

  • InfluxDB

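Records written to InfluxDB are ultimately expressed in its line protocol (measurement, tags, fields, nanosecond timestamp). The formatter below only illustrates that mapping; the measurement name `energy_consumption` and tag key `cpe` are assumptions, not the schema eredesscraper actually uses:

```python
from datetime import datetime, timezone

def to_line_protocol(cpe, ts, kwh):
    """Format one consumption reading as an InfluxDB line-protocol record.

    NOTE: the measurement and tag names here are illustrative assumptions.
    """
    # line protocol expects the timestamp in nanoseconds since the epoch
    ns = int(ts.replace(tzinfo=timezone.utc).timestamp()) * 10**9
    return f"energy_consumption,cpe={cpe} kwh={kwh} {ns}"

line = to_line_protocol("PT0000000000000004TW", datetime(2023, 10, 1, 0, 15), 0.38)
```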
Roadmap

  • Add workflow for retrieving previous month data.
  • Add workflow for retrieving data from an arbitrary month.
  • Build CLI.
  • Containerize app.
  • Documentation.
  • Add CI/CD.
  • Add logging.
  • Add tests.
  • Add runtime support for multiple CPEs.

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

License

See LICENSE file.

Download files

Source Distribution

eredesscraper-0.1.7.tar.gz (44.1 kB)

Built Distribution

eredesscraper-0.1.7-py3-none-any.whl (21.6 kB)

File details

Details for the file eredesscraper-0.1.7.tar.gz.

File metadata

  • Download URL: eredesscraper-0.1.7.tar.gz
  • Upload date:
  • Size: 44.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: poetry/1.7.0 CPython/3.11.6 Linux/6.2.0-1015-azure

File hashes

Hashes for eredesscraper-0.1.7.tar.gz:

  • SHA256: 64a2e7b1c09d2eab697f76ea5860d2c3768f7ddf99cb9ca857ef7e89dc86bb2a
  • MD5: f252c6624a241ea9e6cb2527fab887ed
  • BLAKE2b-256: 30d1403d8a2b5586950a21a46a3291bfc925031722813c6a3c396818046b9271

File details

Details for the file eredesscraper-0.1.7-py3-none-any.whl.

File metadata

  • Download URL: eredesscraper-0.1.7-py3-none-any.whl
  • Upload date:
  • Size: 21.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: poetry/1.7.0 CPython/3.11.6 Linux/6.2.0-1015-azure

File hashes

Hashes for eredesscraper-0.1.7-py3-none-any.whl:

  • SHA256: e8858bffdb1a489062066f6326a9b195859586e16cb35393cf262e1f34fbd6fb
  • MD5: 987ec6fc47b9f0ae3e5720ede171b3e8
  • BLAKE2b-256: 1ff357ecd1f00390dd69b933be396fea2a0a2f26f92be6d081d05c0f526466ff
