
python-ote

Electricity prices scraper for OTE (ote-cr.cz)

Install

pip install python-ote

In order to parse numbers correctly (Czech notation, e.g. 1 000 000,5 with a comma as the decimal separator), this package needs the cs_CZ.UTF-8 system locale. If the OS doesn't have it by default, the following commands can be used to generate it:

echo "cs_CZ.UTF-8 UTF-8" >> /etc/locale.gen
locale-gen
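To illustrate why the locale matters, the following hypothetical helper (not part of python-ote) shows how Czech number notation differs from what the default C locale's `float()` accepts: a space as the thousands separator and a comma as the decimal separator.

```python
# Hypothetical illustration (not part of python-ote): Czech notation
# uses a space as the thousands separator and a comma as the decimal
# separator, so float() in the C locale cannot parse it directly.
def parse_czech_number(text: str) -> float:
    """Convert e.g. '1 000 000,5' to 1000000.5."""
    # Strip thousands separators (regular and non-breaking spaces),
    # then swap the decimal comma for a dot.
    cleaned = text.replace("\xa0", "").replace(" ", "").replace(",", ".")
    return float(cleaned)

print(parse_czech_number("1 000 000,5"))  # 1000000.5
```

Generating the cs_CZ.UTF-8 locale lets the library delegate this to the `locale` module instead of hand-rolling string replacement like the sketch above.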

Usage

from ote import Ote
from dateutil import parser

# Create client
ote = Ote()

Use the getDayMarketPrices(date_from, date_to) method to get electricity prices for the given time range. It accepts a date_from and optionally a date_to, both of which have to be datetime.date objects. If date_to is not specified, the method returns data from date_from up to today.

Examples:

# Get electricity price data from the specified date to now.
date_from = parser.parse('2020-08-01').date()
deferred_data = ote.getDayMarketPrices(date_from)

# Get electricity price data for a date interval
date_from = parser.parse('2020-08-01').date()
date_to = parser.parse('2020-08-11').date()
deferred_data = ote.getDayMarketPrices(date_from, date_to)

# Get electricity price data for a specific date (just 1 day)
date = parser.parse('2020-08-01').date()
deferred_data = ote.getDayMarketPrices(date, date)
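Note that dateutil is only used here for parsing date strings; since the method just needs datetime.date objects, the standard library works equally well. A minimal sketch:

```python
from datetime import date

# datetime.date objects can be constructed directly, without dateutil.
date_from = date(2020, 8, 1)
date_to = date(2020, 8, 11)

print(date_from.isoformat())  # 2020-08-01
print((date_to - date_from).days)  # 10
```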

You may call getDayMarketPrices multiple times with different parameters. It returns a twisted.internet.defer.Deferred object that can be used to retrieve the price data in the future using a callback you need to provide.

def process_prices(prices):
    print(prices)

deferred_data.addCallback(process_prices)

If you have multiple Deferreds from multiple calls to getDayMarketPrices you can use Ote.join() to get a Deferred that will be resolved after all crawlers are finished.

The last callback should stop the reactor so it shuts down cleanly. The reactor should only be stopped after all crawlers are done, which is where the join() method comes in handy. Note that the reactor cannot be restarted, so make sure this is the last thing you do:

from twisted.internet import reactor

d = ote.join()
d.addBoth(lambda _: reactor.stop())

The last thing you need to do is run the reactor. The script will block until the crawling is finished and all configured callbacks have executed.

reactor.run(installSignalHandlers=False)

This might look a bit daunting so please see test.py for a complete example.

Keep in mind that the library uses Scrapy internally, which means it is scraping the OTE website. If OTE decides you are abusing the website, they may block your IP address.

License

See LICENSE.
