
HDX Data Freshness

Project description


HDX freshness is implemented in Python. It reads all the datasets from the Humanitarian Data Exchange website (using the HDX Python library) and then iterates through them one by one, performing the following sequence of steps.

  1. It gets the dataset’s update frequency if it has one. If that update frequency is Never, then the dataset is always fresh.

  2. If the update frequency is not Never, it checks whether the dataset and resource metadata have changed; a metadata change qualifies as an update from a freshness perspective. It then compares the time elapsed since the last update with the update frequency and sets a status: fresh, due, overdue or delinquent.

  3. If the dataset is not fresh based on metadata, then the URLs of the resources are examined. If they are internal URLs (data.humdata.org - the HDX filestore, manage.hdx.rwlabs.org - CPS), then no further checking can be done, because when the files pointed to by these URLs update, the HDX metadata is updated too.

  4. If they are URLs with an ad hoc update frequency (proxy.hxlstandard.org, ourairports.com), then freshness cannot be determined. Currently, there is no mechanism in HDX to specify ad hoc update frequencies, but there is a proposal to add this to the update frequency options. At the moment, the freshness value for ad hoc datasets is based on whatever has been set for update frequency, but these datasets can be easily identified and excluded from results if needed.

  5. If the URL is externally hosted and not ad hoc, then we can issue an HTTP GET request for the file and check the returned headers for the Last-Modified field. If that field exists, then we read the date and time from it and check whether it is more recent than the dataset or resource metadata modification date. If it is, we recalculate freshness.

  6. If the resource is not fresh by this measure, then we download the file and calculate an MD5 hash for it. In our database, we store previous hash values, so we can check if the hash has changed since the last time we took the hash.

  7. There are some resources where the hash changes constantly because they connect to an API which generates a file on the fly. To identify these, we hash again and check if the hash changes in the few seconds since the previous hash calculation.
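
The core decisions in steps 1-2 and 5-6 above can be sketched as follows. All names are invented for illustration, the "due"/"overdue"/"delinquent" multipliers are placeholders rather than the real configured tolerances, and "Never" is simplified to a missing update frequency:

```python
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime
from hashlib import md5
from typing import Optional, Tuple


def calculate_status(last_update: datetime, update_frequency_days: Optional[int],
                     now: Optional[datetime] = None) -> str:
    """Map the age of the last update to a freshness status (steps 1-2).

    The multipliers below are illustrative; the real thresholds are
    configured in the freshness code itself.
    """
    if update_frequency_days is None:  # stands in for update frequency "Never"
        return "fresh"
    now = now or datetime.now(timezone.utc)
    age = now - last_update
    period = timedelta(days=update_frequency_days)
    if age < period:
        return "fresh"
    if age < 2 * period:
        return "due"
    if age < 3 * period:
        return "overdue"
    return "delinquent"


def parse_last_modified(header_value: str) -> datetime:
    """Parse an HTTP Last-Modified header into a datetime (step 5)."""
    return parsedate_to_datetime(header_value)


def has_changed(data: bytes, previous_hash: Optional[str]) -> Tuple[bool, str]:
    """Hash a downloaded resource and compare with the stored hash (step 6)."""
    digest = md5(data).hexdigest()
    return digest != previous_hash, digest
```

If the Last-Modified date is newer than the metadata modification date, `calculate_status` would simply be re-run with that newer timestamp; otherwise the file is downloaded and `has_changed` is consulted against the hash stored in the database.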

Since there can be temporary connection and download issues with URLs, the code retries failed requests several times with increasing delays between attempts. Also, as there are many requests to be made, rather than performing them one by one, they are executed concurrently using the asynchronous I/O support (asyncio) added in recent versions of Python.
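The retry-and-concurrency behaviour might be sketched like this. The function names are hypothetical and the backoff schedule is an assumption; the real code has its own retry logic on top of asyncio:

```python
import asyncio


async def retry_with_backoff(coro_factory, retries: int = 3,
                             base_delay: float = 1.0):
    """Await coro_factory(), retrying with increasing delays on failure."""
    for attempt in range(retries):
        try:
            return await coro_factory()
        except Exception:
            if attempt == retries - 1:
                raise  # give up after the final attempt
            # Increasing delay between attempts: base_delay, 2*base_delay, ...
            await asyncio.sleep(base_delay * (attempt + 1))


async def check_all(urls, checker):
    """Run one check per URL concurrently rather than one by one."""
    tasks = [retry_with_backoff(lambda u=u: checker(u)) for u in urls]
    return await asyncio.gather(*tasks)
```

Here `checker` would be a coroutine performing the HTTP request for one resource; `asyncio.gather` runs all the checks on one event loop, so a slow URL does not hold up the rest.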

Usage

python run.py

Project details



Download files

Download the file for your platform.

Source Distribution

hdx-data-freshness-0.99.1.tar.gz (15.5 kB)

Built Distribution

hdx_data_freshness-0.99.1-py2.py3-none-any.whl (30.5 kB, Python 2 and Python 3)

File details

Details for the file hdx-data-freshness-0.99.1.tar.gz.


File hashes

Hashes for hdx-data-freshness-0.99.1.tar.gz
Algorithm Hash digest
SHA256 ce8f4263575fcd54393fb2844ee1ff8c8078d6204f788b8e5adc9a658b4051a5
MD5 2f316bb88c2d884130fa78e5e9df792c
BLAKE2b-256 dfa82cfd6c1817fbbd5cbc8c5cf513404f2abeff23e28930cb28c7ca6fccc119


File details

Details for the file hdx_data_freshness-0.99.1-py2.py3-none-any.whl.


File hashes

Hashes for hdx_data_freshness-0.99.1-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 db652fc10d5004f2697eb97c5d559f4bf010f8c53b31d2596830111e9adc0ce6
MD5 af5bf6fd0a62e90236aa6cd4d2d00a7b
BLAKE2b-256 1161481fc4e82709a5bf42ac46ec454e603d829ff3091b03d02cb35fb614375c

