
DLH_utils


A Python package produced by the Linkage Development team in the Data Linkage Hub at the Office for National Statistics (ONS), containing a set of functions used to expedite and streamline the data linkage process.

Its key features include (see the short sketch after this list):

  • its scalability to large datasets, using Spark as a big-data backend
  • profiling and flagging functions used to describe and highlight issues in data
  • standardisation and cleaning functions to make data comparable ahead of linkage
  • linkage functions to derive linkage variables and join data together efficiently
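
As a flavour of how these pieces fit together, here is a short, purely illustrative sketch: the dlh_utils module and function names below are assumptions rather than a guaranteed API, so refer to the demo repository mentioned below for real usage.

# Purely illustrative -- the dlh_utils module/function names are assumptions,
# not a guaranteed API; see the demo repository for the real function signatures
from pyspark.sql import SparkSession
from dlh_utils import profiling, standardisation, linkage  # hypothetical module layout

spark = SparkSession.builder.getOrCreate()
df1 = spark.read.parquet("census_extract.parquet")   # illustrative input paths
df2 = spark.read.parquet("survey_extract.parquet")

# Profile the data to flag quality issues ahead of linkage (hypothetical function)
profiling.df_describe(df1)

# Standardise name fields so the two datasets are comparable (hypothetical function)
df1 = standardisation.standardise_case(df1, subset=["forename", "surname"])
df2 = standardisation.standardise_case(df2, subset=["forename", "surname"])

# Derive candidate record pairs by blocking on postcode (hypothetical function)
candidates = linkage.blocking(df1, df2, blocks={"postcode": "postcode"})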

Please log any issues or suggestions for improvement on the issue board, or contact any of the active contributors.

Installation steps

DLH_utils supports Python 3.6+. To install the latest version, simply run:

pip install dlh_utils

Or, if using CDSW, in a terminal session run:

pip3 install dlh_utils

The -U argument can be used to upgrade the package to its newest version:

pip3 install -U dlh_utils

Demo

For a worked demonstration notebook of these functions being applied within a data linkage context, head over to our separate demo repository.

Common issues

When using the jaro/jaro_winkler functions, the error "no module called Jellyfish found" is thrown

These functions depend on the Jellyfish package, which may not be installed on the executors used in your Spark session. Try submitting Jellyfish to your SparkContext via addPyFile() (see the sketch after this list) or setting the following environment variables in your CDSW engine settings (ONS only):

  • PYSPARK_DRIVER_PYTHON = /usr/local/bin/python3.6
  • PYSPARK_PYTHON = /opt/ons/virtualenv/miscMods_v4.04/bin/python3.6
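
If you go down the addPyFile() route, a minimal sketch, assuming the Jellyfish package has been zipped up at an accessible path (the path below is illustrative):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Illustrative path -- point this at a zip/egg of the jellyfish package that is
# accessible from the driver; Spark ships it to the executors
spark.sparkContext.addPyFile("/path/to/jellyfish.zip")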

Using the cluster function

The cluster function uses GraphFrames, which requires an extra JAR file dependency to be submitted to your Spark context in order to run.

We have published a graphframes-wrapper package on PyPI that contains this JAR file. It is included in the package requirements as a dependency.

If you are outside ONS and this dependency does not work, you will need to submit the GraphFrames JAR file to your Spark context yourself. It can be found here:

https://repos.spark-packages.org/graphframes/graphframes/0.6.0-spark2.3-s_2.11/graphframes-0.6.0-spark2.3-s_2.11.jar

Once downloaded, this can be submitted to your Spark session by setting spark.jars on the SparkSession builder before the session is created (spark.jars cannot be changed on an already-running session):

SparkSession.builder.config('spark.jars', path_to_jar_file)
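
Putting that together, a minimal sketch, assuming the JAR has been downloaded to a local path (the path below is illustrative):

from pyspark.sql import SparkSession

# Illustrative path -- replace with wherever you downloaded the GraphFrames JAR
path_to_jar_file = "/path/to/graphframes-0.6.0-spark2.3-s_2.11.jar"

spark = (
    SparkSession.builder
    .appName("dlh_utils_clustering")
    .config("spark.jars", path_to_jar_file)
    .getOrCreate()
)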

Thanks

Thanks to all those in the Data Linkage Hub, Data Engineering and Methodology at ONS who have contributed to this repository.

Any questions?

If you need any additional help, or have any feedback on the package, please contact Jenna Hart (Jenna.Hart@ons.gov.uk) or the Data Linkage Hub at Linkage.Hub@ons.gov.uk.

