
Data packaging tools for the HEAL data ecosystem

Project description

--8<-- [start:intro]

HEAL Data Utilities

The HEAL Data Utilities python package provides data packaging tools for the HEAL Data Ecosystem to facilitate data discovery, sharing, and harmonization with a focus on the HEAL Platform Data Consultancy (DSC).

Currently, the focus of this repository is generating data dictionaries (see Variable-level Metadata section below). However, in the future, this will be expanded for all HEAL-specific data packaging functions (e.g., study- and file-level metadata and data).

Installation

To install the latest official release of healdata-utils, run the following from your computer's command prompt:

pip install --pre healdata-utils (NOTE: the package is currently in pre-release, so the --pre flag is required)

Alternatively, to install the development version directly from GitHub:

pip install git+https://github.com/norc-heal/healdata-utils.git

--8<-- [end:intro]

--8<-- [start:vlmd-intro]

Variable-level Metadata (Data Dictionaries)


The healdata-utils variable-level metadata (vlmd) tool accepts a variety of input file types and exports HEAL-compliant data dictionaries (in JSON and CSV formats). It also exports validation (i.e., "error") reports that tell the user a) whether the exported data dictionary is valid according to HEAL specifications and b) how to modify the data dictionary to make it HEAL-compliant.

--8<-- [end:vlmd-intro]

--8<-- [start:vlmd-basic-usage]

Basic usage

The vlmd tool can be used via python or the command line.

Using from python

From your current working directory in python, run:

from healdata_utils.cli import convert_to_vlmd

# description and title are optional. If submitting through the platform, these can be filled out there.
description = "This is a proof of concept to demonstrate the healdata-utils functionality"
title = "Healdata-utils Demonstration Data Dictionary"
healdir = "output"  # can also specify a file name if desired (e.g., output/thisismynewdd.csv)
inputpath = "input/my-redcap-data-dictionary-export.csv"
input_type = None  # optional; if not specified, the input type is inferred from the file suffix

data_dictionaries = convert_to_vlmd(
    filepath=inputpath,
    outputdir=healdir,
    inputtype=input_type,  # if not specified, inferred from the file suffix
    data_dictionary_props={"title": title, "description": description},  # optional
)

This writes the data dictionaries to the specified output directory (see the Output section below) and also returns the JSON/CSV versions in the data_dictionaries object.
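The returned object can then be inspected directly. A minimal sketch, using a stand-in value; the key names below ("jsontemplate", "csvtemplate") are an assumption for illustration, and the real return value of convert_to_vlmd may be shaped differently:

```python
# Stand-in for the object returned by convert_to_vlmd -- the key names
# ("jsontemplate", "csvtemplate") are assumptions for illustration only.
data_dictionaries = {
    "jsontemplate": {
        "title": "Healdata-utils Demonstration Data Dictionary",
        "description": "This is a proof of concept to demonstrate the healdata-utils functionality",
        "fields": [{"name": "age", "type": "integer"}],
    },
    "csvtemplate": [{"name": "age", "type": "integer"}],
}

# The JSON version carries root-level metadata; the CSV version is just rows of fields.
print(data_dictionaries["jsontemplate"]["title"])
print(len(data_dictionaries["csvtemplate"]))
```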

For the available input file formats (i.e., the available choices for the inputtype parameter), one can run (from python):

from healdata_utils.cli import input_descriptions

input_descriptions

The input_descriptions object is a dictionary whose keys are the valid choices for inputtype and whose values are their descriptions.
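When inputtype is not specified, the tool falls back to the file suffix. A rough sketch of that default, using a hypothetical helper (the actual lookup inside healdata-utils may map suffixes to named input types differently):

```python
from pathlib import Path

def infer_inputtype(filepath):
    """Hypothetical sketch of the suffix-based default: when inputtype is
    not given, the file extension is used. The real lookup in
    healdata-utils may differ."""
    return Path(filepath).suffix.lstrip(".").lower()

print(infer_inputtype("input/my-redcap-data-dictionary-export.csv"))  # csv
print(infer_inputtype("data/example_pyreadstat_output.sav"))          # sav
```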

Using from the command line

From your current working directory, run the command below. (Note: the \ at the end of each line signals a line continuation, used here to make the long, single-line command easier to read.) Again, the --title and --description options are optional. For descriptions of the different flags/options, run vlmd --help

vlmd --filepath "data/example_pyreadstat_output.sav" \
--outputdir "output-cli" \
--title "Healdata-utils Demonstration Data Dictionary" \
--description "This is a proof of concept to demonstrate the healdata-utils functionality" 

Output

Both the python and command line routes will result in a JSON and CSV version of the HEAL data dictionary in the output folder along with the validation reports in the errors folder. See below:

  • input/my-redcap-data-dictionary-export.csv: your input file

  • output/errors/heal-csv-errors.json: validation report for the CSV data dictionary against the frictionless (table) schema

  • output/errors/heal-json-errors.json: validation report for the JSON data dictionary against the jsonschema specification
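These reports can also be consumed programmatically. A minimal sketch using a hypothetical report shape, loosely following frictionless/jsonschema output; the exact keys written by healdata-utils may differ:

```python
import json

# Hypothetical error-report shape -- the exact keys produced by
# healdata-utils may differ from this illustration.
report_text = """
{
  "valid": false,
  "errors": [
    {"json_path": "$.fields[0].type", "message": "'intger' is not an allowed type"}
  ]
}
"""
report = json.loads(report_text)

if report["valid"]:
    print("Data dictionary is HEAL-compliant")
else:
    # Each error points at the offending location and explains the problem.
    for err in report["errors"]:
        print(f"{err['json_path']}: {err['message']}")
```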

!!! important
The main difference* between the CSV and JSON data dictionary validation lies in the way the data dictionaries are structured and the additional metadata included in the JSON data dictionary.

The CSV data dictionary is a plain tabular representation with no additional metadata, while the JSON data dictionary includes the fields along with additional metadata in the form of a root description and title.

* For field-specific differences, see the schemas in the documentation.
  • output/heal-csvtemplate-data-dictionary.csv: This is the CSV data dictionary
  • output/heal-jsontemplate-data-dictionary.json: This is the JSON version of the data dictionary

Note, only the JSON version will have the user-specified title and description.
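The structural difference between the two outputs can be sketched with the standard library alone. The field names below are illustrative, not the full HEAL schema:

```python
import csv
import io
import json

# Toy field list -- illustrative only, not the full HEAL field schema.
fields = [
    {"name": "age", "type": "integer", "description": "Age in years"},
    {"name": "sex", "type": "string", "description": "Sex at birth"},
]

# CSV version: a plain table of fields with no root-level metadata.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "type", "description"])
writer.writeheader()
writer.writerows(fields)
csv_text = buf.getvalue()

# JSON version: the same fields nested under a root title and description.
json_text = json.dumps({
    "title": "Healdata-utils Demonstration Data Dictionary",
    "description": "This is a proof of concept to demonstrate the healdata-utils functionality",
    "fields": fields,
}, indent=2)

print(csv_text.splitlines()[0])          # header row: name,type,description
print(json.loads(json_text)["title"])    # root metadata only exists in JSON
```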

Interactive notebooks

See the notebooks below, which demonstrate usage and workflows with convert_to_vlmd in python and vlmd on the command line.

Clicking on the "binder badges" will bring you to an interactive notebook page where you can try out the notebooks, with healdata-utils pre-installed.

  1. Generating a HEAL data dictionary from a variety of input files
  2. [in development] Creating and iterating over a CSV data dictionary to produce a valid data dictionary file

--8<-- [end:vlmd-basic-usage]

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

healdata_utils-0.1.4a0.tar.gz (27.9 kB)

Uploaded Source

Built Distribution

healdata_utils-0.1.4a0-py3-none-any.whl (30.7 kB)

Uploaded Python 3

File details

Details for the file healdata_utils-0.1.4a0.tar.gz.

File metadata

  • Download URL: healdata_utils-0.1.4a0.tar.gz
  • Upload date:
  • Size: 27.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.17

File hashes

Hashes for healdata_utils-0.1.4a0.tar.gz

  • SHA256: 0401a02a19920319c5e9d3319dd780aba602cb4a559ea2766811da43f25c1e8a
  • MD5: f3e87b7c71fbde2bafc728c61f3b390a
  • BLAKE2b-256: 06e17f1b5faf206e48be4a687276fbb1db582674e5ef1261a10b4a617226f1e4


File details

Details for the file healdata_utils-0.1.4a0-py3-none-any.whl.

File metadata

File hashes

Hashes for healdata_utils-0.1.4a0-py3-none-any.whl

  • SHA256: 8919a434eaaba8ae03bd91591ca3277032bd268673619c337caad88dac19db05
  • MD5: e0c9eaef5f620d3926301634abfb507f
  • BLAKE2b-256: 4dcf72decf9485d104ce3b718f9d752ec6adb31c63a7f3e7c6c3fc4e386dd5df

