microdata-tools
Tools for the microdata.no platform
Installation
microdata-tools can be installed from PyPI using pip:
pip install microdata-tools
Usage
Once you have your metadata and data files ready to go, they should be named and stored like this:
my-input-directory/
MY_DATASET_NAME/
MY_DATASET_NAME.csv
MY_DATASET_NAME.json
The CSV file is optional in some cases.
Package dataset
The package_dataset() function will encrypt and package your dataset as a tar archive. The process is as follows:
- Generate a symmetric key for the dataset.
- Encrypt the dataset data (CSV) using the symmetric key and store the encrypted file as <DATASET_NAME>.csv.encr.
- Encrypt the symmetric key using the asymmetric RSA public key microdata_public_key.pem and store the encrypted file as <DATASET_NAME>.symkey.encr.
- Gather the encrypted CSV, the encrypted symmetric key and the metadata (JSON) file in one tar file.
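The final gathering step can be sketched with the standard library's tarfile module. Note that gather_dataset is a hypothetical helper written for illustration, not part of the microdata-tools API:

```python
import tarfile
from pathlib import Path


def gather_dataset(dataset_dir: Path, dataset_name: str, output_dir: Path) -> Path:
    """Bundle the metadata, encrypted CSV and encrypted symmetric key
    into <DATASET_NAME>.tar, following the naming scheme above."""
    tar_path = output_dir / f"{dataset_name}.tar"
    with tarfile.open(tar_path, "w") as tar:
        for suffix in (".json", ".csv.encr", ".symkey.encr"):
            member = dataset_dir / f"{dataset_name}{suffix}"
            # The encrypted files are optional; only the JSON is required.
            if member.exists():
                tar.add(member, arcname=member.name)
    return tar_path
```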
Unpackage dataset
The unpackage_dataset() function will untar and decrypt your dataset using the microdata_private_key.pem
RSA private key.
The packaged file has to have the <DATASET_NAME>.tar extension. Its contents should be as follows:
<DATASET_NAME>.json : Required metadata file.
<DATASET_NAME>.csv.encr : Optional encrypted dataset file.
<DATASET_NAME>.symkey.encr : Optional encrypted file containing the symmetric key used to decrypt the dataset file. Required if the .csv.encr file is present.
Decryption uses the RSA private key located at RSA_KEY_DIR.
The packaged file is then stored in output_dir/archive/unpackaged after a successful run or output_dir/archive/failed after an unsuccessful run.
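The archiving behaviour described above can be sketched like this. archive_packaged_file is a hypothetical helper illustrating where the tar file ends up, not the library's actual code:

```python
import shutil
from pathlib import Path


def archive_packaged_file(packaged_file: Path, output_dir: Path, success: bool) -> Path:
    """Move the processed tar file to output_dir/archive/unpackaged on
    success, or output_dir/archive/failed on failure."""
    target_dir = output_dir / "archive" / ("unpackaged" if success else "failed")
    target_dir.mkdir(parents=True, exist_ok=True)
    target = target_dir / packaged_file.name
    shutil.move(str(packaged_file), target)
    return target
```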
Example
Python script that uses an RSA public key named microdata_public_key.pem to package a dataset:
from pathlib import Path
from microdata_tools import package_dataset
RSA_KEYS_DIRECTORY = Path("tests/resources/rsa_keys")
DATASET_DIRECTORY = Path("tests/resources/input_package/DATASET_1")
OUTPUT_DIRECTORY = Path("tests/resources/output")
package_dataset(
    rsa_keys_dir=RSA_KEYS_DIRECTORY,
    dataset_dir=DATASET_DIRECTORY,
    output_dir=OUTPUT_DIRECTORY,
)
Validation
Once you have your metadata and data files ready to go, they should be named and stored like this:
my-input-directory/
MY_DATASET_NAME/
MY_DATASET_NAME.csv
MY_DATASET_NAME.json
Note that the filename only allows uppercase letters A-Z, numbers 0-9 and underscores.
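That naming rule can be checked with a simple regular expression. This helper is illustrative and not part of the library:

```python
import re

# Dataset names may contain only uppercase A-Z, digits 0-9 and underscores.
DATASET_NAME_PATTERN = re.compile(r"[A-Z0-9_]+")


def is_valid_dataset_name(name: str) -> bool:
    """Return True if the name uses only the allowed characters."""
    return DATASET_NAME_PATTERN.fullmatch(name) is not None
```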
Import microdata-tools in your script and validate your files:
from microdata_tools import validate_dataset

validation_errors = validate_dataset(
    "MY_DATASET_NAME",
    input_directory="path/to/my-input-directory"
)

if not validation_errors:
    print("My dataset is valid")
else:
    print("Dataset is invalid :(")
    # You can print your errors like this:
    for error in validation_errors:
        print(error)
For a more in-depth explanation of usage visit the usage documentation.
Data format description
A dataset as defined in microdata consists of one data file and one metadata file.
The data file is a CSV file separated by semicolons. A valid example would be:
000000000000001;123;2020-01-01;2020-12-31;
000000000000002;123;2020-01-01;2020-12-31;
000000000000003;123;2020-01-01;2020-12-31;
000000000000004;123;2020-01-01;2020-12-31;
Read more about the data format and columns in the documentation.
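Rows like these can be read with the standard library's csv module by setting the delimiter to a semicolon. The column meanings below (unit id, value, start date, stop date) are assumptions based on the example, so check the documentation for the authoritative format:

```python
import csv
import io

# Two example rows: each ends with a trailing semicolon,
# which the reader sees as a final empty field.
data = (
    "000000000000001;123;2020-01-01;2020-12-31;\n"
    "000000000000002;123;2020-01-01;2020-12-31;\n"
)

rows = list(csv.reader(io.StringIO(data), delimiter=";"))
for unit_id, value, start, stop, _trailing in rows:
    print(unit_id, value, start, stop)
```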
The metadata files should be in JSON format. The requirements for the metadata are best described through the Pydantic model, the examples, and the metadata model.
Contribute
Set up
To work on this repository you need to install uv:
# macOS / linux / BashOnWindows
curl -LsSf https://astral.sh/uv/install.sh | sh
# Windows powershell
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
Then install the virtual environment from the root directory:
uv sync
Running unit tests
Open terminal and go to root directory of the project and run:
uv run pytest
Pre-commit
There are currently 3 active rules: Ruff-format, Ruff-lint and sync lock file. Install pre-commit:
pip install pre-commit
If you've made changes to the pre-commit-config.yaml or it's a new project, install the hooks with:
pre-commit install
Now it should run when you do:
git commit
By default it only runs against changed files. To force the hooks to run against all files:
pre-commit run --all-files
If you don't have it installed on your system you can use the following (but then it won't run when you use the git CLI):
uv run pre-commit
Read more about pre-commit