
Tools for DALIA's data model for open educational resources


DALIA Interaction Format (DIF)


The DALIA Interaction Format (DIF) v1.3 is the data model and CSV-based input format for open educational resources (OERs) in the DALIA OER platform.

This repository contains an implementation of the data model in Pydantic, a workflow for serializing to RDF based on pydantic-metamodel, a CSV reader, and a command line validator.

A tutorial/guide for curating OERs in a tabular form (CSV) can be found here.

graph LR
    er[Educational Resource] -- "supported by (0..*)" --> community[Community]
    er -- "recommended by (0..*)" --> community
    er -- "has author (1..*)" --> author[Author]
    er -- "has discipline (0..*)" --> discipline[Discipline]
    er -- "type (1)" --> lrt[Learning Resource Type]
    er -- "has media type (0..*)" --> mediatype[Media Type]
    er -- "has target group (0..*)" --> tg[Target Group]
    er -- "requires (0..*)" --> pl[Proficiency Level]
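The cardinalities in the diagram can be mirrored in a plain Python sketch (illustrative only; the real model is the Pydantic implementation described above, and the class and field names here are simplified stand-ins):

```python
from dataclasses import dataclass, field

# Illustrative mirror of the diagram's cardinalities; the actual model is
# the Pydantic EducationalResourceDIF13 class shown later in this README.
@dataclass
class EducationalResource:
    learning_resource_type: str  # type (1): exactly one
    authors: list[str]           # has author (1..*): at least one
    disciplines: list[str] = field(default_factory=list)  # has discipline (0..*)

    def __post_init__(self) -> None:
        # Enforce the 1..* cardinality on authors.
        if not self.authors:
            raise ValueError("at least one author is required (1..*)")

er = EducationalResource("tutorial", authors=["Fink, Fabian"])
```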

💪 Getting Started

The dalia_dif command line tool can be used from the console to validate CSV files (both local and remote).

$ dalia_dif validate https://raw.githubusercontent.com/NFDI4BIOIMAGE/training/refs/heads/main/docs/export/DALIA_training_materials.csv

Serialize to RDF with dalia_dif convert. It guesses the output format from the file extension; currently .ttl and .jsonl are supported.

$ dalia_dif convert -o output.ttl https://raw.githubusercontent.com/NFDI4BIOIMAGE/training/refs/heads/main/docs/export/DALIA_training_materials.csv
$ dalia_dif convert -o output.jsonl https://raw.githubusercontent.com/NFDI4BIOIMAGE/training/refs/heads/main/docs/export/DALIA_training_materials.csv
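The extension-based dispatch can be sketched roughly like this (a generic illustration of the idea, not dalia_dif's actual code; SERIALIZERS and guess_format are hypothetical names):

```python
from pathlib import Path

# Hypothetical mapping from output file extension to serialization format,
# mirroring the two formats the convert command currently supports.
SERIALIZERS = {
    ".ttl": "turtle",
    ".jsonl": "json-lines",
}

def guess_format(output_path: str) -> str:
    """Pick a serialization format from the output file's extension."""
    suffix = Path(output_path).suffix.lower()
    try:
        return SERIALIZERS[suffix]
    except KeyError:
        raise ValueError(f"unsupported extension: {suffix!r}") from None

print(guess_format("output.ttl"))  # turtle
```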

Using the data model:

from pydantic_metamodel.api import PredicateObject

from dalia_dif.dif13.model import AuthorDIF13, EducationalResourceDIF13, OrganizationDIF13
from dalia_dif.dif13.picklists import (
    MEDIA_TYPES,
    PROFICIENCY_LEVELS,
    RELATED_WORKS_RELATIONS,
    TARGET_GROUPS,
    LEARNING_RESOURCE_TYPES,
)
from dalia_dif.namespace import DALIA_COMMUNITY, HSFS

resource = EducationalResourceDIF13(
    uuid="b37ddf6e-f136-4230-8418-faf18c4c34d2",
    title="Chemotion ELN Instruction Videos",
    description="Chemotion ELN Instruction Videos Chemotion[1] is an open source "
                "system for storing and managing experiments and molecular data in "
                "chemistry and its related sciences.",
    links=["https://doi.org/10.5281/zenodo.7634481"],
    authors=[
        AuthorDIF13(given_name="Fabian", family_name="Fink", orcid="0000-0002-1863-2087"),
        AuthorDIF13(given_name="Salim", family_name="Benjamaa", orcid="0000-0001-6215-6834"),
        AuthorDIF13(given_name="Nicole", family_name="Parks", orcid="0000-0002-6243-2840"),
        AuthorDIF13(
            given_name="Alexander", family_name="Hoffmann", orcid="0000-0002-9647-8839"
        ),
        AuthorDIF13(
            given_name="Sonja", family_name="Herres-Pawlis", orcid="0000-0002-4354-4353"
        ),
    ],
    license="https://creativecommons.org/licenses/by/4.0",
    supporting_communities=[],
    recommending_communities=[
        DALIA_COMMUNITY["bead62a8-c3c2-46d6-9eb1-ffeaba38d5bf"],  # NFDI4Chem
    ],
    disciplines=[
        HSFS["n40"],  # chemistry
    ],
    file_formats=[
        ".mp4",
    ],
    keywords=["research data management", "NFDI", "RDM", "FDM", "NFDI4Chem", "Chemotion"],
    languages=["eng"],
    learning_resource_types=[
        LEARNING_RESOURCE_TYPES["tutorial"],
    ],
    media_types=[
        MEDIA_TYPES["video"],
    ],
    proficiency_levels=[
        PROFICIENCY_LEVELS["novice"],
    ],
    publication_date="2023-02-13",
    target_groups=[
        TARGET_GROUPS["student (ba)"],
    ],
    related_works=[
        PredicateObject(
            predicate=RELATED_WORKS_RELATIONS["isTranslationOf"],
            object="https://id.dalia.education/learning-resource/20be255e-e2da-4f9c-90b3-5573d6a12619",
        )
    ],
    file_size="703.2 MB",
    version=None,
)
turtle_str = resource.model_dump_turtle()
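As a side note, the uuid and publication_date strings in the example parse cleanly with the standard library, which is the kind of check a Pydantic model typically enforces on such fields (a generic illustration, not dalia_dif's actual validators):

```python
import datetime
import uuid

# Values taken from the example above; they parse as a UUID and an ISO date.
resource_uuid = uuid.UUID("b37ddf6e-f136-4230-8418-faf18c4c34d2")
pub_date = datetime.date.fromisoformat("2023-02-13")

print(resource_uuid.version)  # 4
print(pub_date.year)          # 2023
```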

🚀 Installation

The most recent release can be installed from PyPI with uv:

$ uv pip install dalia_dif

or with pip:

$ python3 -m pip install dalia_dif

The most recent code and data can be installed directly from GitHub with uv:

$ uv pip install git+https://github.com/data-literacy-alliance/dalia-dif.git

or with pip:

$ python3 -m pip install git+https://github.com/data-literacy-alliance/dalia-dif.git

👐 Contributing

Contributions, whether filing an issue, making a pull request, or forking, are appreciated. See CONTRIBUTING.md for more information on getting involved.

👋 Attribution

⚖️ License

The code in this package is licensed under the MIT License.

📖 Citation

An abstract describing the DIF has been published in the proceedings of the 2nd Conference on Research Data Infrastructure (CoRDI).

@misc{steiner2025,
    author = {Steiner, Petra C. and Geiger, Jonathan D. and Fuhrmans, Marc and Amer Desouki, Abdelmoneim and Hüppe, Henrika M.},
    title = {The Revised DALIA Interchange Format - New Picklists for Describing Open Educational Resources},
    month = aug,
    year = 2025,
    publisher = {Zenodo},
    doi = {10.5281/zenodo.16736170},
    url = {https://doi.org/10.5281/zenodo.16736170},
}

🎁 Support

This project has been supported by the following organizations (in alphabetical order):

💰 Funding

This project has been supported by the following grants:

Funding Body | Program | Grant Number
German Federal Ministry of Research, Technology, and Space (BMFTR) | | 16DWWQP07
EU | Capacity Building and Resilience Facility | 16DWWQP07

🍪 Cookiecutter

This package was created with @audreyfeldroy's cookiecutter package using @cthoyt's cookiecutter-snekpack template.

🛠️ For Developers

See developer instructions

This final section of the README is for those who want to get involved by making a code contribution.

Development Installation

To install in development mode, use the following:

$ git clone https://github.com/data-literacy-alliance/dalia-dif.git
$ cd dalia-dif
$ uv pip install -e .

Alternatively, install using pip:

$ python3 -m pip install -e .

🥼 Testing

After cloning the repository and installing tox with uv tool install tox --with tox-uv or python3 -m pip install tox tox-uv, the unit tests in the tests/ folder can be run reproducibly with:

$ tox -e py

Additionally, these tests are automatically re-run with each commit in a GitHub Action.

📖 Building the Documentation

The documentation can be built locally using the following:

$ git clone https://github.com/data-literacy-alliance/dalia-dif.git
$ cd dalia-dif
$ tox -e docs
$ open docs/build/html/index.html

The documentation build automatically installs the package as well as the docs extra specified in the pyproject.toml. Sphinx plugins like texext can be added there. Additionally, they need to be added to the extensions list in docs/source/conf.py.
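For illustration, registering a plugin in docs/source/conf.py looks roughly like this (an assumed excerpt; the actual extensions list in this repository may differ):

```python
# docs/source/conf.py (illustrative excerpt)
extensions = [
    "sphinx.ext.autodoc",  # assumed to be present already
    "texext",              # new plugin; also add it to the docs extra in pyproject.toml
]
```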

The documentation can be deployed to ReadTheDocs using this guide. The .readthedocs.yml YAML file contains all the configuration you'll need. You can also set up continuous integration on GitHub to check not only that Sphinx can build the documentation in an isolated environment (i.e., with tox -e docs-test) but also that ReadTheDocs can build it too.

🧑‍💻 For Maintainers

See maintainer instructions

Initial Configuration

Configuring ReadTheDocs

ReadTheDocs is an external documentation hosting service that integrates with GitHub's CI/CD. Do the following for each repository:

  1. Log in to ReadTheDocs with your GitHub account to install the integration at https://readthedocs.org/accounts/login/?next=/dashboard/
  2. Import your project by navigating to https://readthedocs.org/dashboard/import then clicking the plus icon next to your repository
  3. You can rename the repository on the next screen using a more stylized name (e.g., with spaces and capital letters)
  4. Click next, and you're good to go!

Configuring Archival on Zenodo

Zenodo is a long-term archival system that assigns a DOI to each release of your package. Do the following for each repository:

  1. Log in to Zenodo via GitHub with this link: https://zenodo.org/oauth/login/github/?next=%2F. This brings you to a page that lists all of your organizations and asks you to approve installing the Zenodo app on GitHub. Click "grant" next to any organizations you want to enable the integration for, then click the big green "approve" button. This step only needs to be done once.
  2. Navigate to https://zenodo.org/account/settings/github/, which lists all of your GitHub repositories (both in your username and any organizations you enabled). Click the on/off toggle for any relevant repositories. When you make a new repository, you'll have to come back to this page.

After these steps, you're ready to go! After you make a release on GitHub (steps for this are below), you can navigate to https://zenodo.org/account/settings/github/repository/data-literacy-alliance/dalia-dif to see the DOI for the release and a link to the Zenodo record for it.

Registering with the Python Package Index (PyPI)

The Python Package Index (PyPI) hosts packages so they can be easily installed with pip, uv, and equivalent tools.

  1. Register for an account here
  2. Navigate to https://pypi.org/manage/account and make sure you have verified your email address. A verification email might not have been sent by default, so you might have to click the "options" dropdown next to your address to get to the "re-send verification email" button
  3. 2-Factor authentication is required for PyPI since the end of 2023 (see this blog post from PyPI). This means you have to first issue account recovery codes, then set up 2-factor authentication
  4. Issue an API token from https://pypi.org/manage/account/token

This only needs to be done once per developer.

Configuring your machine's connection to PyPI

This needs to be done once per machine.

$ uv tool install keyring
$ keyring set https://upload.pypi.org/legacy/ __token__
$ keyring set https://test.pypi.org/legacy/ __token__

Note that this deprecates previous workflows using .pypirc.

📦 Making a Release

Uploading to PyPI

After installing the package in development mode and installing tox with uv tool install tox --with tox-uv or python3 -m pip install tox tox-uv, run the following from the console:

$ tox -e finish

This script does the following:

  1. Uses bump-my-version to switch the version number in the pyproject.toml, CITATION.cff, src/dalia_dif/version.py, and docs/source/conf.py to not have the -dev suffix
  2. Packages the code in both a tar archive and a wheel using uv build
  3. Uploads to PyPI using uv publish.
  4. Pushes to GitHub. You'll need to make a release corresponding to the commit where the version was bumped.
  5. Bumps the version to the next patch. If you made big changes and want to bump the version by minor, you can run tox -e bumpversion -- minor afterwards.
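The version handling in steps 1 and 5 can be sketched with plain string manipulation (an illustration of the versioning convention, not what bump-my-version actually does internally):

```python
def finish_version(dev_version: str) -> str:
    """Strip the -dev suffix for release, e.g. '0.0.20-dev' -> '0.0.20'."""
    return dev_version.removesuffix("-dev")

def next_patch_dev(version: str) -> str:
    """Bump to the next patch development version, e.g. '0.0.20' -> '0.0.21-dev'."""
    major, minor, patch = map(int, version.split("."))
    return f"{major}.{minor}.{patch + 1}-dev"

print(next_patch_dev(finish_version("0.0.20-dev")))  # 0.0.21-dev
```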

Releasing on GitHub

  1. Navigate to https://github.com/data-literacy-alliance/dalia-dif/releases/new to draft a new release
  2. Click the "Choose a Tag" dropdown and select the tag corresponding to the release you just made
  3. Click the "Generate Release Notes" button to get a quick outline of recent changes. Modify the title and description as you see fit
  4. Click the big green "Publish Release" button

This will trigger Zenodo to assign a DOI to your release as well.

Updating Package Boilerplate

This project uses cruft to keep boilerplate (i.e., configuration, contribution guidelines, documentation configuration) up-to-date with the upstream cookiecutter package. Install cruft with either uv tool install cruft or python3 -m pip install cruft then run:

$ cruft update

More info on Cruft's update command is available here.
