Utilities for working with the linked data service LINDAS of the Swiss Federal Archives. Includes modules for working with cubes.

lindaspy

About

lindaspy is a package to build and publish linked data such as cubes as defined by cube.link, a schema for describing structured, tabular data in RDF. It offers an alternative to the Cube-Creator. Currently, this project is closely tied to LINDAS, the Swiss Federal Linked Data Service.

For further information, please refer to our Wiki.

Installation

There are two ways to install this package, locally or through the Python Package Index (PyPI).

Locally

Clone this repository and cd into the directory. You can now install this package locally on your machine - we advise using a virtual environment to avoid conflicts with other projects. Additionally, install all dependencies as listed in requirements.txt:

pip install -e .
pip install -r requirements.txt

Published Version

You can install this package through pip without cloning the repository.

pip install lindaspy

Contributing and Suggestions

If you wish to contribute to this project, feel free to clone this repository and open a pull request to be reviewed and merged.

Alternatively, feel free to open an issue with a suggestion on what we could implement. We have laid out a rough roadmap for upcoming features on our Timetable.

Functionality and structure

This package consists of multiple submodules.

pycube

To avoid the feeling of a black box, our philosophy is to make the construction of cubes modular. The process takes place in multiple steps, outlined below.

  1. Initialization
from lindaspy.pycube import Cube

cube = Cube(dataframe: pd.DataFrame, cube_yaml: dict, shape_yaml: dict)

This step sets up the needed background information about the cube.

  2. Mapping
cube.prepare_data()

Adds observation URIs and applies the mappings as described in the shape yaml.

  3. Write cube:Cube
cube.write_cube()

Writes the cube:Cube.

  4. Write cube:Observation
cube.write_observations()

Writes the cube:Observations and the cube:ObservationSet. The URIs for the observations are written as <cube_URI/observations/[list_of_key_dimensions]>, which should ensure their uniqueness and avoid conflicts.

  5. Write cube:ObservationConstraint
cube.write_shape()

Writes the cube:ObservationConstraint.
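
As an illustration of the observation URI scheme mentioned in step 4, the key dimension values are appended to the cube URI. The helper below is a hypothetical sketch of that scheme, not the package's actual implementation:

```python
# Hypothetical sketch of the observation URI scheme -- the real join
# logic inside lindaspy may differ.

def observation_uri(cube_uri: str, key_dimension_values: list) -> str:
    """Build a unique observation URI from the cube URI and the
    values of the key dimensions."""
    suffix = "/".join(str(value) for value in key_dimension_values)
    return f"{cube_uri}/observations/{suffix}"

print(observation_uri("https://example.org/cube", ["2023", "ZH"]))
# https://example.org/cube/observations/2023/ZH
```

Because every combination of key dimension values occurs at most once in a well-formed cube, URIs built this way do not collide.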

The full workflow

# Write the cube
cube = Cube(dataframe: pd.DataFrame, cube_yaml: dict, shape_yaml: dict)
cube.prepare_data()
cube.write_cube()
cube.write_observations()
cube.write_shape()

# Upload the cube
cube.upload(endpoint: str, named_graph: str)

For an upload, use cube.upload(endpoint: str, named_graph: str) with the proper endpoint and named graph.

A lindas.ini file is read for this step, containing this information as well as a password. It has the following structure:

[TEST]
endpoint=https://stardog-test.cluster.ldbar.ch
username=a-lindas-user-name
password=something-you-don't-need-to-see;)

Additional sections hold the information for the other environments.
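
Such a file can be parsed with Python's standard configparser module. A minimal sketch (the package may read the file differently; the credentials below are placeholders):

```python
import configparser

# Placeholder lindas.ini content -- in practice you would call
# config.read("lindas.ini") instead of read_string().
ini_content = """
[TEST]
endpoint=https://stardog-test.cluster.ldbar.ch
username=a-lindas-user-name
password=secret
"""

config = configparser.ConfigParser()
config.read_string(ini_content)

test_env = config["TEST"]
print(test_env["endpoint"])  # https://stardog-test.cluster.ldbar.ch
```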

Command line

A command line utility is available. It expects the data and its description to be stored in a directory in an opinionated layout, and helps you perform common operations.

Directory Layout

The directory should be structured as follows:

  • data.csv: This file contains the observations.
  • description.json or description.yml: This file contains the cube and dimension descriptions.
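
A description file of this kind could look as follows. The field names below are purely illustrative assumptions, not the actual schema - please refer to the Wiki for the real format:

```python
import json

# Hypothetical description.json -- field names are illustrative only.
description = json.loads("""
{
  "name": "Corona Numbers Timeline",
  "description": "Daily corona case numbers",
  "dimensions": {
    "date": {"type": "key"},
    "cases": {"type": "measure"}
  }
}
""")

print(description["name"])  # Corona Numbers Timeline
```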

Command Line Usage

For example, to serialize the data, use:

python cli.py serialize <input_directory> <output_ttl_file>

For additional help and options, you can use:

python cli.py --help

Fetching from data sources

It is possible to download datasets from other data sources. Right now the functionality is basic, but it may be extended in the future:

  • It supports only datasets coming from data.europa.eu
  • It supports only datasets with a Frictionless datapackage

See the Frictionless documentation for more information.

python fetch.py 'https://data.europa.eu/data/datasets/fc49eebf-3750-4c9c-a29e-6696eb644362?locale=en' example/corona/
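
A Frictionless datapackage is described by a datapackage.json file listing the dataset's resources. A minimal sketch of inspecting one (the data below is made up for illustration):

```python
import json

# Illustrative datapackage.json content following the Frictionless
# Data Package format: a top-level "resources" array of files.
datapackage = json.loads("""
{
  "name": "corona",
  "resources": [
    {"name": "cases", "path": "data/cases.csv", "format": "csv"}
  ]
}
""")

for resource in datapackage["resources"]:
    print(resource["name"], "->", resource["path"])
# cases -> data/cases.csv
```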

Examples

Multiple cube examples are available in the example directory.

$ python cli.py example list
corona: Corona Numbers Timeline
kita: Number of kids in day care facilities
wind: Wind turbines operated WKA per year in Schleswig-Holstein

To load an example in a Fuseki database, you can use the load subcommand of the example command.

$ python cli.py example load kita

There is a start-fuseki command that can be used to start a Fuseki server containing data from the examples.

$ python cli.py example start-fuseki
