Utilities for working with the linked data service LINDAS of the Swiss Federal Administration. Includes modules for working with cubes.

pylindas

About

pylindas is a package to build and publish linked data such as cubes as defined by cube.link, a schema for describing structured tabular data in RDF. It offers an alternative to the Cube-Creator. Currently this project is closely tied to LINDAS, the Swiss Federal Linked Data Service.

For further information, please refer to our Wiki.

Installation

There are two ways to install this package, locally or through the Python Package Index (PyPI).

Locally

Clone this repository and cd into the directory. You can now install this package locally on your machine; we advise using a virtual environment to avoid conflicts with other projects. Additionally, install all dependencies as described in requirements.txt:

pip install -e .
pip install -r requirements.txt

Published Version

You can install this package through pip without cloning the repository:

pip install pylindas

Contributing and Suggestions

If you wish to contribute to this project, feel free to clone this repository and open a pull request to be reviewed and merged.

Alternatively, feel free to open an issue with a suggestion for what we could implement. We have laid out a rough roadmap of upcoming features in our Timetable.

Functionality and structure

This package consists of multiple submodules.

pycube

To avoid the feeling of a black box, our philosophy is to make the construction of cubes modular. The process takes place in multiple steps, outlined below.

  1. Initialization
from pylindas.pycube import Cube

cube = Cube(dataframe: pd.DataFrame, cube_yaml: dict, shape_yaml: dict)

This step sets up some needed background information about the cube.

  2. Mapping
cube.prepare_data()

Adds observation URIs and applies the mappings as described in the shape yaml.
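As an illustration of what such a mapping does conceptually (the codes and URIs below are made-up examples, not the pycube API or a real shape yaml):

```python
# Hypothetical value mapping, in the spirit of what a shape yaml describes:
# raw codes in the data frame are replaced by full URIs before serialization.
mapping = {
    "ZH": "https://example.org/canton/ZH",  # assumed example URIs
    "BE": "https://example.org/canton/BE",
}
raw_values = ["ZH", "BE", "ZH"]
mapped = [mapping[v] for v in raw_values]
print(mapped)  # the three codes replaced by their URIs
```

In pycube this happens column by column, driven by the mappings declared in the shape yaml.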

  3. Write cube:Cube
cube.write_cube()

Writes the cube:Cube.

  4. Write cube:Observation
cube.write_observations()

Writes the cube:Observations and the cube:ObservationSet. The observation URIs are written as <cube_URI/observations/[list_of_key_dimensions]>, which should avoid conflicts in their uniqueness.
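A minimal sketch of that URI pattern (the function name and the exact joining rule are assumptions for illustration, not the pycube implementation):

```python
def observation_uri(cube_uri: str, key_values: list) -> str:
    # Join the key dimension values into the path segment after /observations/.
    suffix = "_".join(str(v) for v in key_values)
    return f"{cube_uri}/observations/{suffix}"

print(observation_uri("https://example.org/cube", [2024, "ZH"]))
# -> https://example.org/cube/observations/2024_ZH
```

Because the key dimensions uniquely identify a row, two observations can only collide if their key values do.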

  5. Write cube:ObservationConstraint
cube.write_shape()

Writes the cube:ObservationConstraint.

The full workflow

# Write the cube
cube = Cube(dataframe: pd.DataFrame, cube_yaml: dict, shape_yaml: dict)
cube.prepare_data()
cube.write_cube()
cube.write_observations()
cube.write_shape()

# Upload the cube
cube.upload(endpoint: str, named_graph: str)

For an upload, use cube.upload(endpoint: str, named_graph: str) with the proper endpoint and named_graph.

A lindas.ini file is read for this step, containing this information as well as a password. It has the following structure:

[TEST]
endpoint=https://stardog-test.cluster.ldbar.ch
username=a-lindas-user-name
password=something-you-don't-need-to-see;)

Additional sections hold the corresponding information for the other environments.
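Such a file can be read with Python's standard configparser; a sketch (pylindas does this internally, and the credentials below are placeholders):

```python
import configparser

# Parse a lindas.ini-style config; in real use you would call
# config.read("lindas.ini") instead of read_string.
config = configparser.ConfigParser()
config.read_string("""
[TEST]
endpoint=https://stardog-test.cluster.ldbar.ch
username=a-lindas-user-name
password=placeholder
""")

test_env = config["TEST"]
print(test_env["endpoint"])  # https://stardog-test.cluster.ldbar.ch
```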

Command line

A command-line utility is also available. It expects the data and its description to be stored in an opinionated directory layout and helps you perform common operations.

Directory Layout

The directory should be structured as follows:

  • data.csv: This file contains the observations.
  • description.json or description.yml: This file contains the cube and dimension descriptions.

Command Line Usage

For example, to serialize the data, use:

python cli.py serialize <input_directory> <output_ttl_file>

For additional help and options, you can use:

python cli.py --help

Fetching from data sources

Datasets can also be downloaded from other data sources. The functionality is currently basic, but it may be extended in the future:

  • Only datasets from data.europa.eu are supported
  • Only datasets with a Frictionless datapackage are supported

See the Frictionless documentation for more information.
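A Frictionless datapackage is a plain datapackage.json describing the dataset's resources. A minimal stdlib sketch of reading one (the field names follow the Frictionless Data Package spec; the values are made up):

```python
import json

# Minimal datapackage.json: "name", "resources" and "path" are spec fields.
package = json.loads("""
{
  "name": "corona",
  "resources": [
    {"name": "data", "path": "data.csv", "format": "csv"}
  ]
}
""")

# Collect the file paths of all resources in the package.
paths = [resource["path"] for resource in package["resources"]]
print(paths)  # ['data.csv']
```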

python fetch.py 'https://data.europa.eu/data/datasets/fc49eebf-3750-4c9c-a29e-6696eb644362?locale=en' example/corona/

Examples

Multiple cube examples are available in the example directory.

$ python cli.py example list
corona: Corona Numbers Timeline
kita: Number of kids in day care facilities
wind: Wind turbines (WKA) operated per year in Schleswig-Holstein

To load an example into a Fuseki database, you can use the load subcommand of the example command.

$ python cli.py example load kita

There is a start-fuseki command that can be used to start a Fuseki server containing data from the examples.

$ python cli.py example start-fuseki
