Structured, metadata-enhanced data storage.

dsch

Introduction

Dsch provides a way to store data and its metadata in a structured, reliable way. It is built upon well-known data storage engines, such as the HDF5 file format, providing performance and long-term stability.

The core feature is the schema-based approach to data storage, which means that a pre-defined schema specification is used to determine:

  • which data fields are available

  • the (hierarchical) structure of data fields

  • metadata of the stored values (e.g. physical units)

  • expected data types and constraints for the stored values

In fact, this is similar to an API specification, but it can be attached to and stored with the data. Programs writing datasets benefit from data validation and the high-level interface. Reading programs can determine the given data’s schema upfront, and process accordingly. This is especially useful with schemas evolving over time.
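The idea can be illustrated with a minimal sketch. Note that this is not the actual dsch API; the `ScalarSchema` class and its parameters are hypothetical, and only demonstrate how a schema node can bundle metadata (a physical unit) with type and constraint checks:

```python
class ScalarSchema:
    """Schema node describing one numeric field with a unit and bounds."""

    def __init__(self, unit, min_value=None, max_value=None):
        self.unit = unit            # metadata stored with the value
        self.min_value = min_value  # constraint: lower bound
        self.max_value = max_value  # constraint: upper bound

    def validate(self, value):
        """Raise if the value violates the schema's type or constraints."""
        if not isinstance(value, (int, float)):
            raise TypeError(f'expected a number, got {type(value).__name__}')
        if self.min_value is not None and value < self.min_value:
            raise ValueError(f'{value} is below the minimum {self.min_value}')
        if self.max_value is not None and value > self.max_value:
            raise ValueError(f'{value} is above the maximum {self.max_value}')


# A hierarchical schema: field names map to schema nodes.
schema = {
    'temperature': ScalarSchema(unit='K', min_value=0),
    'pressure': ScalarSchema(unit='Pa', min_value=0),
}

# Writing programs validate against the schema before storing.
data = {'temperature': 293.15, 'pressure': 101325}
for name, node in schema.items():
    node.validate(data[name])  # raises on any constraint violation
```

Because the schema travels with the data, a reading program can inspect the same field names, units, and constraints without guessing.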

For persistent storage, dsch supports multiple storage engines via its backends, all through a single, transparent interface. Usually, no client code changes are required to support a new backend, and custom backends can easily be added to dsch. Currently, backends exist for these storage engines:

  • HDF5 files (via h5py)

  • MATLAB MAT files (via scipy)

  • in-memory storage (for handling data without a file on disk)

Note that dsch is only a thin layer, so that users can still benefit from the performance of the underlying storage engine. Also, files created with dsch can always be opened directly (i.e. without dsch) and still provide all relevant information, even the metadata!
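One common way to keep client code unchanged across backends is dispatch by file extension. The sketch below is a hypothetical illustration of that pattern, not dsch's actual implementation; the registry, decorator, and class names are invented for this example:

```python
import os

# Registry mapping file extensions to backend classes.
BACKENDS = {}

def register_backend(extension):
    """Class decorator that registers a backend for a file extension."""
    def decorator(cls):
        BACKENDS[extension] = cls
        return cls
    return decorator

@register_backend('.h5')
class HDF5Backend:
    name = 'hdf5'

@register_backend('.mat')
class MatBackend:
    name = 'mat'

def backend_for(path):
    """Select a backend from the file extension, transparently to the caller."""
    ext = os.path.splitext(path)[1]
    try:
        return BACKENDS[ext]()
    except KeyError:
        raise ValueError(f'no backend registered for {ext!r}')
```

With this pattern, adding a new storage engine only requires registering one more backend class; code that calls `backend_for('measurement.h5')` never changes.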

Reasoning

Dsch is a response to the challenges in low-level data acquisition scenarios, which are commonly found in labs at universities or R&D departments. Frequent changes in both hardware and software are commonplace in these environments, and since those changes are often made by different people, the data acquisition hardware, software and data consumption software tend to get out of sync. At the same time, datasets are often stored (and used!) for many years, which makes backward compatibility a significant issue.

Dsch aims to counteract these problems by making the data exchange process more explicit. Using pre-defined schemas preserves backward compatibility for as long as possible, and when it can no longer be retained, provides a clear way to detect (and properly handle) multiple schema versions. In addition, schema-based validation detects possible errors upfront, so that most non-security-related checks do not have to be re-implemented in data-consuming applications.
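One conceptual way to detect schema versions (not necessarily dsch's actual mechanism) is to store a stable hash of the schema specification alongside the data, so a reader can decide which handler to use before processing. The specification dictionaries and handler names below are invented for illustration:

```python
import hashlib
import json

def schema_hash(spec):
    """Stable hash of a JSON-serializable schema specification."""
    canonical = json.dumps(spec, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

# Two hypothetical schema versions: v2 adds a pressure field.
SPEC_V1 = {'fields': {'temperature': {'unit': 'K'}}}
SPEC_V2 = {'fields': {'temperature': {'unit': 'K'},
                      'pressure': {'unit': 'Pa'}}}

# A writer stores the schema hash together with the data.
dataset = {'schema_hash': schema_hash(SPEC_V1),
           'data': {'temperature': 293.15}}

# A reader checks the stored hash to pick the right handling path.
if dataset['schema_hash'] == schema_hash(SPEC_V1):
    handler = 'v1 handler'
elif dataset['schema_hash'] == schema_hash(SPEC_V2):
    handler = 'v2 handler'
else:
    handler = 'unknown schema'
```

The key point is that the version check happens upfront, rather than failing deep inside data-consuming code when an expected field turns out to be missing.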

Note that dsch is targeted primarily at these low-level applications. When using high-level data processing or even data science and machine learning techniques, data is often pre-processed and aggregated with regard to a specific application, which often eliminates the need for some of dsch’s features, such as the metadata storage. One might think of dsch as the tool to handle data before it is filled into something like pandas.

Changelog

This project follows the guidelines of Keep a Changelog and adheres to Semantic Versioning.

0.2.1 - 2018-02-02

Changed

  • h5py and scipy, needed for HDF5 and MAT file support, respectively, are now listed as extras / optional dependencies in setup.py.

Fixed

  • Fix missing type conversion for Scalar in the inmem backend that caused validation to incorrectly fail in some cases.

0.2.0 - 2018-02-01

Added

  • New node type for bytes data.

  • In-memory backend, for handling data without needing e.g. a file on disk.

  • Support for copying data between different storages.

  • Support for creating new storages from existing ones, a.k.a. “save as”.

  • PseudoStorage abstraction class for unified data access in libraries.

  • Human-readable tree-representation of data nodes for use in interactive sessions.

  • Support the == operator for schema nodes.

Changed

  • Data nodes in Compilations and Lists can no longer be overwritten accidentally when trying to overwrite their stored value.

  • Improve structure and conciseness of docs.

  • Change List to evaluate emptiness recursively.

  • Replace generic exceptions like TypeError by custom dsch exceptions.

0.1.3 - 2018-01-11

Changed

  • Attempting to open a non-existent file now shows a sensible error message.

  • Attempting to create an existing file now shows a sensible error message.

Fixed

  • Fix error when handling partially filled compilations.

  • Fix typo in documentation.

0.1.2 - 2017-08-25

Fixed

  • Fix incorrect ordering of list items.

0.1.1 - 2017-06-09

Added

  • Cover additional topics in documentation.

Fixed

  • Fix error when handling single-element lists with mat backend.

0.1.0 - 2017-05-18

Added

  • First preview release.
