A polymorphic, schema-managed, semi-structured, cross-linked data dictionary builder... BINGO!

Project description

Datatables - structured data library based on schemas

homepage

Dev

  1. poetry install
  2. poetry shell
    -- Make changes --
  3. poetry run pytest
  4. poetry run black bstk_datatables
  5. poetry run flake8 bstk_datatables
    -- Commit & Push --

Install

pip install bstk-datatables

Overview

Datatables act as an intermediary between Marshmallow structures and user-defined data storage structures.
The library is designed to provide "just enough" side-channel structure to facilitate building a dynamic schema (and connecting with "other" interfaces) without losing the advantages afforded by static Marshmallow schemas.

Schema

Schema models are:

  • Schema: A collection of fields and references that make up a partial or complete entry
  • SchemaField: A basic instruction container representing a single value
  • SchemaFieldFormat: The specific instructions for how the field should be collected, represented, formatted and stored
  • SchemaValuesError: The only type of exception raised during schema validation

These schemas and fields are mapped to equivalent Marshmallow structures, which provide the entry value validation mechanisms (ref: Schema.process_values()).
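As an illustration of the kind of check that process_values() delegates to the generated Marshmallow structures, here is a minimal plain-Python sketch. The field codes, expected types, and the shape of SchemaValuesError below are assumptions for illustration, not the library's actual internals:

```python
# Sketch only: the real library builds Marshmallow schemas and validates
# entry values through Schema.process_values().

class SchemaValuesError(Exception):
    """Raised when supplied values do not conform to the schema (assumed shape)."""

# A hypothetical compiled schema: field code -> expected Python type
FIELDS = {
    "base/value1": str,
    "base/count": int,
}

def process_values(values: dict) -> None:
    """Reject missing fields and type mismatches, mirroring schema validation."""
    for code, expected in FIELDS.items():
        if code not in values:
            raise SchemaValuesError(f"missing field: {code}")
        if not isinstance(values[code], expected):
            raise SchemaValuesError(f"invalid type for {code}")

process_values({"base/value1": "XG230", "base/count": 3})  # conforms, no error
```

In the real library the per-field rules come from each SchemaFieldFormat rather than a hard-coded type map.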

Entry

An Entry is a collection of field values, references data, connector references and schema links.

  • .schemata is a list of Schema.code values
  • .table_id is a link back to a Table.uuid
  • .references and .connector_references are unrestricted containers. Two containers are provided to separate "core" references from "free-form" references.
  • .values is a dict of Field.code => value pairs that conform to the listed schemata

Table

A Table corrals one or more Entry and shapes them towards one or more Schema.

  • .schemata is a list of Schema.code values that all entries must inherit
  • .references and .connectors are unrestricted containers. Two containers are provided to separate "core" references from "free-form" references (and to allow correlation with table entries).
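The "all entries must inherit" rule can be sketched in plain Python. The function below is a stand-in for illustration, not part of the library's API:

```python
# Stand-in sketch: a table only accepts entries that carry all of its schemata.
def entry_allowed(table_schemata: list, entry_schemata: list) -> bool:
    """An entry is acceptable only if it inherits every schema the table requires."""
    return set(table_schemata).issubset(entry_schemata)

entry_allowed(["base"], ["base", "extended"])  # True: entry inherits "base"
entry_allowed(["base", "audit"], ["base"])     # False: entry is missing "audit"
```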

Marshalling and Persistence

All core classes (and Enum) expose an export method which returns a dict.
The result of an export() can be unpacked and provided to its class constructor.

from uuid import uuid4

# Import path is assumed; Entry and export may live in a submodule.
from bstk_datatables import Entry, export

def test_entry_export():
    data = {
        "uuid": str(uuid4()),
        "table_id": str(uuid4()),
        "name": "Data Entry",
        "references": {"entity_uuid": str(uuid4())},
        "connector_references": {"connector1": "connector_ref"},
        "schemata": ["base"],
        "values": {"base/value1": "XG230"},
    }
    entry = Entry(**data)
    exported = export(entry)
    assert exported == data

The simplest way to handle data persistence is to encapsulate class instantiation and the export method of the relevant class into an ORM or ODM framework.
MergedSchema does not provide an export mechanism because it is not a first-class citizen and is designed to work with established Schema structures.

This test provides an example of how to implement persistence with flat files.

Extras

MergedSchema

Tables and Entries support more than a single schema reference.
MergedSchema exists to facilitate multi-schema validation and field ordering.

Provide Dict[Schema.code, Schema] as schemata when initialising a MergedSchema and it will:

  1. Process the schemas in order
  2. De-dupe fields with the same code (if a later schema includes a field with the same code as a previously loaded schema, that field will be skipped)
  3. Provide a validation mechanism for entries
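The ordered de-dupe step can be sketched in plain Python. Here each schema is reduced to an ordered list of field codes, a simplification of the real Schema structure:

```python
def merge_fields(schemata: dict) -> list:
    """Process schemas in order; a field code seen in an earlier schema wins."""
    merged, seen = [], set()
    for _schema_code, field_codes in schemata.items():  # dicts preserve insertion order
        for field in field_codes:
            if field in seen:
                continue  # later duplicate: skipped, first definition stands
            seen.add(field)
            merged.append(field)
    return merged

merge_fields({"base": ["name", "value1"], "extended": ["value1", "notes"]})
# -> ["name", "value1", "notes"]  ("extended" duplicates "value1", so it is skipped)
```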

Enum

Enums are used within schemas as de-duped lookups. Multiple schema fields can use the same Enum to shape values.

Usage:

  1. Provide an Enum.code as a lookup instead of a values list when supplying SchemaFieldFormat to a schemafield.
  2. Provide the instantiated Enum to Schema.attach_lookup on a compiled Schema or MergedSchema.

or

  1. Provide an instantiated Enum as a lookup instead of a values list when supplying SchemaFieldFormat to a schemafield.
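The shared-lookup idea can be sketched in plain Python. The names below (a lookup registry, an attach step, a membership check) are illustrative stand-ins and not the library's Enum or attach_lookup API:

```python
# Illustrative sketch: one named lookup, registered once, shared by many fields.
LOOKUPS = {}  # lookup code -> allowed values, populated by an "attach" step

def attach_lookup(code: str, values: list) -> None:
    """Register a lookup's values once so every field referencing the code shares them."""
    LOOKUPS[code] = values

def value_valid(lookup_code: str, value) -> bool:
    """Check a field value against the shared lookup it references by code."""
    return value in LOOKUPS.get(lookup_code, [])

attach_lookup("colours", ["red", "green", "blue"])
# Two different schema fields can both reference the "colours" lookup:
value_valid("colours", "red")    # True
value_valid("colours", "mauve")  # False
```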
