
A polymorphic, schema-managed, semi-structured, cross-linked data dictionary builder... BINGO!


Datatables - structured data library based on schemas

homepage

Dev

  1. poetry install
  2. poetry shell
    -- Make changes --
  3. poetry run pytest
  4. poetry run black bstk_datatables
  5. poetry run flake8 bstk_datatables
    -- Commit & Push --

Install

pip install bstk-datatables

Overview

Datatables act as an intermediary between Marshmallow structures and user-defined data storage structures.
It is designed to provide "just enough" side-channel structure to facilitate building a dynamic schema (and connecting with "other" interfaces) without losing the advantages afforded by static Marshmallow schemas.

Schema

Schema models are:

  • Schema: A collection of fields and references that make up a partial or complete entry
  • SchemaField: A basic instruction container representing a single value
  • SchemaFieldFormat: The specific instructions for how the field should be collected, represented, formatted and stored
  • SchemaValuesError: The only type of exception raised during schema validation

These schemas and fields are mapped to equivalent Marshmallow structures, which provide the entry value validation mechanisms. ref: Schema.process_values()
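The mapping idea can be sketched in plain Python. The classes below are hypothetical stand-ins (not the bstk_datatables API): each field's validator plays the role of the generated Marshmallow field, and process_values raises a single exception type, mirroring SchemaValuesError.

```python
# Illustrative sketch only -- hypothetical stand-ins, not the bstk_datatables API.
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List


class SchemaValuesError(Exception):
    """Raised when entry values fail schema validation (the single exception type)."""


@dataclass
class SchemaField:
    code: str
    validator: Callable[[Any], bool]


@dataclass
class Schema:
    code: str
    fields: List[SchemaField] = field(default_factory=list)

    def process_values(self, values: Dict[str, Any]) -> None:
        # Each validator stands in for the equivalent Marshmallow field check.
        errors = {
            f.code: "invalid"
            for f in self.fields
            if f.code in values and not f.validator(values[f.code])
        }
        if errors:
            raise SchemaValuesError(errors)


schema = Schema("base", [SchemaField("base/value1", lambda v: isinstance(v, str))])
schema.process_values({"base/value1": "XG230"})  # valid values pass silently
```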

Entry

An Entry is a collection of field values, reference data, connector references and schema links.

  • .schemata is a list of Schema.code values
  • .table_id is a link back to a Table.uuid
  • .references and .connector_references are unrestricted containers. Two containers are provided to separate "core" references from "free-form" references.
  • .values is a dict of Field.code => value pairs that conform to the listed schemata
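Assembled, an entry's payload might look like the plain-data sketch below (values are invented for illustration):

```python
# Plain-data sketch of an Entry's attributes (illustrative values only).
entry_data = {
    "schemata": ["base"],  # list of Schema.code values
    "table_id": "9b1de2c0-0000-0000-0000-000000000000",  # link back to Table.uuid
    "references": {"entity_uuid": "..."},  # "core" references
    "connector_references": {"connector1": "connector_ref"},  # "free-form" references
    "values": {"base/value1": "XG230"},  # Field.code => value
}
```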

Table

A Table corrals one or more Entry and shapes them towards one or more Schema.

  • .schemata is a list of Schema.code values that all entries must inherit
  • .references and .connectors are unrestricted containers. Two containers are provided to separate "core" references from "free-form" references (and allow correlation with table entries).
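The inheritance constraint can be expressed as a simple set check. This is a sketch of the rule in plain Python, not the library's implementation:

```python
# Sketch of the table/entry schemata constraint (plain data, not the library's API).
from typing import List

table_schemata = ["base"]  # schemata every entry in the table must inherit
entry_schemata = ["base", "extended"]  # an entry may add schemata of its own


def entry_fits_table(entry: List[str], table: List[str]) -> bool:
    # An entry belongs to a table only if it carries all of the table's schemata.
    return set(table) <= set(entry)
```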

Marshalling and Persistence

All core classes (and Enum) expose an export method which returns a dict.
The result of an export() can be unpacked and provided to the class constructor.

# Assumed imports (exact paths within bstk_datatables may differ):
from uuid import uuid4

from bstk_datatables import Entry, export


def test_entry_export():
    data = {
        "uuid": str(uuid4()),
        "table_id": str(uuid4()),
        "name": "Data Entry",
        "references": {"entity_uuid": str(uuid4())},
        "connector_references": {"connector1": "connector_ref"},
        "schemata": ["base"],
        "values": {"base/value1": "XG230"},
    }
    entry = Entry(**data)
    exported = export(entry)
    assert exported == data

The simplest way to handle data persistence is to encapsulate class instantiation and the export method of the relevant class in an ORM or ODM framework.
MergedSchema does not provide an export mechanism because it is not a first-class citizen; it is designed to work with established Schema structures.

This test provides an example of how to implement persistence with flat files.
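A minimal flat-file sketch of the same pattern, using plain dicts as stand-ins for export() output (not the library's API):

```python
# Flat-file persistence sketch: plain dicts stand in for export() output.
import json
import tempfile
from pathlib import Path


def save(path: Path, exported: dict) -> None:
    # Persist the dict produced by export() as JSON.
    path.write_text(json.dumps(exported))


def load(path: Path) -> dict:
    # Returns a dict ready to unpack into a constructor, e.g. Entry(**load(path)).
    return json.loads(path.read_text())


path = Path(tempfile.mkdtemp()) / "entry.json"
record = {"name": "Data Entry", "schemata": ["base"], "values": {"base/value1": "XG230"}}
save(path, record)
assert load(path) == record  # round-trips cleanly
```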

Extras

MergedSchema

Tables and Entries support more than a single schema reference.
MergedSchema exists to facilitate multi-schema validation and field ordering.

Provide Dict[Schema.code, Schema] as schemata when initialising a MergedSchema and it will:

  1. Process the schema in order
  2. De-dupe fields with the same code (If a later schema includes a field with the same code as a previously loaded schema - that field will be skipped)
  3. Provide a validation mechanism for entries
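The ordering and de-dupe steps above can be sketched in plain Python (a stand-in, not the library's implementation):

```python
# Sketch of MergedSchema's ordered merge with de-dupe by field code.
from typing import Dict, List


def merge_fields(schemata: Dict[str, List[dict]]) -> List[dict]:
    """Process schemata in order, skipping later fields whose code was already seen."""
    seen, merged = set(), []
    for schema_code, fields in schemata.items():
        for f in fields:
            if f["code"] in seen:
                continue  # later field with a duplicate code is skipped
            seen.add(f["code"])
            merged.append(f)
    return merged


merged = merge_fields({
    "base": [{"code": "name"}, {"code": "size"}],
    "extended": [{"code": "name"}, {"code": "colour"}],  # "name" is skipped
})
# merged field codes: ["name", "size", "colour"]
```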

Enum

Enums are used within schemas as de-duped lookups. Multiple schema fields can use the same Enum for shaping values.

Usage:

  1. Provide an Enum.code as a lookup instead of a values list when supplying SchemaFieldFormat to a SchemaField.
  2. Provide the instantiated Enum to Schema.attach_lookup on a compiled Schema or MergedSchema.

or

  1. Provide an instantiated Enum as a lookup instead of a values list when supplying SchemaFieldFormat to a SchemaField.
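The lookup idea can be sketched with plain data (hypothetical names, not the library's API): one shared Enum constrains values for every field that references it.

```python
# Hypothetical stand-in for an Enum lookup shared by multiple schema fields.
colour_enum = {"code": "colours", "values": ["red", "green", "blue"]}


def validate_against_lookup(value: str, lookup: dict) -> bool:
    # Any field pointing at this lookup accepts only its listed values.
    return value in lookup["values"]
```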
