A polymorphic, schema-managed, semi-structured, cross-linked data dictionary builder. BINGO!

Project description

Datatables - structured data library based on schemas

homepage

Dev

  1. poetry install
  2. poetry shell
    -- Make changes --
  3. poetry run pytest
  4. poetry run black bstk_datatables
  5. poetry run flake8 bstk_datatables
    -- Commit & Push --

Install

pip install bstk-datatables

Overview

Datatables act as an intermediary between Marshmallow structures and user-defined data storage structures.
The library is designed to provide "just enough" side-channel structure to facilitate building a dynamic schema (and connecting with "other" interfaces) without losing the advantages afforded by static Marshmallow schemas.

Schema

The schema models are:

  • Schema: A collection of fields and references that make up a partial or complete entry
  • SchemaField: A basic instruction container representing a single value
  • SchemaFieldFormat: The specific instructions for how the field should be collected, represented, formatted and stored
  • SchemaValuesError: The only type of exception raised during schema validation

These schemas and fields are mapped to equivalent Marshmallow structures, which provide the entry-value validation mechanisms (see Schema.process_values()).
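The validation flow can be sketched in outline with plain Python. This is illustrative only, not the library's internals: the FIELDS table, the instruction keys, and the error shape are assumptions (the library raises SchemaValuesError rather than ValueError).

```python
# Illustrative sketch: field-format instructions drive per-value validation,
# roughly what Schema.process_values() does. Names here are assumptions.
FIELDS = {
    "base/value1": {"type": str, "required": True},
    "base/count": {"type": int, "required": False},
}


def process_values(values: dict) -> None:
    errors = {}
    for code, fmt in FIELDS.items():
        if code not in values:
            if fmt["required"]:
                errors[code] = "missing required value"
            continue
        if not isinstance(values[code], fmt["type"]):
            errors[code] = f"expected {fmt['type'].__name__}"
    if errors:
        # The real library raises SchemaValuesError here.
        raise ValueError(errors)


process_values({"base/value1": "XG230"})  # valid: no exception
```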

Entry

An Entry is a collection of field values, reference data, connector references and schema links.

  • .schemata is a list of Schema.code values
  • .table_id is a link back to a Table.uuid
  • .references and .connector_references are unrestricted containers. Two containers are provided to separate "core" references from "free-form" references.
  • .values is a dict of Field.code => value pairs that conform to the listed schemata

Table

A Table corrals one or more Entry records and shapes them towards one or more Schema.

  • .schemata is a list of Schema.code values that all entries must inherit
  • .references and .connectors are unrestricted containers. Two containers are provided to separate "core" references from "free-form" references (and to allow correlation with table entries).
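The inheritance rule above can be sketched with plain dicts standing in for Table and Entry (illustrative only; the real objects carry more state):

```python
# Sketch: every entry in a table must carry (at least) the table's schemata.
def entry_conforms(table: dict, entry: dict) -> bool:
    return set(table["schemata"]).issubset(entry["schemata"])


table = {"schemata": ["base"]}
ok = entry_conforms(table, {"schemata": ["base", "extra"]})       # True
not_ok = entry_conforms(table, {"schemata": ["extra"]})           # False
```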

Marshalling and Persistence

All core classes (and Enum) expose an export method that returns a dict.
The result of export() can be unpacked and passed back to the class constructor.

from uuid import uuid4

from bstk_datatables import Entry, export  # adjust imports to your layout


def test_entry_export():
    data = {
        "uuid": str(uuid4()),
        "table_id": str(uuid4()),
        "name": "Data Entry",
        "references": {"entity_uuid": str(uuid4())},
        "connector_references": {"connector1": "connector_ref"},
        "schemata": ["base"],
        "values": {"base/value1": "XG230"},
    }
    entry = Entry(**data)
    exported = export(entry)
    assert exported == data

The simplest way to handle data persistence is to encapsulate class instantiation and the export method of the relevant class in an ORM or ODM framework.
MergedSchema does not provide an export mechanism because it is not a first-class citizen; it is designed to work with established Schema structures.

This test provides an example of how to implement persistence with flat files.
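In outline, flat-file persistence amounts to writing the export() dict out as JSON and unpacking it again later. The sketch below uses a plain dict; real code would rebuild the instance by unpacking the loaded dict into the class constructor:

```python
# Hedged sketch of flat-file persistence for an export() dict.
import json
import tempfile
from pathlib import Path


def save(path: Path, exported: dict) -> None:
    # Persist the dict produced by export().
    path.write_text(json.dumps(exported))


def load(path: Path) -> dict:
    # Load the dict back; unpack into the constructor to rebuild the instance.
    return json.loads(path.read_text())


path = Path(tempfile.mkdtemp()) / "entry.json"
save(path, {"uuid": "abc", "schemata": ["base"], "values": {}})
restored = load(path)
```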

Extras

MergedSchema

Tables and Entries support more than a single schema reference.
MergedSchema exists to facilitate multi-schema validation and field ordering.

Provide a Dict[Schema.code, Schema] as schemata when initialising a MergedSchema and it will:

  1. Process the schema in order
  2. De-dupe fields with the same code (if a later schema includes a field with the same code as a previously loaded schema, that field is skipped)
  3. Provide a validation mechanism for entries
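The ordering and de-dupe rules above can be sketched as follows (illustrative only, not the library's internals; plain strings stand in for field definitions):

```python
# Sketch of MergedSchema's merge rule: schemata are processed in order, and a
# field code seen earlier wins -- later duplicates are skipped.
def merge_fields(schemata: dict) -> dict:
    merged = {}
    for schema_code, fields in schemata.items():
        for field_code, field in fields.items():
            if field_code in merged:
                continue  # duplicate code: later field is skipped
            merged[field_code] = field
    return merged


merged = merge_fields({
    "base": {"base/value1": "text"},
    "extra": {"base/value1": "number", "extra/flag": "bool"},
})
# base/value1 keeps the "base" definition; extra/flag is added
```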

Enum

Enums are used within schemas as de-duped lookups. Multiple schema fields can use the same Enum for shaping values.

Usage:

  1. Provide an Enum.code as a lookup instead of a values list when supplying SchemaFieldFormat to a SchemaField.
  2. Provide the instantiated Enum to Schema.attach_lookup on a compiled Schema or MergedSchema.

or

  1. Provide an instantiated Enum as a lookup instead of a values list when supplying SchemaFieldFormat to a SchemaField.
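The lookup idea in outline (names and shapes here are illustrative assumptions, not the library's API): one named value set is defined once and shared by any number of fields.

```python
# Sketch: an Enum-style lookup is a named, de-duped value set that several
# schema fields can reference by code instead of repeating a values list.
COLOURS = {"code": "colours", "values": ["red", "green", "blue"]}


def validate_against_lookup(value, lookup: dict) -> bool:
    # A field configured with the "colours" lookup accepts only these values.
    return value in lookup["values"]
```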
