
Project description

Model-lib - pydantic base models with convenient dump methods

Installation

pip install 'model-lib[full]'

Model-lib tutorial: what classes to use as base classes, how to serialize them, and how to add metadata

  • A library built on top of pydantic
  • Both pydantic v1 and v2 are supported
  • The models Event and Entity subclass pydantic.BaseModel
    • Event is immutable
    • Entity is mutable
    • Their specific configuration includes:
      • Automatic registration for dumping to the various formats
      • Support for different serializers: yaml/json/pretty_json/toml
      • use_enum_values
      • see model_base for details
  • Use dump(model|payload, format) -> str
    • if you are using an Event|Entity, it should "just work"
    • alternatively, register a custom dumper with register_dumper(instance_type: Type[T], dump_call: DumpCall) (see example below)
  • Use parse_payload(payload, format) to parse to a dict or list; the payload can be:
    • bytes
    • str
    • pathlib.Path (format is not necessary if the file has one of the extensions .yaml|.yml|.json|.toml)
    • dict|list, which are returned directly
    • register_parser adds support for other payload types, e.g., a parser for KafkaMessage
  • Use parse_model(payload, t=Type, format) to parse a payload and create a model instance
    • t is not necessary if the class name is stored in metadata.model_name (see example below)
    • format is not necessary when parsing from a file with a known extension
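The extension-based format inference described for pathlib.Path inputs can be sketched with a small stdlib-only helper. This is a hypothetical illustration: the name infer_format and the mapping below are assumptions, not model-lib's actual internals.

```python
from pathlib import Path

# hypothetical extension-to-format mapping, mirroring the extensions listed above
_EXT_TO_FORMAT = {".yaml": "yaml", ".yml": "yaml", ".json": "json", ".toml": "toml"}


def infer_format(path: Path) -> str:
    """Guess the parse format from the file extension, as parse_payload does for Path inputs."""
    ext = path.suffix
    if ext not in _EXT_TO_FORMAT:
        raise ValueError(f"cannot infer format for {path.name!r}; pass format explicitly")
    return _EXT_TO_FORMAT[ext]


assert infer_format(Path("config.yml")) == "yaml"
assert infer_format(Path("settings.toml")) == "toml"
```

For files without a recognized extension, the real library expects the format to be passed explicitly, which the ValueError branch above mimics.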
from datetime import datetime

from freezegun import freeze_time
from pydantic import Field

from model_lib import (
    Entity,
    Event,
    dump,
    dump_with_metadata,
    parse_model,
    FileFormat,
    register_dumper,
)
from model_lib.serialize.parse import register_parser, parse_payload

dump_formats = list(FileFormat)
expected_dump_formats: list[str] = [
    "json",
    "pretty_json",
    "yaml",
    "yml",
    "json_pydantic",
    "pydantic_json",
    "toml",
    "toml_compact",
]
missing_dump_formats = set(dump_formats) - set(expected_dump_formats)
assert not missing_dump_formats, f"found missing dump formats: {missing_dump_formats}"


class Birthday(Event):
    """
    >>> birthday = Birthday()
    """

    date: datetime = Field(default_factory=datetime.utcnow)


class Person(Entity):
    """
    >>> person = Person(name="espen", age=99)
    >>> person.age += 1 # mutable
    >>> person.age
    100
    """

    name: str
    age: int


_pretty_person = """\
{
  "age": 99,
  "name": "espen"
}"""


def test_show_dumping():
    with freeze_time("2020-01-01"):
        birthday = Birthday(date=datetime.utcnow())
        # can dump non-primitives e.g., datetime
        assert dump(birthday, "json") == '{"date":"2020-01-01T00:00:00"}'
    person = Person(name="espen", age=99)
    assert dump(person, "yaml") == "name: espen\nage: 99\n"
    assert dump(person, "pretty_json") == _pretty_person


_metadata_dump = """\
model:
  name: espen
  age: 99
metadata:
  model_name: Person
"""


def test_show_parsing(tmp_path):
    path_json = tmp_path / "example.json"
    path_json.write_text(_pretty_person)
    person = Person(name="espen", age=99)
    assert parse_model(path_json, t=Person) == person
    assert dump_with_metadata(person, format="yaml") == _metadata_dump
    path_yaml = tmp_path / "example.yaml"
    path_yaml.write_text(_metadata_dump)
    assert parse_model(path_yaml) == person  # metadata is used to find the class


class CustomDumping:
    def __init__(self, first_name: str, last_name: str):
        self.first_name = first_name
        self.last_name = last_name

    def __eq__(self, other):
        if isinstance(other, CustomDumping):
            return self.__dict__ == other.__dict__
        return super().__eq__(other)


def custom_dump(custom: CustomDumping) -> dict:
    return dict(full_name=f"{custom.first_name} {custom.last_name}")


register_dumper(CustomDumping, custom_dump)


class CustomKafkaPayload:
    def __init__(self, body: str, topic: str):
        self.topic = topic
        self.body = body


def custom_parse_kafka(payload: CustomKafkaPayload, format: str) -> dict | list:  # use Union[dict, list] on Python < 3.10
    return parse_payload(payload.body, format)


register_parser(CustomKafkaPayload, custom_parse_kafka)


def test_custom_dump():
    instance = CustomDumping("Espen", "Python")
    assert dump(instance, "json") == '{"full_name":"Espen Python"}'
    payload = CustomKafkaPayload(
        body='{"first_name": "Espen", "last_name": "Python"}', topic="some-topic"
    )
    assert parse_model(payload, t=CustomDumping) == instance
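The metadata-based lookup that lets parse_model recover the class from metadata.model_name can be illustrated with a stdlib-only sketch. All names here (register, _MODEL_REGISTRY, and the simplified dump/parse functions) are hypothetical stand-ins, not the library's internals:

```python
import json

_MODEL_REGISTRY: dict[str, type] = {}


def register(cls: type) -> type:
    # record the class under its name, as model-lib does for Entity/Event subclasses
    _MODEL_REGISTRY[cls.__name__] = cls
    return cls


@register
class User:
    def __init__(self, name: str, age: int):
        self.name, self.age = name, age

    def __eq__(self, other):
        return isinstance(other, User) and vars(self) == vars(other)


def dump_with_metadata(model) -> str:
    # embed the class name alongside the payload, like the yaml example above
    return json.dumps(
        {"model": vars(model), "metadata": {"model_name": type(model).__name__}}
    )


def parse_model(payload: str):
    doc = json.loads(payload)
    cls = _MODEL_REGISTRY[doc["metadata"]["model_name"]]  # metadata selects the class
    return cls(**doc["model"])


user = User("espen", 99)
assert parse_model(dump_with_metadata(user)) == user  # round-trips without passing t=
```

The real library stores the registry as part of its model base classes; the point of the sketch is only that the embedded model_name is enough to dispatch parsing without an explicit type argument.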
