
Declarative dataclass settings.

Project description

dataclass-settings


dataclass-settings intends to work with any PEP-681-compliant dataclass-like object, including but not limited to:

  • pydantic models
  • dataclasses
  • attrs classes

dataclass-settings owes its existence to pydantic-settings, in that pydantic-settings serves as the benchmark for dataclass-settings's feature set. However, it was born out of frustration with pydantic-settings's approach to implementing that feature set.

Example

from __future__ import annotations
from typing import Annotated
from dataclass_settings import load_settings, Env, Secret
from pydantic import BaseModel


class Example(BaseModel):
    env: Annotated[str, Env("ENVIRONMENT")] = "local"
    dsn: Annotated[str, Env("DSN"), Secret('dsn')] = "dsn://"

    sub_config: SubConfig


class SubConfig(BaseModel):
    nested: Annotated[int, Env("NESTED")] = 4


example: Example = load_settings(Example)

# or, if you want `nested` to be `SUB_CONFIG_NESTED`
example: Example = load_settings(Example, nested_delimiter='_')
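To make the mapping concrete, here is a hypothetical run of the example above. The variable names come straight from the Env annotations; the final assertion assumes a field's declared default is used when neither its environment variable nor its secret is present.

import os

# Values come directly from the variables named in the annotations.
os.environ["ENVIRONMENT"] = "production"
os.environ["NESTED"] = "8"

example = load_settings(Example)
assert example.env == "production"
assert example.sub_config.nested == 8

# No DSN variable and no `dsn` secret: the declared default applies.
assert example.dsn == "dsn://"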

vs Pydantic Settings

Simplicity

  • pydantic-settings alters how you go about defining your normal pydantic models: you need to switch (some of) the base classes, configure the magical model_config = SettingsConfigDict(...) object, and so on.

    The model becomes inherently entangled with the settings-loading library.

  • dataclass-settings attaches targeted Annotated metadata to a vanilla pydantic model. You can choose not to use load_settings (for example, in tests) and construct the model instance however you'd like, as sketched below.
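
    For instance, a test can bypass settings loading entirely and construct the annotated model directly, since it is still a plain pydantic model (a sketch reusing the Example and SubConfig models from above):

    # In tests: the annotations are inert metadata, so construct directly.
    example = Example(env="test", dsn="dsn://in-memory", sub_config=SubConfig(nested=1))

    # In application code: load the very same model from its configured sources.
    example = load_settings(Example)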

Clarity

  • pydantic-settings makes it really difficult to intuit which concrete environment variable will actually be loaded for a given field. Based on my own experience, and from perusing their issue tracker, this does not seem to be an uncommon problem.

    The combination of field name, SettingsConfigDict settings, casing, alias/validation_alias/serialization_alias, and the env var's relative position in the larger config all factor into which concrete name will be used when loading.

  • dataclass-settings by default requires an explicit, concrete name, which maps directly to the value being loaded: Env('FOO') loads FOO, for sure!

    If you want to opt into a less explicit, more inferred setup (like pydantic-settings), you can do so with the nested_delimiter='_' and infer_name=True arguments, as sketched below.
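
    A sketch of the two styles, extrapolating from the nested_delimiter behavior shown in the example above (Database and Settings are illustrative names, reusing the imports from the example):

    class Database(BaseModel):
        # Explicit and concrete: this loads exactly URL.
        url: Annotated[str, Env("URL")] = ""


    class Settings(BaseModel):
        database: Database


    # URL is read from the URL environment variable.
    load_settings(Settings)

    # With a delimiter, the nested variable becomes DATABASE_URL instead.
    load_settings(Settings, nested_delimiter="_")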

Typing

  • pydantic-settings does not play well with type checkers; it requires a mypy plugin to avoid emitting type errors into user code.

    The code recommended in their documentation for namespacing settings looks like:

    class Settings(BaseSettings):
        more_settings: SubModel = SubModel()
    

    This only type-checks with mypy (after enabling the plugin), not with pyright/pylance. Additionally, it actually evaluates the SubModel constructor at import time!

    These issues seem to be inherent to the strategy of subclassing BaseModel and building settings loading into the object construction process.

  • dataclass-settings sidesteps this problem by decoupling the definition of the settings from the loading of settings.

    As such, you can define the model exactly as you would with vanilla pydantic:

    class Settings(BaseModel):
        more_settings: SubModel
    

    Internally, the load_settings function handles the work of constructing the nested input structure that pydantic expects in order to construct the whole object tree, as sketched below.
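
    A sketch completing the snippet above: the nested model needs neither a default instance nor a plugin, and only load_settings knows about sources (SubModel's field is illustrative; imports are as in the example at the top):

    class SubModel(BaseModel):
        host: Annotated[str, Env("HOST")] = "localhost"


    class Settings(BaseModel):
        # Plain pydantic: no default instance, no plugin; pyright checks it as-is.
        more_settings: SubModel


    settings = load_settings(Settings)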

Compatibility

  • pydantic-settings's BaseSettings inherits from pydantic's BaseModel, and thus can only function with pydantic models, as the name implies.

  • dataclass-settings's primary entrypoint is a function that accepts a supported type. As such, it can theoretically support any type with a well-defined object structure, such as pydantic models, dataclasses, and attrs classes.

    Practically, pydantic has the most robust system for parsing/validating a JSON-like structure into models, so it is likely to remain the most flexible option anyway. But for many simple cases, particularly those without nesting or that only deal in simple types (int, float, str, and so on), dataclasses or attrs can certainly provide a similar experience, as sketched below.
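
    For a flat case with simple types, a standard library dataclass reads much the same way (a sketch assuming scalar coercion behaves for dataclasses as described above):

    from dataclasses import dataclass
    from typing import Annotated

    from dataclass_settings import Env, load_settings


    @dataclass
    class FlatConfig:
        host: Annotated[str, Env("HOST")] = "localhost"
        port: Annotated[int, Env("PORT")] = 5432


    config = load_settings(FlatConfig)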

Flexibility

  • At the time of writing, pydantic-settings's strategy around "loaders" (i.e. the supported settings sources) is relatively inflexible. Their issue tracker contains a decent number of requests for a more flexible way of defining priorities among different loaders, or even for using different settings within a single loader.

    This doesn't appear to be an issue inherent to the library; at present, their API simply tries to reuse pydantic's Field and alias mechanisms to infer the settings for all loaders.

  • dataclass-settings instead annotates each field individually with the loaders that field should use. That means you can have different priorities (or entirely different loaders!) per field, as sketched below.
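
    Concretely, each field carries its own loader list and, assuming loaders are consulted in the order they are annotated, its own priority (a sketch reusing the imports from the example above):

    class Config(BaseModel):
        # Prefer the environment, then fall back to a mounted secret
        # (assuming annotation order expresses priority).
        password: Annotated[str, Env("PASSWORD"), Secret("password")] = ""

        # This field only ever consults the environment.
        region: Annotated[str, Env("REGION")] = "us-east-1"


    config = load_settings(Config)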



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dataclass_settings-0.3.0.tar.gz (14.8 kB, source)

Built Distribution

dataclass_settings-0.3.0-py3-none-any.whl (15.9 kB, Python 3 wheel)

File details

Details for the file dataclass_settings-0.3.0.tar.gz.

File metadata

  • Download URL: dataclass_settings-0.3.0.tar.gz
  • Upload date:
  • Size: 14.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for dataclass_settings-0.3.0.tar.gz

  • SHA256: 2fad20ed0b965866d15c89f99bb22e4e7f2a9efff3c40f8cf2a65e326197647f
  • MD5: 0b53d703a9b201b6cbb39bfc4bd0a399
  • BLAKE2b-256: 69e3b5b623304c0b89df1c95806fd9fd3dadcecffe4fd97e45f2d12ecac929f3


File details

Details for the file dataclass_settings-0.3.0-py3-none-any.whl.

File metadata

File hashes

Hashes for dataclass_settings-0.3.0-py3-none-any.whl

  • SHA256: d7668c1b5131d00caafad37c4fce69612313cd68f6c6f090bfc838b03e40f607
  • MD5: cf45cee5e3594eac4292a4d1471a8020
  • BLAKE2b-256: 4f34aae13bab08615760b595a10b38332f92ad1d714fc030031f02d5b77bf8e6

