
Declarative dataclass settings.


dataclass-settings


dataclass-settings intends to work with any PEP-681-compliant dataclass-like object, including but not limited to:

  • pydantic models
  • dataclasses
  • attrs classes

dataclass-settings owes its existence to pydantic-settings, in that pydantic-settings will be a benchmark for dataclass-settings's feature set. However, it was born out of frustration with pydantic-settings's approach to implementing that feature set.

Example

from __future__ import annotations
from typing import Annotated

from dataclass_settings import load_settings, Env, Secret
from pydantic import BaseModel


class Example(BaseModel):
    env: Annotated[str, Env("ENVIRONMENT")] = "local"
    dsn: Annotated[str, Env("DSN"), Secret("dsn")] = "dsn://"

    sub_config: SubConfig


class SubConfig(BaseModel):
    nested: Annotated[int, Env("NESTED")] = 4


example: Example = load_settings(Example)

# or, if you want `nested` to be loaded from `SUB_CONFIG_NESTED`
example: Example = load_settings(Example, nested_delimiter='_')

vs Pydantic Settings

Simplicity

  • pydantic-settings alters how you go about defining your normal pydantic models. You need to switch (some of) your base classes, configure the magical model_config = SettingsConfigDict(...) object, and so on.

    The model becomes inherently entangled with the settings-loading library.

  • dataclass-settings attaches targeted Annotated metadata to a vanilla pydantic model. You can choose not to use load_settings (for example, in tests), and construct the model instance however you'd like.
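To illustrate why the model stays untangled: Annotated metadata is inert at construction time. The sketch below uses a plain dataclass and a placeholder string where the library's Env marker would go (so it needs no third-party imports); the same property is what lets a dataclass-settings-annotated model be constructed directly in tests.

```python
from dataclasses import dataclass
from typing import Annotated


@dataclass
class Config:
    # In real usage Env("ENVIRONMENT") would sit here; a plain string
    # stands in so this sketch is stdlib-only.
    env: Annotated[str, "env-marker"] = "local"


# Annotated metadata does not interfere with ordinary construction,
# so tests can build the object without any settings loader involved.
config = Config(env="test")
print(config.env)  # test
```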

Clarity

  • pydantic-settings makes it really, really difficult to intuit which concrete environment variable is actually going to be loaded for a given field. Based on my own experience, and from perusing their issue tracker, this seems to be a common experience.

    The combination of field name, SettingsConfigDict settings, casing, alias/validation_alias/serialization_alias, and the relative position of the env var in the greater config all contribute to making it a chore to deduce which concrete name will be used when loading.

  • dataclass-settings by default requires an explicit, concrete name, which maps directly to the value being loaded (Env('FOO') loads FOO, for sure!)

    If you want to opt into a less explicit, more inferred setup (like pydantic-settings), you can do so by utilizing the nested_delimiter='_' and infer_name=True arguments.
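As a rough, stdlib-only sketch of the explicit-name behavior (not the library's actual implementation): an Env annotation amounts to a direct environment lookup, and nested_delimiter only ever contributes a concrete, predictable prefix.

```python
import os
from dataclasses import dataclass


@dataclass
class Env:
    # Illustrative stand-in for dataclass_settings.Env.
    name: str

    def load(self, environ=None, prefix=""):
        environ = os.environ if environ is None else environ
        # The variable read is always prefix + name; nothing is inferred
        # from field names, casing, or aliases.
        return environ.get(prefix + self.name)


fake_environ = {"FOO": "bar", "SUB_CONFIG_NESTED": "4"}
print(Env("FOO").load(fake_environ))                    # bar
print(Env("NESTED").load(fake_environ, "SUB_CONFIG_"))  # 4
```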

Typing

  • pydantic-settings does not play super well with type checkers, necessitating the use of a mypy plugin for it to not emit type errors into user code.

    The code recommended in their documentation for namespacing settings looks like:

    class Settings(BaseSettings):
        more_settings: SubModel = SubModel()
    

    This only type-checks with mypy (after using the plugin), but not with pyright/pylance. Additionally, it actually evaluates the SubModel constructor during module parsing!

    These issues seem(?) to be inherent to the strategy of subclassing BaseModel and building its logic into the object construction process.

  • dataclass-settings sidesteps this problem by decoupling the definition of the settings from the loading of settings.

    As such, you're more able to define the model, exactly as you would have with vanilla pydantic:

    class Settings(BaseModel):
        more_settings: SubModel
    

    Internally, the load_settings function handles the work of constructing the requisite input structure pydantic expects to construct the whole object tree.
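Conceptually (a hypothetical sketch, not the library's actual internals), that work amounts to gathering annotated values into the nested dict shape pydantic already knows how to validate:

```python
def gather(field_env_names, environ):
    """Collect env values into the nested dict pydantic would validate.

    field_env_names maps field name -> env var name, or -> a nested
    mapping for sub-models. A hypothetical helper for illustration only.
    """
    out = {}
    for field, spec in field_env_names.items():
        if isinstance(spec, dict):
            # sub-model: recurse to build the nested structure
            out[field] = gather(spec, environ)
        elif spec in environ:
            out[field] = environ[spec]
    return out


spec = {"env": "ENVIRONMENT", "sub_config": {"nested": "NESTED"}}
environ = {"ENVIRONMENT": "prod", "NESTED": "4"}
print(gather(spec, environ))
# {'env': 'prod', 'sub_config': {'nested': '4'}}
```

From there, pydantic's own validation handles type coercion and defaults, which is why the model itself never needs to know about the loader.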

Compatibility

  • pydantic-settings's BaseSettings inherits from pydantic's BaseModel, and thus can only function against pydantic models, as the name would imply.

  • dataclass-settings's primary entrypoint is a function that accepts a supportable type. As such, it can theoretically support any type that has a well defined object structure, like all of pydantic, dataclasses, and attrs.

    Practically, pydantic has the most robust system for parsing/validating a json-like structure into models, so it's probably the most flexible anyway. But for many simple cases, particularly those without nesting or that only deal in simple types (int, float, str, etc), dataclasses/attrs can certainly provide a similar experience.
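For those simple flat cases, a dataclass-based loader needs little more than a string-to-declared-type coercion pass. A deliberately naive, stdlib-only sketch (ignoring bool quirks and nesting; from_raw is a hypothetical helper, not part of the library):

```python
import typing
from dataclasses import dataclass, fields


@dataclass
class Simple:
    host: str = "localhost"
    port: int = 5432


def from_raw(cls, raw):
    # Coerce raw string values (e.g. from os.environ) to each field's
    # declared type; enough for str/int/float, naive for anything else.
    hints = typing.get_type_hints(cls)
    kwargs = {
        f.name: hints[f.name](raw[f.name])
        for f in fields(cls)
        if f.name in raw
    }
    return cls(**kwargs)


print(from_raw(Simple, {"port": "8080"}))
# Simple(host='localhost', port=8080)
```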

Flexibility

  • At time of writing, pydantic-settings's strategy around "loaders", i.e. supportable settings sources is relatively inflexible. Their issue tracker contains a decent number of requests for a more flexible way of defining settings priorities among different loaders, or even using different settings from within a loader.

    This, at least, doesn't seem to be an inherent issue with the library. At present, their API simply appears to reuse pydantic's Field and alias mechanisms to infer the settings for all loaders.

  • dataclass-settings instead annotates each field individually, with the loaders that field should use. That means you can have different priorities (or entirely different loaders!) per field.
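A hypothetical sketch of what per-field priority means in practice: each field's annotated loaders are tried in the order they appear, and the first source that has a value wins (the tuple encoding below is a stand-in for a field's Env(...)/Secret(...) annotations):

```python
def resolve(loaders, environ, secrets):
    """Try each annotated loader in order; the first hit wins.

    loaders is a list of ("env", NAME) or ("secret", NAME) pairs,
    standing in for a field's Env(...) / Secret(...) annotations.
    """
    for kind, name in loaders:
        source = environ if kind == "env" else secrets
        if name in source:
            return source[name]
    return None


environ = {}
secrets = {"dsn": "dsn://from-secret-file"}
# Env("DSN") is annotated first, but only the secret source is populated,
# so the field falls through to the secret loader.
print(resolve([("env", "DSN"), ("secret", "dsn")], environ, secrets))
# dsn://from-secret-file
```

Because the loader list lives on the field rather than on the model, two fields in the same model can use entirely different sources or orderings.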
