# configgle 🤭
Hierarchical configuration using pure Python dataclasses, with typed factory methods, covariant protocols, and full inheritance support.
## Installation

```sh
python -m pip install configgle
```
## Example

```python
from configgle import Fig


class Model:
    class Config(Fig):
        hidden_size: int = 256
        num_layers: int = 4

    def __init__(self, config: Config):
        self.config = config


# Create and modify config
config = Model.Config(hidden_size=512)

# Instantiate the parent class
model = config.make()
print(model.config.hidden_size)  # 512
```
Configs are plain mutable dataclasses, so experiments are just functions that tweak a baseline:
```python
def exp000() -> Model.Config:
    return Model.Config()


def exp001() -> Model.Config:
    cfg = exp000()
    cfg.hidden_size = 512
    cfg.num_layers = 8
    return cfg
```
Or use `@autofig` to auto-generate the `Config` from `__init__`:

```python
from configgle import autofig


@autofig
class Model:
    def __init__(self, hidden_size: int = 256, num_layers: int = 4):
        self.hidden_size = hidden_size
        self.num_layers = num_layers


# Config is auto-generated from the __init__ signature
model = Model.Config(hidden_size=512).make()
print(model.hidden_size)  # 512
```
## Features

### Type-safe `make()`

When `Config` is defined as a nested class, `MakerMeta.__get__` uses the descriptor protocol to infer the parent class automatically. The return type of `__get__` is `Intersection[type[Config], type[Makeable[Parent]]]`, so `make()` knows the exact return type with zero annotation effort:
```python
class Model:
    class Config(Fig):
        hidden_size: int = 256

    def __init__(self, config: Config):
        self.hidden_size = config.hidden_size


model = Model.Config(hidden_size=512).make()  # inferred as Model
```
Type checkers that support `Intersection` (like ty) resolve this fully -- bare `Fig` is all you need. For type checkers that don't yet support `Intersection` (like basedpyright), parameterize with the parent class name to give the checker the same information explicitly:
```python
class Model:
    class Config(Fig["Model"]):  # explicit type parameter only for basedpyright
        hidden_size: int = 256

    def __init__(self, config: Config):
        self.hidden_size = config.hidden_size


model: Model = Model.Config(hidden_size=512).make()  # returns Model, not object
```
Without `["Model"]`, non-ty checkers fall back to `Any` (so attribute access still works without typecheck suppressions).

Both ty and basedpyright are supported first-class. Here's the full picture (including `Makes`, introduced next):
| | ty | basedpyright |
|---|---|---|
| Bare `Fig` infers parent type | ✅ | 🟡 (`Any` fallback) |
| `Fig["Parent"]` | ✅ | ✅ |
| `Makes["Child"]` needed for inheritance | ❌ | ✅ |
| `@autofig` `.Config` access | ❌ (#143) | ✅ |
ty gets full inference from `Intersection` -- bare `Fig` and inherited configs just work. basedpyright doesn't support `Intersection` yet, so it needs explicit `Fig["Parent"]` and `Makes["Child"]` annotations. ty doesn't yet support class decorator return types, so `@autofig`-decorated classes need `# ty: ignore[unresolved-attribute]` to access `.Config`; basedpyright handles this correctly. When `Intersection` lands in the type spec, `Makes` becomes unnecessary and both checkers will infer everything from bare `Fig`.
### Inheritance with `Makes` (only for basedpyright)

When a child class inherits a parent's `Config`, the `make()` return type would normally be the parent. Use `Makes` to re-bind it (again, only needed for basedpyright):
```python
class Animal:
    class Config(Fig["Animal"]):
        name: str = "animal"

    def __init__(self, config: Config):
        self.name = config.name


class Dog(Animal):
    class Config(Makes["Dog"], Animal.Config):
        breed: str = "mutt"

    def __init__(self, config: Config):
        super().__init__(config)
        self.breed = config.breed


dog: Dog = Dog.Config(name="Rex", breed="labrador").make()  # returns Dog, not Animal
```
`Makes` contributes nothing to the MRO at runtime -- it exists purely for the type checker (see the type checker table above). When `Intersection` lands, `Makes` becomes unnecessary.
### Covariant `Makeable` protocol

`Makeable[T]` is a covariant protocol satisfied by any `Fig`, `InlineConfig`, or custom class with `make()`, `finalize()`, and `update()`. Because it's covariant, `Makeable[Dog]` is assignable to `Makeable[Animal]`:
```python
from configgle import Makeable


def train(config: Makeable[Animal]) -> Animal:
    return config.make()


# All valid:
train(Animal.Config())
train(Dog.Config(breed="poodle"))
```
This makes it easy to write functions that accept any config for a class hierarchy without losing type information.
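The shape of such a protocol can be sketched with the stdlib alone. This is an illustrative stand-in, not configgle's actual `Makeable` (which also requires `finalize()` and `update()`); the point is that a covariant type variable makes a `Dog`-producing config acceptable wherever an `Animal`-producing one is expected:

```python
from typing import Protocol, TypeVar, runtime_checkable

T_co = TypeVar("T_co", covariant=True)


@runtime_checkable
class MakeableSketch(Protocol[T_co]):
    """Anything with a make() returning T_co (simplified sketch)."""

    def make(self) -> T_co: ...


class Animal:
    name = "animal"


class Dog(Animal):
    name = "dog"


class DogConfig:
    def make(self) -> Dog:
        return Dog()


def train(config: MakeableSketch[Animal]) -> Animal:
    # Covariance: MakeableSketch[Dog] is assignable to MakeableSketch[Animal].
    return config.make()


print(train(DogConfig()).name)  # dog
```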
### Nested config finalization

Override `finalize()` to compute derived fields before instantiation. Nested configs are finalized recursively:
```python
from dataclasses import field
from typing import Self


class Encoder:
    class Config(Fig):
        c_in: int = 256
        mlp: Configurable[nn.Module] = field(default_factory=MLP.Config)

        def finalize(self) -> Self:
            self = super().finalize()
            self.mlp.c_in = self.c_in  # propagate dimensions
            return self
```
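The recursion itself can be sketched with plain dataclasses -- `BaseFig`, `MLPConfig`, and `EncoderConfig` below are illustrative names, not configgle's implementation; the base `finalize()` walks the fields and finalizes any nested config before the subclass override runs:

```python
from dataclasses import dataclass, field, fields


@dataclass
class BaseFig:
    def finalize(self):
        # Recursively finalize nested configs first (sketch of the behavior
        # described above, not configgle's actual code).
        for f in fields(self):
            value = getattr(self, f.name)
            if isinstance(value, BaseFig):
                setattr(self, f.name, value.finalize())
        return self


@dataclass
class MLPConfig(BaseFig):
    c_in: int = 256


@dataclass
class EncoderConfig(BaseFig):
    c_in: int = 512
    mlp: MLPConfig = field(default_factory=MLPConfig)

    def finalize(self):
        self = super().finalize()
        self.mlp.c_in = self.c_in  # propagate dimensions, as in the example above
        return self


cfg = EncoderConfig()
cfg.finalize()
print(cfg.mlp.c_in)  # 512
```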
### `update()` for bulk mutation

Configs support bulk updates from another config or keyword arguments:
```python
cfg = Model.Config(hidden_size=256)
cfg.update(hidden_size=512, num_layers=8)

# Or copy from another config (kwargs take precedence):
cfg.update(other_cfg, num_layers=12)
```
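The described semantics -- copy matching fields from the other config first, then apply keyword overrides -- can be sketched on a plain dataclass (this is an illustration of the contract, not configgle's implementation):

```python
from dataclasses import dataclass, fields


@dataclass
class ModelConfig:
    hidden_size: int = 256
    num_layers: int = 4

    def update(self, other=None, **kwargs):
        # Copy matching fields from the other config first...
        if other is not None:
            for f in fields(other):
                if hasattr(self, f.name):
                    setattr(self, f.name, getattr(other, f.name))
        # ...then apply keyword overrides, so kwargs take precedence.
        for name, value in kwargs.items():
            setattr(self, name, value)
        return self


cfg = ModelConfig(hidden_size=256)
cfg.update(hidden_size=512, num_layers=8)
other = ModelConfig(hidden_size=1024)
cfg.update(other, num_layers=12)
print(cfg.hidden_size, cfg.num_layers)  # 1024 12
```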
### `InlineConfig` / `PartialConfig`

`InlineConfig` wraps an arbitrary callable and its arguments into a config object with deferred execution. Use it for classes where all constructor arguments are known at config time:
```python
from configgle import InlineConfig
import torch.nn as nn

cfg = InlineConfig(nn.Linear, in_features=256, out_features=128, bias=False)
cfg.out_features = 64  # attribute-style access to kwargs
layer = cfg.make()  # calls nn.Linear(in_features=256, out_features=64, bias=False)
y = layer(x)  # use the constructed module
```
`PartialConfig` is shorthand for `InlineConfig(functools.partial, fn, ...)` -- use it for functions where some arguments aren't known at config time:
```python
from configgle import PartialConfig
import torch.nn.functional as F

cfg = PartialConfig(F.cross_entropy, label_smoothing=0.1)
loss_fn = cfg.make()  # returns functools.partial(F.cross_entropy, label_smoothing=0.1)
loss = loss_fn(logits, targets)  # calls F.cross_entropy(logits, targets, label_smoothing=0.1)
```
Nested configs in args/kwargs are finalized and `make()`-d recursively, so both compose naturally with `Fig` configs.
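The deferred-call idea is small enough to sketch with the stdlib. `DeferredCall` below is a hypothetical stand-in for `InlineConfig` (no finalization, no nested-config handling): store the callable plus kwargs, allow attribute-style edits, and defer the call until `make()`:

```python
import functools


class DeferredCall:
    """Sketch of a deferred-execution config (not configgle's InlineConfig)."""

    def __init__(self, fn, *args, **kwargs):
        object.__setattr__(self, "_fn", fn)
        object.__setattr__(self, "_args", args)
        object.__setattr__(self, "_kwargs", kwargs)

    def __setattr__(self, name, value):
        self._kwargs[name] = value  # attribute-style access to stored kwargs

    def make(self):
        return self._fn(*self._args, **self._kwargs)


cfg = DeferredCall(dict, a=1, b=2)
cfg.b = 64  # tweak an argument before construction
print(cfg.make())  # {'a': 1, 'b': 64}


# The PartialConfig idea is the same sketch with functools.partial as the callable:
def scale(x, factor=1.0):
    return x * factor


partial_cfg = DeferredCall(functools.partial, scale, factor=0.5)
scale_half = partial_cfg.make()  # functools.partial(scale, factor=0.5)
print(scale_half(10))  # 5.0
```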
### `CopyOnWrite`

`CopyOnWrite` wraps a config tree and lazily copies objects only when mutations occur. Copies propagate up to parents automatically, so the original is never touched. This is especially useful inside `finalize()`, where you want to derive a variant of a shared sub-config without mutating the original:
```python
from configgle import CopyOnWrite, Fig


class Encoder:
    class Config(Fig):
        hidden_size: int = 256
        encoder: Configurable[nn.Module] = field(default_factory=MLP.Config)
        decoder: Configurable[nn.Module] = field(default_factory=MLP.Config)

        def finalize(self) -> Self:
            self = super().finalize()
            # encoder and decoder can share the same MLP.Config object.
            # CopyOnWrite lets us tweak the decoder's copy without
            # touching the encoder's (or the shared original).
            with CopyOnWrite(self) as cow:
                cow.decoder.c_out = self.hidden_size * 2
            return cow.unwrap
```
Only the mutated nodes (and their ancestors) are shallow-copied; everything else stays shared.
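A greatly simplified sketch of the mechanism -- `Cow` below is illustrative, not configgle's `CopyOnWrite` (no context manager, dataclass fields only): a wrapper shallow-copies its node on first mutation and re-links the copy into its (also copied) parent, leaving the original tree untouched:

```python
import copy
from dataclasses import dataclass, field, is_dataclass


class Cow:
    """Copy-on-write wrapper sketch (not configgle's implementation)."""

    def __init__(self, obj, parent=None, name=None):
        object.__setattr__(self, "_obj", obj)
        object.__setattr__(self, "_copied", False)
        object.__setattr__(self, "_parent", parent)
        object.__setattr__(self, "_name", name)

    def _materialize(self):
        if not self._copied:
            object.__setattr__(self, "_obj", copy.copy(self._obj))  # shallow copy on first write
            object.__setattr__(self, "_copied", True)
            if self._parent is not None:  # propagate the copy up to the parent
                self._parent._materialize()
                setattr(self._parent._obj, self._name, self._obj)

    def __getattr__(self, name):
        value = getattr(self._obj, name)
        if is_dataclass(value):
            return Cow(value, parent=self, name=name)  # wrap nested configs
        return value

    def __setattr__(self, name, value):
        self._materialize()
        setattr(self._obj, name, value)

    @property
    def unwrap(self):
        return self._obj


@dataclass
class Mlp:
    c_out: int = 256


@dataclass
class Model:
    mlp: Mlp = field(default_factory=Mlp)


original = Model()
cow = Cow(original)
cow.mlp.c_out = 512  # copies mlp and its parent, never the original
print(original.mlp.c_out, cow.unwrap.mlp.c_out)  # 256 512
```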
### `pprint` / `pformat`

Config-aware pretty printing that hides default values, auto-finalizes before printing, and scrubs memory addresses:
```python
from dataclasses import field

from configgle import Configurable, Fig, pformat


class MLP:
    class Config(Fig):
        c_in: int = 256
        c_out: int = 256
        num_layers: int = 2
        dropout: float = 0.1
        use_bias: bool = True

    def __init__(self, config: Config): ...


class Model:
    class Config(Fig):
        hidden_size: int = 256
        num_layers: int = 4
        mlp: Configurable[nn.Module] = field(default_factory=MLP.Config)
        output_mlp: Configurable[nn.Module] = field(default_factory=MLP.Config)

    def __init__(self, config: Config): ...


def exp001():
    cfg = Model.Config()
    cfg.hidden_size = 512
    cfg.num_layers = 12
    cfg.mlp.c_in = 512
    cfg.mlp.c_out = 1024
    cfg.mlp.num_layers = 4
    cfg.mlp.dropout = 0.2
    cfg.mlp.use_bias = False
    cfg.output_mlp.c_in = 1024
    cfg.output_mlp.c_out = 256
    cfg.output_mlp.dropout = 0.3
    return cfg


print(pformat(exp001(), continuation_pipe=0))
# Model.Config(
#   hidden_size=512,
#   num_layers=12,
#   mlp=MLP.Config(
#   │ c_in=512,
#   │ c_out=1_024,
#   │ num_layers=4,
#   │ dropout=0.2,
#   │ use_bias=False
#   ),
#   output_mlp=MLP.Config(c_in=1_024, dropout=0.3)
# )
```
Default values are hidden, continuation pipes show where nested blocks belong, large numbers get underscores (`1_024`), and short sub-configs collapse onto one line.
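The default-hiding part of this is easy to sketch with the stdlib -- `show_overrides` below is an illustrative toy, not configgle's `pformat`: compare each field against its declared default and print only the fields that differ:

```python
from dataclasses import MISSING, dataclass, fields


def show_overrides(cfg):
    """Render a dataclass, showing only fields that differ from their defaults."""
    parts = []
    for f in fields(cfg):
        if f.default is not MISSING:
            default = f.default
        elif f.default_factory is not MISSING:
            default = f.default_factory()
        else:
            default = MISSING  # no default: always show
        value = getattr(cfg, f.name)
        if value != default:
            parts.append(f"{f.name}={value!r}")
    return f"{type(cfg).__name__}({', '.join(parts)})"


@dataclass
class MLPConfig:
    c_in: int = 256
    dropout: float = 0.1


cfg = MLPConfig(c_in=1024)
cfg.dropout = 0.3
print(show_overrides(cfg))  # MLPConfig(c_in=1024, dropout=0.3)
```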
### `@autofig` for zero-boilerplate configs

When you don't need a hand-written `Config`, `@autofig` generates one from `__init__` (see Example above).
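A hedged sketch of what such a decorator could do -- `autofig_sketch` is a hypothetical stand-in, not the real `@autofig` (no `Fig` base, no type-checker integration): read the `__init__` signature and synthesize a `Config` dataclass whose `make()` calls the class with the stored values:

```python
import inspect
from dataclasses import make_dataclass


def autofig_sketch(cls):
    # Collect (name, annotation, default) for every __init__ parameter.
    params = [
        (name, p.annotation, p.default)
        for name, p in inspect.signature(cls.__init__).parameters.items()
        if name != "self"
    ]

    def make(self):
        # Forward the stored field values to the parent class constructor.
        return cls(**{name: getattr(self, name) for name, _, _ in params})

    cls.Config = make_dataclass(
        f"{cls.__name__}Config", params, namespace={"make": make})
    return cls


@autofig_sketch
class Model:
    def __init__(self, hidden_size: int = 256, num_layers: int = 4):
        self.hidden_size = hidden_size
        self.num_layers = num_layers


model = Model.Config(hidden_size=512).make()
print(model.hidden_size)  # 512
```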
### Pickling and cloudpickle

Configs are fully compatible with pickle and cloudpickle, including the parent class reference. This is important for distributed workflows (e.g., sending configs across processes):
```python
import cloudpickle
import pickle

cfg = Model.Config(hidden_size=512)
cfg_ = pickle.loads(cloudpickle.dumps(cfg))
model = cfg_.make()  # parent_class is preserved
```
## Comparison
| | configgle | Hydra | Sacred | OmegaConf | Gin | ml_collections | Fiddle | Confugue |
|---|---|---|---|---|---|---|---|---|
| Pure Python (no YAML/strings) | ✅ | ❌ | ❌ | 🟡 | ❌ | ✅ | ✅ | ❌ |
| Typed `make()`/`build()` return | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ | ❌ |
| Config inheritance | ✅ | 🟡 | ❌ | 🟡 | ❌ | ❌ | ❌ | 🟡 |
| Covariant protocol | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| Nested finalization | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| Copy-on-write | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| pickle/cloudpickle | ✅ | 🟡 | ❌ | ✅ | ❌ | 🟡 | ✅ | ❌ |
| Auto-generated configs | ✅ | 🟡 | ❌ | ❌ | ✅ | ❌ | ✅ | ❌ |
| GitHub stars | -- | 10.2k | 4.4k | 2.3k | 2.1k | 1.0k | 374 | 21 |

✅ = yes, 🟡 = partial, ❌ = no. Corrections welcome -- open a PR.
### How each library works
**Hydra** (Meta) -- YAML-centric with optional "structured configs" (Python dataclasses registered in a `ConfigStore`). Instantiation uses `hydra.utils.instantiate()`, which resolves a string `_target_` field to an import path -- the return type is `Any`. Config composition is done via YAML defaults lists, not class inheritance; dataclass inheritance works at the schema level. `configen` is an experimental code-generation tool (v0.9.0.dev8) that produces structured configs from class signatures. Configs survive pickle trivially since `_target_` is a string, not a class reference.
**Sacred** -- Experiment management framework. Config is defined via `@ex.config` scopes (local variables become config entries) or loaded from YAML/JSON files. Sacred auto-injects config values into captured functions by parameter name (dependency injection), but does not auto-generate configs from function signatures. No typed factory methods, no config inheritance, no pickle support for the experiment/config machinery.
**OmegaConf** -- YAML-native configuration with a "structured config" mode that accepts `@dataclass` schemas. Configs are always wrapped in `DictConfig` proxy objects at runtime (not actual dataclass instances). Supports dataclass inheritance for schema definition. Good pickle support (`__getstate__`/`__setstate__`). No typed factory method (`to_object()` returns `Any`), no auto-generation, no protocols.
**Gin** (Google) -- Global string-based registry. You decorate functions with `@gin.configurable` and bind parameters via `.gin` files or `gin.bind_parameter('fn.param', val)`. There are no config objects -- parameter values live in a global dict keyed by dotted strings. No typed returns, no config inheritance. The docs state "gin-configurable functions are not pickleable," though a 2020 PR added `__reduce__` methods that improve support.
**ml_collections** (Google) -- Dict-like `ConfigDict` with dot-access, type-checking on mutation, and `FieldReference` for lazy cross-references between values. Pure Python, no YAML. No factory method or typed instantiation. Pickle works for plain configs, but `FieldReference` operations that use lambdas internally (`.identity()`, `.to_int()`) fail with standard pickle (cloudpickle handles them).
**Fiddle** (Google) -- Python-first. You build config graphs with `fdl.Config[MyClass]` objects and call `fdl.build()` to instantiate them. `build(Config[T]) -> T` is typed via `@overload`. Config modification is functional (`fdl.copy_with`), not inheritance-based -- there are no config subclasses. `@auto_config` rewrites a factory function's AST to produce a config graph automatically. Full pickle/cloudpickle support.
**Confugue** -- YAML-based hierarchical configuration. The `configure()` method instantiates objects from YAML dicts, with the class specified via a `!type` YAML tag. Returns `Any`. Partial config inheritance via YAML merge keys (`<<: *base`). No pickle support, no auto-generation, no protocols.
## Citing

If you find our work useful, please consider citing:

```bibtex
@misc{dillon2026configgle,
  title={Configgle - Hierarchical experiment configuration using pure Python dataclass factories and dependency injection},
  author={Joshua V. Dillon},
  year={2026},
  howpublished={GitHub},
  url={https://github.com/jvdillon/configgle},
}
```
## License

Apache License 2.0