# nonfig

Automatic Pydantic config generation from class and function signatures.
nonfig turns any class or function into a configurable, serializable, and validated
component. By generating Pydantic models directly from your code's signatures, it
eliminates boilerplate while enforcing type safety and validation.
## Quick Example

```python
from nonfig import configurable, DEFAULT, Hyper, Ge

@configurable
class Optimizer:
    def __init__(self, lr: Hyper[float, Ge(0)] = 0.01, momentum: float = 0.9):
        self.lr = lr
        self.momentum = momentum

@configurable
class Model:
    def __init__(
        self,
        hidden_size: int = 128,
        optimizer: Optimizer = DEFAULT,  # Automatically transforms to Optimizer.Config
    ):
        self.hidden_size = hidden_size
        self.optimizer = optimizer

# Instantiate via Config
config = Model.Config(hidden_size=256, optimizer={"lr": 0.001})
model = config.make()  # Returns a Model instance with a real Optimizer
```
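Stripped of the Pydantic layer, "generate a config from a signature" boils down to stdlib introspection. A minimal, conceptual sketch (this is *not* nonfig's implementation, just the underlying idea):

```python
# Conceptual sketch: read a constructor's parameters, annotations, and
# defaults with inspect.signature -- the raw material a Config is built from.
import inspect

def extract_params(fn):
    """Map each parameter name to its (annotation, default) pair."""
    sig = inspect.signature(fn)
    return {
        name: (p.annotation, p.default)
        for name, p in sig.parameters.items()
        if name != "self"
    }

class Optimizer:
    def __init__(self, lr: float = 0.01, momentum: float = 0.9):
        self.lr = lr
        self.momentum = momentum

params = extract_params(Optimizer.__init__)
assert params == {"lr": (float, 0.01), "momentum": (float, 0.9)}
```

From this mapping, a library can synthesize a validated model whose fields mirror the constructor exactly, which is why no schema has to be written by hand.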
## Installation

```shell
pip install nonfig

# Optional YAML support:
pip install nonfig[yaml]
```
## Core Concepts

### Configuring Classes (Greedy)

When you apply `@configurable` to a class, nonfig captures all parameters from
your `__init__` method (or dataclass fields) and generates a `.Config` class.
```python
@configurable
class Model:
    def __init__(self, layers: int = 3, dropout: float = 0.1):
        self.layers = layers
        self.dropout = dropout

# Direct instantiation still works as usual
m = Model(layers=5)

# Config provides validation and serialization
config = Model.Config(layers=10)
m = config.make()  # Returns a Model instance
```
### Configuring Functions (Selective)

For functions, `@configurable` is selective: it only extracts parameters marked
with `Hyper[T]`, `DEFAULT`, or a Config object. Other parameters are treated as
runtime arguments that must be passed when calling the configured function.
```python
from nonfig import configurable, Hyper, Gt

@configurable
def train(
    dataset,  # Runtime argument (not in Config)
    *,
    epochs: Hyper[int, Gt(0)] = 10,  # Hyperparameter
    lr: Hyper[float] = 0.01,  # Hyperparameter
):
    print(f"Training for {epochs} epochs with lr={lr}")

# Create a configured version of the function
trainer = train.Config(epochs=20).make()

# Call it with runtime arguments
trainer(my_dataset)  # Uses epochs=20, lr=0.01
```
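The selective split between hyperparameters and runtime arguments can be sketched with the stdlib's `Annotated` introspection. `HyperMarker` below is a hypothetical stand-in for nonfig's `Hyper` metadata, not its real machinery:

```python
# Conceptual sketch: collect only parameters whose Annotated metadata
# carries a marker object, treating everything else as a runtime argument.
import inspect
from typing import Annotated, get_args, get_origin

class HyperMarker:  # stand-in for nonfig's Hyper metadata
    pass

def hyper_params(fn):
    """Return {name: default} for parameters tagged with HyperMarker."""
    out = {}
    for name, p in inspect.signature(fn).parameters.items():
        ann = p.annotation
        if get_origin(ann) is Annotated and any(
            isinstance(m, HyperMarker) for m in get_args(ann)[1:]
        ):
            out[name] = p.default
    return out

def train(dataset, *, epochs: Annotated[int, HyperMarker()] = 10, lr: float = 0.01):
    ...

assert hyper_params(train) == {"epochs": 10}  # 'dataset' and 'lr' stay runtime-only
```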
### Nested Configuration & DEFAULT

The power of nonfig lies in its ability to compose configurations. Any
`@configurable` class used as a type hint is automatically transformed into a
nested configuration. Use `DEFAULT` to automatically instantiate the nested
config with its own defaults.
```python
@configurable
class Trainer:
    def __init__(self, optimizer: Optimizer = DEFAULT):
        self.optimizer = optimizer

# 'optimizer' can be passed as a dict, an Optimizer.Config, or an Optimizer instance
config = Trainer.Config(optimizer={"lr": 0.0001})
trainer = config.make()
assert isinstance(trainer.optimizer, Optimizer)
```
### Validation & Constraints

Use `Hyper[T, ...]` to attach Pydantic-style constraints to your parameters.
These are enforced whenever a `.Config` is created.
```python
from nonfig import configurable, Hyper, Ge, Le, Gt, MinLen, Pattern

@configurable
class Network:
    def __init__(
        self,
        lr: Hyper[float, Gt(0)] = 0.01,
        dropout: Hyper[float, Ge(0), Le(1)] = 0.5,
        name: Hyper[str, MinLen(3), Pattern(r"^[A-Z]")] = "Net",
    ): ...
```
Available constraints: `Ge` (>=), `Gt` (>), `Le` (<=), `Lt` (<), `MinLen`,
`MaxLen`, `MultipleOf`, `Pattern`.
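The constraint names map to plain comparisons. The table below is illustrative only (in nonfig the enforcement is delegated to Pydantic, and regex `search` semantics are assumed here):

```python
# Illustrative mapping from constraint names to predicates.
import operator
import re

CHECKS = {
    "Ge": operator.ge, "Gt": operator.gt,        # numeric bounds
    "Le": operator.le, "Lt": operator.lt,
    "MinLen": lambda v, n: len(v) >= n,          # length bounds
    "MaxLen": lambda v, n: len(v) <= n,
    "MultipleOf": lambda v, n: v % n == 0,
    "Pattern": lambda v, p: re.search(p, v) is not None,  # search semantics assumed
}

def check(value, constraint, arg):
    return CHECKS[constraint](value, arg)

assert check(0.5, "Ge", 0) and check(0.5, "Le", 1)
assert check("Net", "MinLen", 3) and check("Net", "Pattern", r"^[A-Z]")
assert not check(-0.1, "Ge", 0)
```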
## Features

### Inheritance & Smart Propagation

nonfig supports automatic inheritance. If a subclass of a `@configurable` class
includes `**kwargs` in its `__init__`, it automatically inherits all
configurable parameters from its parents.
```python
@configurable
class Base:
    def __init__(self, x: int = 1):
        self.x = x

class Sub(Base):
    def __init__(self, y: int = 2, **kwargs):
        super().__init__(**kwargs)
        self.y = y

# Sub.Config now has both 'x' and 'y'
config = Sub.Config(x=10, y=20)
```
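The `**kwargs`-based propagation can be sketched with stdlib introspection: collect the subclass's own parameters, and pull in parent parameters only when the subclass declares `**kwargs`. This is a conceptual stand-in, not nonfig's inheritance logic:

```python
# Conceptual sketch of **kwargs-driven parameter inheritance.
import inspect

def config_params(cls):
    """Own params; parent params merged in only if **kwargs is present."""
    sig = inspect.signature(cls.__init__)
    own = {
        n: p.default for n, p in sig.parameters.items()
        if n != "self" and p.kind is not inspect.Parameter.VAR_KEYWORD
    }
    has_kwargs = any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in sig.parameters.values()
    )
    if has_kwargs:
        for base in cls.__mro__[1:]:
            if base is object:
                continue
            for n, p in inspect.signature(base.__init__).parameters.items():
                if n != "self" and p.kind is not inspect.Parameter.VAR_KEYWORD:
                    own.setdefault(n, p.default)  # child wins on name clashes
    return own

class Base:
    def __init__(self, x: int = 1):
        self.x = x

class Sub(Base):
    def __init__(self, y: int = 2, **kwargs):
        super().__init__(**kwargs)
        self.y = y

assert config_params(Sub) == {"y": 2, "x": 1}
```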
### External Components (wrap_external)

Make third-party classes (e.g., from PyTorch or scikit-learn) configurable
without modifying their code. `wrap_external` is always greedy, capturing every
parameter in the signature.
```python
from torch.optim import Adam

from nonfig import DEFAULT, Gt, Hyper, configurable, wrap_external

AdamConfig = wrap_external(Adam, overrides={"lr": Hyper[float, Gt(0)]})

@configurable
class Experiment:
    def __init__(self, opt: AdamConfig = DEFAULT):
        self.opt = opt
```
### Function Type Proxies (.Type)

When a `@configurable` function is used as a nested dependency, use `.Type` for
the type hint to enable proper configuration nesting and IDE support.
```python
@configurable
def preprocessor(data): ...

@configurable
class Pipeline:
    """A pipeline with a nested configurable preprocessor."""

    def __init__(self, prep: preprocessor.Type = DEFAULT):
        self.prep = prep
```
### Overriding Nested Defaults
You can override default parameters of nested configurations directly within the type annotation using the `Overrides` utility. This is particularly useful for setting different defaults for the same configurable component used in different contexts.
**Validation:** Since `nonfig` uses `extra="forbid"`, all overrides (including those in constant dictionaries) are strictly validated against the target's signature. Misspelled keys will raise a `ValidationError` at decoration time.
```python
from nonfig import Overrides

@configurable
class Strategy:
    def __init__(
        self,
        # Sets default window to 100 specifically for the long MA
        long_ma: Overrides[moving_average.Type, "window": 100] = DEFAULT,
        # Sets default window to 3 specifically for the short MA
        short_ma: Overrides[moving_average.Type, "window": 3] = DEFAULT,
    ):
        self.long_ma = long_ma
        self.short_ma = short_ma
```
You can also provide a full dictionary of overrides directly:
```python
D = {"window": 100}

@configurable
class Strategy:
    # Injects the entire dictionary D as overrides for the moving average component
    long_ma: Overrides[moving_average.Type, D] = DEFAULT
```
Overrides also supports multiple arguments, merging dictionaries, slice syntax,
and even nested dictionaries to reach deeper into the configuration tree:

```python
@configurable
class ComplexStrategy:
    def __init__(
        self,
        # Merges dictionary, slice syntax, and nested dictionary
        ma: Overrides[
            nested_ma.Type, {"other": 10}, "window": 50, "sub_component": {"param": 1.0}
        ] = DEFAULT,
    ):
        self.ma = ma
```
### Leaf Markers

Use `Leaf[T]` to disable "Config magic" for a specific parameter. This forces
the field to accept only raw instances of the class, preventing nonfig from
transforming it into a nested configuration.

```python
from nonfig import Leaf

@configurable
class Processor:
    def __init__(self, model: Leaf[MyModel]):
        # 'model' must be a MyModel instance.
        # Passing a dict or MyModel.Config will raise a ValidationError.
        self.model = model
```
**Leaf vs. DEFAULT:**

- `DEFAULT` is for composition: "This is a sub-component; please auto-configure
  it using its own defaults."
- `Leaf[T]` is for inversion of control: "This is an external dependency that I
  (the user) will provide as a pre-instantiated object."

You should not use `Leaf[T] = DEFAULT`. They represent opposite intents: `Leaf`
blocks the configuration transformation that `DEFAULT` specifically requests.
This is now enforced at runtime with a `TypeError`.
### JAX, Flax & jaxtyping

nonfig is designed for machine learning research:

- jaxtyping: preserves dimension metadata and works with `@jaxtyped`.
- Flax: supports `nn.Module` classes natively.
- Order: always place `@configurable` as the outermost decorator.
## Tooling & CLI

### CLI Overrides (run_cli)

Easily run your experiments with command-line overrides using dot-notation.

```python
from nonfig import run_cli

if __name__ == "__main__":
    # python train.py model.layers=5 optimizer.lr=0.001
    result = run_cli(train)
```
### IDE Support (nonfig-stubgen)

Because nonfig generates classes dynamically, IDEs might need help with
autocompletion. Generate `.pyi` stubs for your entire project:

```shell
nonfig-stubgen src/
```
### Config Loaders

Support for JSON, TOML, and YAML (YAML requires pyyaml):

```python
from nonfig import load_yaml

config_data = load_yaml("config.yaml")
config = Model.Config(**config_data)
```
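The JSON path needs no extra dependency. A minimal round-trip sketch using only the stdlib (`Model.Config` in the trailing comment stands in for any nonfig-generated config class):

```python
# Write a config file, read it back, and splat it into a Config.
import json
import os
import tempfile

data = {"hidden_size": 256, "optimizer": {"lr": 0.001}}

path = os.path.join(tempfile.mkdtemp(), "config.json")
with open(path, "w") as f:
    json.dump(data, f)

with open(path) as f:
    config_data = json.load(f)

assert config_data == data
# config = Model.Config(**config_data)  # then build the instance via config.make()
```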
## Performance

nonfig is optimized for high-performance ML loops. `Config.make()` uses a
cached factory that is 70-80% faster than a standard Pydantic
`model_validate`.

| Pattern | Latency | Note |
|---|---|---|
| Raw `Class()` | ~0.15µs | Baseline |
| `Config.make()` | ~0.29µs | Reused config instance |
| `Config(...).make()` | ~1.85µs | Full lifecycle |

Best practice: call `.make()` once, outside your training loop.
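The advice in miniature: build the component once and let the loop touch only the built instance. `Model` and `Config` below are plain stand-ins, not nonfig objects:

```python
# Hoist config.make() out of the hot loop: pay the construction cost once.
class Model:
    def __init__(self, layers: int = 3):
        self.layers = layers

class Config:
    def __init__(self, **overrides):
        self.overrides = overrides

    def make(self) -> Model:
        return Model(**self.overrides)

config = Config(layers=10)
model = config.make()      # built once, outside the loop

losses = []
for step in range(3):      # the loop reuses the same instance
    losses.append(model.layers * step)

assert model.layers == 10 and losses == [0, 10, 20]
```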
## Comparison
| Feature | nonfig | gin-config | hydra | tyro |
|---|---|---|---|---|
| Philosophy | Code-first | DI | YAML-first | CLI-first |
| Validation | Pydantic | None | Runtime | Parse-time |
| Nesting | Automatic | Manual | Manual | Automatic |
| IDE Support | Stubs (.pyi) | None | Partial | Full |
## License

MIT License