Function-first tunables with auto JSON/YAML/TOML config & schema, per-executable composition (tags or trace), and runtime auto-injection.
Function-first tunable parameters for Python apps — with:
- Ergonomic `@tunable` decorator (declare per-function user parameters right where they live).
- Centralized parameter classes (`TunableParameters`) for a single source of truth with inheritance-based namespaces.
- Automatic Pydantic models → JSON & JSON Schema (rich defaults, validation constraints, literals, Paths, etc.).
- Two composition strategies:
  - By app tags (`apps=("train", "serve", ...)`) for explicit executable groupings.
  - By static AST call graph of an entrypoint (no tags needed) – generate a config from just a function without executing user code.
- Runtime auto-injection (`use_config`) so decorated functions receive values transparently.
- CLI flag generation (argparse / jsonargparse) with defaults & help text sourced from Pydantic Field metadata.
- Deterministic merge order: defaults ← optional config file ← CLI overrides.

NOTE: Previous versions used runtime tracing (`schema_by_trace`, `add_flags_by_trace` via tracing). This has been replaced by static AST analysis for zero-execution safety and reproducibility. The compatibility alias `add_flags_by_trace` still works but now delegates to static analysis.
Install
```
pip install tunablex  # (or your project env)
```
Quick Tour
1. Declare tunables
Option A: Direct decoration (explicit namespaces)
```python
from typing import Literal
from pydantic import Field
from tunablex import tunable

@tunable("hidden_units", "dropout", namespace="model", apps=("train",))
def build_model(hidden_units: int = Field(128, ge=1, description="Hidden units"),
                dropout: float = Field(0.2, ge=0.0, le=1.0, description="Dropout")):
    ...

@tunable("epochs", "batch_size", "optimizer", namespace="train", apps=("train",))
def train(epochs: int = Field(10, ge=1, description="Epochs"),
          batch_size: int = Field(32, ge=1, description="Batch size"),
          optimizer: Literal["adam", "sgd"] = Field("adam", description="Optimizer")):
    ...
```
Option B: Centralized parameters (inheritance-based namespaces)
```python
from typing import Literal
from pydantic import Field
from tunablex import tunable, TunableParameters

# Define your parameter schema once
class Main(TunableParameters):
    """Root-level parameters (appear as --param in CLI, at JSON root)."""
    debug: bool = Field(False, description="Enable debug mode")

class Model(Main):
    """Model parameters (--model.param, under 'model' in JSON)."""
    hidden_units: int = Field(128, ge=1, description="Hidden units")
    dropout: float = Field(0.2, ge=0.0, le=1.0, description="Dropout")

class Train(Main):
    """Training parameters (--train.param, under 'train' in JSON)."""
    epochs: int = Field(10, ge=1, description="Epochs")
    batch_size: int = Field(32, ge=1, description="Batch size")
    optimizer: Literal["adam", "sgd"] = Field("adam", description="Optimizer")

# Use the centralized parameters in your functions
@tunable("hidden_units", "dropout", apps=("train",))
def build_model(hidden_units=Model.hidden_units, dropout=Model.dropout, debug=Main.debug):
    if debug:
        print(f"Building model: {hidden_units} units, {dropout} dropout")
    ...

@tunable("epochs", "batch_size", "optimizer", apps=("train",))
def train(epochs=Train.epochs, batch_size=Train.batch_size, optimizer=Train.optimizer):
    print(f"Training: {epochs} epochs, batch {batch_size}, optimizer {optimizer}")
    ...
```
The centralized approach provides a single source of truth for your parameters—define once, use everywhere! The class hierarchy automatically determines namespaces: Model(Main) creates the "model" namespace, and parameters from Main appear at the root level.
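The way a namespace falls out of the class hierarchy can be pictured with a small stdlib-only sketch. Note this is purely illustrative: `ParamsBase` and `_namespaces` are made-up names for this example, not tunablex internals.

```python
class ParamsBase:
    """Toy stand-in for TunableParameters: derive a namespace from the subclass name."""
    _namespaces: dict = {}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # The root class "Main" maps to the JSON root ("");
        # every other subclass maps to its lowercased class name.
        ns = "" if cls.__name__ == "Main" else cls.__name__.lower()
        ParamsBase._namespaces[cls.__name__] = ns

class Main(ParamsBase):
    pass

class Model(Main):
    pass

class Train(Main):
    pass

print(ParamsBase._namespaces)  # {'Main': '', 'Model': 'model', 'Train': 'train'}
```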
2. Compose a config model (Explicit App Tags)
```python
from tunablex import schema_for_apps, defaults_for_apps, make_app_config_for

schema = schema_for_apps("train")         # JSON Schema dict
defaults = defaults_for_apps("train")     # Default values dict
AppConfig = make_app_config_for("train")  # Pydantic model type
```
3. Compose a config model (Static Entry Analysis – No Tags)
```python
from tunablex import schema_by_entry_ast, make_app_config_for_entry

# Suppose train_main() calls several @tunable functions (directly or nested)
from mypkg.pipeline import train_main

schema, defaults, namespaces = schema_by_entry_ast(train_main)
AppConfig = make_app_config_for_entry(train_main)
```
The static analyzer parses the entrypoint’s source and gathers directly called function names (simple, safe heuristic) to select matching registered tunable namespaces.
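The heuristic can be sketched with Python's stdlib `ast` module. This is a simplified illustration of the idea, not tunablex's actual analyzer; `direct_callees` and `SOURCE` are names invented for the example.

```python
import ast

SOURCE = """
def train_main():
    model = build_model()
    trainer.fit(model)
"""

def direct_callees(src):
    """Collect the names of functions called directly in the given source."""
    names = set()
    for node in ast.walk(ast.parse(src)):
        if isinstance(node, ast.Call):
            func = node.func
            if isinstance(func, ast.Name):         # build_model(...)
                names.add(func.id)
            elif isinstance(func, ast.Attribute):  # trainer.fit(...)
                names.add(func.attr)
    return names

print(sorted(direct_callees(SOURCE)))  # ['build_model', 'fit']
```

Because only the source text is parsed, nothing in the entrypoint ever runs.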
4. Use a config at runtime
```python
from tunablex import use_config

cfg = AppConfig(**{...})  # or AppConfig.model_validate(loaded_json)
with use_config(cfg):
    train_main()  # All @tunable calls see their section injected
```
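Conceptually, this kind of scoped injection can be built on `contextvars`. The sketch below is a generic illustration of the pattern; `use_config_sketch` and `injected` are hypothetical names, not tunablex's implementation.

```python
import contextvars
import functools

_active_cfg = contextvars.ContextVar("active_cfg", default=None)

class use_config_sketch:
    """Context manager that exposes a config dict to decorated functions."""
    def __init__(self, cfg):
        self.cfg = cfg
    def __enter__(self):
        self._token = _active_cfg.set(self.cfg)
        return self.cfg
    def __exit__(self, *exc):
        _active_cfg.reset(self._token)

def injected(namespace, *params):
    """Fill missing keyword arguments from the active config's namespace."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            cfg = _active_cfg.get()
            if cfg is not None:
                section = cfg.get(namespace, {})
                for p in params:
                    kwargs.setdefault(p, section[p])
            return fn(*args, **kwargs)
        return wrapper
    return deco

@injected("train", "epochs")
def train(epochs=10):
    return epochs

with use_config_sketch({"train": {"epochs": 3}}):
    print(train())  # 3
print(train())      # 10 (falls back to the declared default outside the context)
```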
5. Generate schema & defaults files (entrypoint)
```python
from tunablex import schema_by_entry_ast, write_schema

schema, defaults, _ = schema_by_entry_ast(train_main)
write_schema("train_config", schema, defaults)  # writes train_config.schema.json + train_config.json
```
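A helper of this shape plausibly just serializes the two dicts side by side. Here is a minimal stdlib sketch of that behavior (`write_schema_sketch` is a stand-in with toy data, not the real implementation):

```python
import json
import tempfile
from pathlib import Path

def write_schema_sketch(stem, schema, defaults, outdir="."):
    """Write <stem>.schema.json and <stem>.json next to each other."""
    out = Path(outdir)
    (out / f"{stem}.schema.json").write_text(json.dumps(schema, indent=2))
    (out / f"{stem}.json").write_text(json.dumps(defaults, indent=2))

tmp = tempfile.mkdtemp()
write_schema_sketch("train_config", {"type": "object"}, {"train": {"epochs": 10}}, tmp)
print(sorted(p.name for p in Path(tmp).iterdir()))
# ['train_config.json', 'train_config.schema.json']
```

The `.schema.json` file can then be pointed at by editors or CI validators, while the plain `.json` serves as an editable starting config.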
CLI Integration
jsonargparse (App Tags)
```python
from jsonargparse import ArgumentParser
from tunablex import add_flags_by_app, build_cfg_from_file_and_args, use_config

import examples.myapp.pipeline as pipeline

parser = ArgumentParser(prog="train_jsonarg_app")
parser.add_argument("--config", help="Optional config JSON")
AppConfig = add_flags_by_app(parser, app="train")

args = parser.parse_args()
cfg_dict = build_cfg_from_file_and_args(AppConfig, args)
cfg = AppConfig.model_validate(cfg_dict)
with use_config(cfg):
    pipeline.train_main()
```
jsonargparse (Static Entry Analysis)
```python
from jsonargparse import ArgumentParser
from tunablex import add_flags_by_entry, build_cfg_from_file_and_args, use_config

import examples.myapp.pipeline as pipeline

parser = ArgumentParser(prog="train_jsonarg_trace")  # name preserved for backwards compat
parser.add_argument("--config", help="Optional config JSON")
AppConfig = add_flags_by_entry(parser, pipeline.train_main)  # or add_flags_by_trace(...)

args = parser.parse_args()
cfg_dict = build_cfg_from_file_and_args(AppConfig, args)
cfg = AppConfig.model_validate(cfg_dict)
with use_config(cfg):
    pipeline.train_main()
```
Generated flags look like:

```
--debug --model.hidden_units --model.dropout --model.preprocess.dropna ... --train.epochs --train.batch_size --train.optimizer
```

Root-level parameters (from the Main class or functions with no namespace) appear directly as --param, while nested parameters use dotted notation. Boolean flags support --no-... negation via jsonargparse's BooleanOptionalAction.
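The stdlib argparse action of the same name behaves equivalently, so the negation pattern can be tried in isolation:

```python
import argparse

parser = argparse.ArgumentParser()
# BooleanOptionalAction (Python 3.9+) auto-generates the --no-debug negation
parser.add_argument("--debug", action=argparse.BooleanOptionalAction, default=False)

print(parser.parse_args([]).debug)              # False (default)
print(parser.parse_args(["--debug"]).debug)     # True
print(parser.parse_args(["--no-debug"]).debug)  # False
```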
argparse (Entry Analysis)
See examples/argparse_trace/train_trace.py for schema generation & loading using static analysis (schema_by_entry_ast, load_config_for_entry).
Config Merge Order
1. Pydantic defaults (from each `@tunable` Field / default value)
2. JSON file loaded via `--config` (if provided)
3. CLI overrides (flags explicitly present on the command line)
This precedence is verified by the test suite (tests/test_overrides.py).
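The precedence itself is easy to reproduce with a right-biased recursive merge. This is a generic sketch of the behavior, not tunablex's code:

```python
def deep_merge(base, override):
    """Recursive dict merge where values in `override` win."""
    out = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(out.get(key), dict):
            out[key] = deep_merge(out[key], value)
        else:
            out[key] = value
    return out

defaults = {"train": {"epochs": 10, "batch_size": 32, "optimizer": "adam"}}
file_cfg = {"train": {"epochs": 20}}       # from --config
cli_args = {"train": {"batch_size": 64}}   # explicit CLI flags

merged = deep_merge(deep_merge(defaults, file_cfg), cli_args)
print(merged)  # {'train': {'epochs': 20, 'batch_size': 64, 'optimizer': 'adam'}}
```

Values untouched by later layers (here `optimizer`) survive from the defaults.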
API Reference (Exports)
- Decorator: `tunable`
- Centralized parameters: `TunableParameters` (base class for inheritance-based namespaces)
- Composition (apps): `make_app_config_for`, `schema_for_apps`, `defaults_for_apps`, `load_app_config`
- Composition (entry): `make_app_config_for_entry`, `schema_by_entry_ast`, `load_config_for_entry`
- Schema output: `write_schema`
- Runtime: `use_config`
- CLI helpers: `add_flags_by_app`, `add_flags_by_entry`, `add_flags_by_trace` (alias), `build_cfg_from_file_and_args`
Migration From Tracing
Previous API: `schema_by_trace` and `add_flags_by_trace(entrypoint)` performed runtime execution to discover call chains. These have been superseded by static AST analysis:

- Use `schema_by_entry_ast(entrypoint)` instead of `schema_by_trace`.
- Use `add_flags_by_entry` (alias: `add_flags_by_trace`) for CLI flag generation.

Benefits:

- No side-effects or data loading just to build a config schema.
- Faster repeated schema generation in CI / docs.
- Works in restrictive or sandboxed environments.
Examples Directory
- `examples/myapp/pipeline.py` – shared tunable functions (traditional approach).
- `examples/myapp/params.py` + `pipeline_params.py` – centralized `TunableParameters` approach.
- `examples/argparse_app/train_app.py` – classic app-tag flow.
- `examples/jsonargparse_app/train_jsonarg_app.py` – jsonargparse + tags.
- `examples/jsonargparse_app/train_jsonarg_params.py` – jsonargparse + centralized parameters.
- `examples/argparse_trace/train_trace.py` – entrypoint static analysis with schema generation.
- `examples/jsonargparse_trace/train_jsonarg_trace.py` – jsonargparse + static analysis.
- `examples/trace_generate_schema.py` – write schema + defaults to disk (AST based).
Run tests:

```
pytest -q
```
Philosophy
Keep tunable definition close to the logic; avoid giant central configs. Let the decorator accumulate structure automatically while remaining explicit and type‑checked. Provide zero‑execution schema generation so packaging, documentation, and deployment pipelines stay safe and reproducible.
Contributors
Thanks to all the people who have contributed to tunableX:
- Jacques PAPPER (@jackpap) - Original author and maintainer
- Vincent Drouet (@vincentdrouet) - Core contributor
AI Development Assistance
- Claude Sonnet 4 - implementation
- ChatGPT 5 - implementation
See CONTRIBUTORS.md for more details.
We welcome contributions! Feel free to open issues, submit PRs, or reach out with ideas.
License
MIT