
HyperArgs

HyperArgs is a typed configuration library for Python.
It lets you define hyperparameters and settings as strongly typed Python classes, parse them from multiple sources (command line, JSON, TOML, YAML), enforce dependency constraints, and monitor field changes with decorators.

HyperArgs provides:

  • Typed arguments — IntArg, FloatArg, StrArg, BoolArg, OptionArg.
  • Config classes — subclass Conf to declare structured, type-safe settings.
  • Multi-source parsing — load from CLI, JSON, TOML, YAML, or dict.
  • Dependency management — declare relationships between fields.
  • Change monitoring — automatically trigger updates when fields change.
  • Environment variable binding — args can bind to env vars via env_bind.

Installation

pip install hyperargs

Quick Start

1. Define a configuration

# file: train.py

from hyperargs import Conf, add_dependency, monitor_on
from hyperargs.args import IntArg, FloatArg, StrArg, BoolArg, OptionArg

class TrainConf(Conf):
    learning_rate = FloatArg(0.001, min_value=1e-6, max_value=1.0)
    batch_size = IntArg(32, min_value=1)
    num_epochs = IntArg(10, min_value=1)
    optimizer = OptionArg("adam", options=["adam", "sgd", "rmsprop"])
    use_gpu = BoolArg(True)

    @monitor_on("learning_rate")
    def adjust_schedule(self):
        print(f"Learning rate changed to {self.learning_rate.value()}")

conf = TrainConf.parse_command_line(strict=True)
print(conf)

...
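The field-watching behavior above can be pictured in plain Python. The sketch below is a conceptual illustration only, not HyperArgs's actual implementation: it hooks `__setattr__` so that assigning a watched attribute fires the registered callback (the `Watched`, `TrainSketch`, and `last_event` names are invented for this example).

```python
class Watched:
    # attribute name -> list of callback method names
    _watchers = {}

    @classmethod
    def monitor(cls, field):
        def decorator(fn):
            cls._watchers.setdefault(field, []).append(fn.__name__)
            return fn
        return decorator

    def __setattr__(self, name, value):
        object.__setattr__(self, name, value)
        # fire any callbacks registered for this attribute
        for cb_name in type(self)._watchers.get(name, []):
            getattr(self, cb_name)()

class TrainSketch(Watched):
    @Watched.monitor("learning_rate")
    def adjust_schedule(self):
        self.last_event = f"learning_rate changed to {self.learning_rate}"

t = TrainSketch()
t.learning_rate = 0.01
print(t.last_event)  # learning_rate changed to 0.01
```

HyperArgs's real `monitor_on` also interacts with typed `Arg` wrappers and dependency ordering; this sketch only shows the trigger-on-assignment idea.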

2. Parse configurations

  • From command line:
python train.py --config_path config.yaml

Supported formats: .json, .toml, .yaml, .yml.

  • From Python dict:
conf = TrainConf.from_dict({"learning_rate": 0.01, "batch_size": 64})
print(conf.to_yaml())

3. Save configurations

conf.save_to_file("config.json")

Supported formats: .json, .toml, .yaml, .yml.


4. Add dependencies

For example, you may want to ensure that the optimizer type is decided before its corresponding optimizer config is parsed.

class OptimizerConf(Conf):
    lr = FloatArg(1e-3)

class AdamConf(OptimizerConf):
    beta1 = FloatArg(0.9)
    beta2 = FloatArg(0.999)

class SGDConf(OptimizerConf):
    momentum = FloatArg(0.9)

# Ensure optimizer type is decided first, then configs are parsed
@Conf.add_dependency("optim_type", "configs")
class OptimConf(Conf):
    optim_type = OptionArg("adam", options=["adam", "sgd"])
    configs = OptimizerConf()

    # Switch the config to the correct type when optim_type is set
    @Conf.monitor_on('optim_type')
    def init_configs(self) -> None:
        if self.optim_type.value() == 'adam':
            self.configs = AdamConf()
        elif self.optim_type.value() == 'sgd':
            self.configs = SGDConf()
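Under the hood, a dependency declaration like `add_dependency("optim_type", "configs")` amounts to applying incoming fields in topological order, so parents are assigned before their children. The helper below is a conceptual sketch under that assumption (not the library's code), using the standard library's `graphlib`:

```python
from graphlib import TopologicalSorter

def ordered_keys(keys, dependencies):
    # dependencies: (parent, child) pairs; a parent must be applied first.
    ts = TopologicalSorter({k: set() for k in keys})
    for parent, child in dependencies:
        ts.add(child, parent)  # child depends on parent
    return [k for k in ts.static_order() if k in keys]

incoming = {"configs": {"beta1": 0.8}, "optim_type": "adam"}
order = ordered_keys(list(incoming), [("optim_type", "configs")])
print(order)  # ['optim_type', 'configs']
```

Applying `optim_type` first lets the `init_configs` monitor swap in `AdamConf` before the nested `configs` values are assigned.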

5. Environment variable binding

Arguments can bind directly to environment variables. For example, setting the MASTER_ADDR for distributed training:

import os
os.environ["MASTER_ADDR"] = "192.168.1.42"

class DistConf(Conf):
    master_addr = StrArg("127.0.0.1", env_bind="MASTER_ADDR")

conf = DistConf()
print(conf.master_addr.value())  # "192.168.1.42"
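The resolution rule implied by the example — use the environment variable when it is set, otherwise fall back to the declared default — can be sketched in a few lines. This is the assumed semantics of `env_bind`, not HyperArgs source:

```python
import os
from typing import Optional

def resolve(default: str, env_bind: Optional[str] = None) -> str:
    # Prefer the bound environment variable when present; else the default.
    if env_bind and env_bind in os.environ:
        return os.environ[env_bind]
    return default

os.environ["MASTER_ADDR"] = "192.168.1.42"
print(resolve("127.0.0.1", env_bind="MASTER_ADDR"))  # 192.168.1.42
```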

API Overview

You can import the main API directly:

from hyperargs import Conf, add_dependency, monitor_on
from hyperargs.args import IntArg, FloatArg, StrArg, BoolArg, OptionArg

  • Arg subclasses
    • IntArg(default, min_value=None, max_value=None, allow_none=False, env_bind=None)
    • FloatArg(default, min_value=None, max_value=None, allow_none=False, env_bind=None)
    • StrArg(default, allow_none=False, env_bind=None)
    • BoolArg(default, allow_none=False, env_bind=None)
    • OptionArg(default, options, allow_none=False, env_bind=None)
  • Conf — base class for config schemas.
  • monitor_on(fields) — decorator to watch fields and trigger methods.
  • add_dependency(parent, child) — enforce field dependency order.
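The `min_value`/`max_value`/`allow_none` parameters suggest validation along the following lines. This is a hypothetical sketch of the assumed checking semantics for `IntArg`, not HyperArgs's implementation:

```python
def validate_int(value, min_value=None, max_value=None, allow_none=False):
    # None is accepted only when explicitly allowed.
    if value is None:
        if allow_none:
            return None
        raise ValueError("value may not be None")
    # bool is a subclass of int in Python, so reject it explicitly.
    if not isinstance(value, int) or isinstance(value, bool):
        raise TypeError(f"expected int, got {type(value).__name__}")
    if min_value is not None and value < min_value:
        raise ValueError(f"{value} is below min_value {min_value}")
    if max_value is not None and value > max_value:
        raise ValueError(f"{value} is above max_value {max_value}")
    return value

print(validate_int(32, min_value=1))  # 32
```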

Example Workflow

# config.yaml
learning_rate: 0.01
batch_size: 64
num_epochs: 20
optimizer: sgd
use_gpu: false

# run: python train.py --config_path config.yaml
conf = TrainConf.parse_command_line()
print(conf.optimizer.value())  # "sgd"

Roadmap

Stage 1 — Config files & strings ✅

  • Parse from JSON, TOML, YAML strings
  • Parse from config files
  • Convert and save configs

Stage 2 — GUI settings ✅

  • Popup browser page to set configs interactively
  • Export configs after editing

Stage 3 — Command line flags 🚧

  • Parse configs directly from CLI flags
    (e.g. --optimizer.params.learning_rate 0.01 --batch_size 64)
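Since this stage is still in progress, the mapping from dotted flags to nested settings is not final; a plausible sketch of the behavior the roadmap example implies (assumed, not the library's parser) is:

```python
def flags_to_dict(argv):
    # Pair each --dotted.flag with the value that follows it, building
    # nested dicts one dotted segment at a time.
    out = {}
    for flag, value in zip(argv[::2], argv[1::2]):
        node = out
        *parents, leaf = flag.lstrip("-").split(".")
        for key in parents:
            node = node.setdefault(key, {})
        node[leaf] = value
    return out

args = ["--optimizer.params.learning_rate", "0.01", "--batch_size", "64"]
print(flags_to_dict(args))
# {'optimizer': {'params': {'learning_rate': '0.01'}}, 'batch_size': '64'}
```

Values are left as strings here; in HyperArgs the typed `Arg` declarations would be the natural place to coerce them.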

License

Apache License 2.0

