
HyperArgs

HyperArgs is a typed configuration library with an automatic web GUI for Python.

It lets you define hyperparameters and settings as strongly typed Python classes, parse them from multiple sources (command line, JSON, TOML, or YAML) or configure them through a web GUI, enforce dependency constraints, and monitor field changes with decorators.

Quick Start

1. Install

pip install hyperargs

2. Try it out

  • (Recommended!) From the GUI interface:
python example/example.py --from_web

You can save the configuration to a file after you have configured it.


  • From saved configuration file:
python example/example.py --config_path example/TrainConf.toml

Supported formats: .json, .toml, .yaml, .yml.
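Internally, a loader for these formats would dispatch on the file extension. Here is a conceptual sketch in plain Python (the function name `load_config` is hypothetical, not part of the HyperArgs API):

```python
import json
from pathlib import Path

def load_config(path):
    """Parse a config file into a plain dict, dispatching on extension.

    Hypothetical helper illustrating format dispatch; HyperArgs' real
    loader is internal to the library and also validates field types.
    """
    suffix = Path(path).suffix.lower()
    text = Path(path).read_text()
    if suffix == ".json":
        return json.loads(text)
    elif suffix == ".toml":
        import tomllib                      # standard library, Python 3.11+
        return tomllib.loads(text)
    elif suffix in (".yaml", ".yml"):
        import yaml                         # third-party PyYAML
        return yaml.safe_load(text)
    raise ValueError(f"unsupported config format: {suffix}")
```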

  • From command line flags:
python train.py --parse_json '{
  "batch_size": 32,
  "num_epochs": 10,
  "optimizer_conf": {
    "lr": 5e-5,
    "momentum": 0.04
  },
  "optimizer_type": "sgd",
  "use_gpu": true
}'
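A nested payload like the one above has to be merged field by field over the schema defaults, so that unspecified fields (such as `message` here) keep their default values. A minimal deep-merge sketch of that idea, not HyperArgs' actual implementation:

```python
def deep_update(defaults, overrides):
    """Recursively merge `overrides` into a copy of `defaults`.

    Conceptual sketch of how a nested JSON payload could override
    schema defaults; HyperArgs' real parsing also validates each
    value against its typed Arg declaration.
    """
    merged = dict(defaults)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_update(merged[key], value)
        else:
            merged[key] = value
    return merged

defaults = {"batch_size": 64, "optimizer_conf": {"lr": 1e-3, "momentum": 0.9}}
merged = deep_update(defaults, {"batch_size": 32, "optimizer_conf": {"lr": 5e-5}})
# "momentum" survives because only "lr" was overridden inside optimizer_conf
```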

3. Config dependencies

Many programs require dynamic dependencies between fields. HyperArgs lets you define monitors and dependencies between fields so that dependent fields are updated automatically when a value changes.

In example.py, the config of optimizer_conf depends on optimizer_type. Thus we add a monitor method, change_optimizer, that switches the config to the correct type whenever optimizer_type is set.

During initialization or parsing, the monitors are triggered to update the dependent fields. To ensure that dependent fields are updated correctly, you must declare the dependency order. In example.py, we declare the dependency optimizer_type -> optimizer_conf, so the change_optimizer monitor is triggered before the arguments for optimizer_conf are parsed.

In another case, the value of conditioned_arg depends on int_arg, so we add the dependency conditioned_arg -> int_arg. During parsing, the change_b monitor is triggered after conditioned_arg has been parsed, so conditioned_arg ends up reflecting the parsed value of int_arg rather than its own default.

@add_dependency("optimizer_type", "optimizer_conf")
@add_dependency("conditioned_arg", "int_arg")
@add_dependency("len_lst", "lst")
class TrainConf(Conf):
    message = StrArg("Hello World!")
    batch_size = IntArg(32, min_value=1)
    num_epochs = IntArg(10, min_value=1)
    optimizer_type = OptionArg("adam", options=["adam", "sgd"])
    use_gpu = BoolArg(True)
    optimizer_conf = Conf()

    int_arg = IntArg(100)
    conditioned_arg = IntArg(200)
    len_lst = IntArg(0, min_value=0)
    lst = []

    @monitor_on("optimizer_type")   # Change "optimizer_conf" when "optimizer_type" changes
    def change_optimizer(self):
        if self.optimizer_type.value() == "adam":
            if not isinstance(self.optimizer_conf, AdamConf):
                self.optimizer_conf = AdamConf()
        elif self.optimizer_type.value() == "sgd":
            if not isinstance(self.optimizer_conf, SGDConf):
                self.optimizer_conf = SGDConf()

    @monitor_on("int_arg")  # Change "conditioned_arg" when "int_arg" changes
    def change_b(self):
        v = self.int_arg.value()
        if v is not None:
            self.conditioned_arg = self.conditioned_arg.parse(v * 2)

    @monitor_on("len_lst")  # Change "lst" when "len_lst" changes
    def change_lst(self):
        len_lst = self.len_lst.value()
        if len_lst is not None:
            if len(self.lst) > len_lst:
                self.lst = self.lst[:len_lst]
            elif len(self.lst) < len_lst:
                self.lst.extend([SGDConf() for _ in range(len_lst - len(self.lst))])
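The monitor mechanism above can be pictured as an observer pattern: assigning to a watched field fires a registered callback. A minimal, library-free sketch of that idea (this is illustrative only, not HyperArgs' internals):

```python
class MiniConf:
    """Minimal observer sketch: assigning to a watched field fires a callback.

    Illustrates the idea behind @monitor_on; HyperArgs' real Conf also
    handles typed Args, validation, and dependency ordering.
    """
    _monitors = {"optimizer_type": "change_optimizer"}  # field -> method name

    def __setattr__(self, name, value):
        object.__setattr__(self, name, value)
        callback = self._monitors.get(name)
        if callback:
            getattr(self, callback)()

    def change_optimizer(self):
        # Swap the nested config whenever optimizer_type changes,
        # mirroring TrainConf.change_optimizer above.
        presets = {"adam": {"lr": 1e-3}, "sgd": {"lr": 1e-2, "momentum": 0.9}}
        object.__setattr__(self, "optimizer_conf", presets[self.optimizer_type])

conf = MiniConf()
conf.optimizer_type = "sgd"   # the assignment itself updates optimizer_conf
```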

4. Environment variable binding

Arguments can bind directly to environment variables. For example, setting the MASTER_ADDR for distributed training:

import os
os.environ["MASTER_ADDR"] = "192.168.1.42"

class DistConf(Conf):
    master_addr = StrArg("127.0.0.1", env_bind="MASTER_ADDR")

conf = DistConf()
print(conf.master_addr.value())  # "192.168.1.42"
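The env_bind behavior shown above amounts to an environment lookup with a default fallback. A pure-Python sketch of that semantics (`resolve_env` is a hypothetical helper, not part of the library):

```python
import os

def resolve_env(default, env_bind=None):
    """Return the bound environment variable's value when set, else the default.

    Hypothetical helper mirroring the env_bind semantics above;
    HyperArgs' real StrArg also type-checks the resolved value.
    """
    if env_bind is not None and env_bind in os.environ:
        return os.environ[env_bind]
    return default

os.environ["MASTER_ADDR"] = "192.168.1.42"
master_addr = resolve_env("127.0.0.1", env_bind="MASTER_ADDR")  # "192.168.1.42"
```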

API Overview

You can import the main API directly:

from hyperargs import Conf, add_dependency, monitor_on
from hyperargs.args import IntArg, FloatArg, StrArg, BoolArg, OptionArg

  • Arg subclasses
    • IntArg(default, min_value=None, max_value=None, allow_none=False, env_bind=None)
    • FloatArg(default, min_value=None, max_value=None, allow_none=False, env_bind=None)
    • StrArg(default, allow_none=False, env_bind=None)
    • BoolArg(default, env_bind=None)
    • OptionArg(default, options, allow_none=False, env_bind=None)
  • Conf — base class for config schemas.
  • monitor_on(fields) — decorator to watch fields and trigger methods.
  • add_dependency(parent, child) — enforce field dependency order.
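The min_value, max_value, and allow_none parameters on the numeric Args imply validate-on-parse semantics. A minimal pure-Python sketch of that idea (MiniIntArg is illustrative only, not the library's implementation):

```python
class MiniIntArg:
    """Sketch of IntArg-style validation: bounds checking and None handling.

    Conceptual only; HyperArgs' real IntArg also supports env_bind and
    integrates with Conf parsing and the web GUI.
    """
    def __init__(self, default, min_value=None, max_value=None, allow_none=False):
        self.min_value = min_value
        self.max_value = max_value
        self.allow_none = allow_none
        self._value = self._check(default)

    def _check(self, v):
        if v is None:
            if not self.allow_none:
                raise ValueError("None is not allowed for this Arg")
            return None
        v = int(v)
        if self.min_value is not None and v < self.min_value:
            raise ValueError(f"{v} is below min_value {self.min_value}")
        if self.max_value is not None and v > self.max_value:
            raise ValueError(f"{v} is above max_value {self.max_value}")
        return v

    def parse(self, raw):
        # Mirror the `arg.parse(...)` usage in change_b above:
        # return a new validated Arg rather than mutating in place.
        return MiniIntArg(raw, self.min_value, self.max_value, self.allow_none)

    def value(self):
        return self._value
```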

Example Workflow

# config.yaml
learning_rate: 0.01
batch_size: 64
num_epochs: 20
optimizer: sgd
use_gpu: false

# train.py
conf = TrainConf.parse_command_line()
# run: python train.py --config_path config.yaml
print(conf.optimizer.value())  # "sgd"

Roadmap

Stage 1 — Config files & strings ✅

  • Parse from JSON, TOML, YAML strings
  • Parse from config files
  • Convert and save configs

Stage 2 — GUI settings ✅

  • Popup browser page to set configs interactively
  • Export configs after editing

Stage 3 — Command line flags 🚧

  • Parse configs directly from CLI flags
    (e.g. --optimizer.params.learning_rate 0.01 --batch_size 64)
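Dotted flags like these map naturally onto nested structures. Since this stage is still in progress, here is only a hedged sketch of how such flags could be folded into a nested dict (`flags_to_nested` is hypothetical, not the planned API):

```python
def flags_to_nested(argv):
    """Fold ["--a.b", "1", "--c", "2"] into {"a": {"b": "1"}, "c": "2"}.

    Hypothetical sketch of dotted-flag parsing; values are left as
    strings here, whereas HyperArgs would coerce them through the
    typed Arg declarations.
    """
    nested = {}
    for flag, raw in zip(argv[::2], argv[1::2]):
        keys = flag.lstrip("-").split(".")
        node = nested
        for key in keys[:-1]:
            node = node.setdefault(key, {})   # descend, creating dicts as needed
        node[keys[-1]] = raw
    return nested
```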

License

Apache License 2.0

