
HyperArgs

HyperArgs is a typed configuration library with an automatic web GUI for Python.

It lets you define hyperparameters and settings as strongly typed Python classes, parse them from multiple sources (command line, JSON, TOML, YAML) or configure them through a web GUI, enforce dependency constraints, and monitor field changes with decorators.

Quick Start

1. Install

pip install hyperargs

2. Try it out

  • (Recommended!) From the GUI:
python example/example.py --from_web

You can save the configuration to a file after you have configured it.


  • From saved configuration file:
python example/example.py --config_path example/TrainConf.toml

Supported formats: .json, .toml, .yaml, .yml.

  • From command line flags:
python train.py --parse_json '{
  "batch_size": 32,
  "num_epochs": 10,
  "optimizer_conf": {
    "lr": 5e-5,
    "momentum": 0.04
  },
  "optimizer_type": "sgd",
  "use_gpu": true
}'
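The file-based options above presumably pick a parser from the file extension. The sketch below shows that kind of dispatch in plain Python; `load_config` is a hypothetical helper for illustration, not part of the HyperArgs API:

```python
import json
from pathlib import Path

def load_config(path: str) -> dict:
    """Pick a parser based on the file extension (illustrative only)."""
    p = Path(path)
    suffix = p.suffix.lower()
    if suffix == ".json":
        return json.loads(p.read_text())
    if suffix == ".toml":
        import tomllib  # standard library since Python 3.11
        return tomllib.loads(p.read_text())
    if suffix in (".yaml", ".yml"):
        import yaml  # third-party: PyYAML
        return yaml.safe_load(p.read_text())
    raise ValueError(f"unsupported config format: {suffix!r}")
```

Whatever the real implementation looks like, this is why `--config_path` works unchanged across all four extensions.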

3. Config dependencies

Many programs require dynamic dependencies between fields. HyperArgs lets you define monitors and dependencies between fields so that dependent fields update automatically when a value changes.

In example.py, the structure of optimizer_conf depends on optimizer_type, so we add a monitor method change_optimizer that switches the config to the correct type whenever optimizer_type is set.

During initialization or parsing, the monitors are triggered to update the dependent fields. To ensure that dependent fields are updated correctly, you must declare the dependency order. In example.py we declare the dependency optimizer_type -> optimizer_conf, so the change_optimizer monitor is triggered before the arguments for optimizer_conf are parsed.

In another case, the value of conditioned_arg depends on int_arg, so we add the dependency conditioned_arg -> int_arg. During parsing, the change_b monitor is triggered after conditioned_arg has been parsed, so conditioned_arg is updated from the parsed value of int_arg rather than from its default.

@add_dependency("optimizer_type", "optimizer_conf")
@add_dependency("conditioned_arg", "int_arg")
@add_dependency("len_lst", "lst")
class TrainConf(Conf):
    message = StrArg("Hello World!")
    batch_size = IntArg(32, min_value=1)
    num_epochs = IntArg(10, min_value=1)
    optimizer_type = OptionArg("adam", options=["adam", "sgd"])
    use_gpu = BoolArg(True)
    optimizer_conf = Conf()

    int_arg = IntArg(100)
    conditioned_arg = IntArg(200)
    len_lst = IntArg(0, min_value=0)
    lst = []

    @monitor_on("optimizer_type")   # Change "optimizer_conf" when "optimizer_type" changes
    def change_optimizer(self):
        if self.optimizer_type.value() == "adam":
            if not isinstance(self.optimizer_conf, AdamConf):
                self.optimizer_conf = AdamConf()
        elif self.optimizer_type.value() == "sgd":
            if not isinstance(self.optimizer_conf, SGDConf):
                self.optimizer_conf = SGDConf()

    @monitor_on("int_arg")  # Change "conditioned_arg" when "int_arg" changes
    def change_b(self):
        v = self.int_arg.value()
        if v is not None:
            self.conditioned_arg = self.conditioned_arg.parse(v * 2)

    @monitor_on("len_lst")  # Change "lst" when "len_lst" changes
    def change_lst(self):
        len_lst = self.len_lst.value()
        if len_lst is not None:
            if len(self.lst) > len_lst:
                self.lst = self.lst[:len_lst]
            elif len(self.lst) < len_lst:
                self.lst.extend([SGDConf() for _ in range(len_lst - len(self.lst))])
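The triggering order described above can be sketched in plain Python. Everything below (`MiniConf`, `_parse_order`) is a conceptual toy model of the behaviour, not how HyperArgs is actually implemented:

```python
class MiniConf:
    """Toy model of dependency-ordered parsing with monitors."""

    def __init__(self):
        self.int_arg = 100
        self.conditioned_arg = 200
        # monitor_on("int_arg"): fire change_b whenever int_arg is set
        self._monitors = {"int_arg": [self._change_b]}
        # add_dependency("conditioned_arg", "int_arg"):
        # conditioned_arg is parsed before int_arg
        self._parse_order = ["conditioned_arg", "int_arg"]

    def _change_b(self):
        # conditioned_arg always tracks twice the current int_arg
        self.conditioned_arg = self.int_arg * 2

    def parse(self, values):
        for name in self._parse_order:
            if name in values:
                setattr(self, name, values[name])
                for hook in self._monitors.get(name, []):
                    hook()

conf = MiniConf()
conf.parse({"conditioned_arg": 999, "int_arg": 5})
# change_b fires after int_arg is parsed, so the explicit 999
# is overwritten: conditioned_arg ends up 5 * 2 = 10
```

This is why declaring the order matters: if int_arg were parsed first, change_b would fire before conditioned_arg was read, and the monitor's update would then be clobbered by the incoming value.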

4. Environment variable binding

Arguments can bind directly to environment variables. For example, setting the MASTER_ADDR for distributed training:

import os
os.environ["MASTER_ADDR"] = "192.168.1.42"

class DistConf(Conf):
    master_addr = StrArg("127.0.0.1", env_bind="MASTER_ADDR")

conf = DistConf()
print(conf.master_addr.value())  # "192.168.1.42"
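Conceptually, an env-bound argument takes its value from the environment when the variable is set and falls back to the declared default otherwise. A minimal sketch of that lookup follows; `resolve_env_default` is a hypothetical helper showing the assumed semantics, not a HyperArgs function:

```python
import os

def resolve_env_default(default, env_bind=None):
    """Return the bound environment variable's value if it is set,
    otherwise the declared default (illustrative semantics)."""
    if env_bind is not None and env_bind in os.environ:
        return os.environ[env_bind]
    return default

os.environ["MASTER_ADDR"] = "192.168.1.42"
addr = resolve_env_default("127.0.0.1", env_bind="MASTER_ADDR")
# addr == "192.168.1.42"; with MASTER_ADDR unset it would be "127.0.0.1"
```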

API Overview

You can import the main API directly:

from hyperargs import Conf, add_dependency, monitor_on
from hyperargs.args import IntArg, FloatArg, StrArg, BoolArg, OptionArg
  • Arg subclasses
    • IntArg(default, min_value=None, max_value=None, allow_none=False, env_bind=None)
    • FloatArg(default, min_value=None, max_value=None, allow_none=False, env_bind=None)
    • StrArg(default, allow_none=False, env_bind=None)
    • BoolArg(default, env_bind=None)
    • OptionArg(default, options, allow_none=False, env_bind=None)
  • Conf — base class for config schemas.
  • monitor_on(fields) — decorator to watch fields and trigger methods.
  • add_dependency(parent, child) — enforce field dependency order.
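The min_value/max_value parameters imply range checks when a value is assigned. A rough stand-in for that validation in plain Python (`BoundedInt` is illustrative; the real IntArg may differ in details):

```python
class BoundedInt:
    """Illustrative stand-in for IntArg's bounds checking."""

    def __init__(self, default, min_value=None, max_value=None, allow_none=False):
        self.min_value = min_value
        self.max_value = max_value
        self.allow_none = allow_none
        self._value = self._validate(default)

    def _validate(self, v):
        if v is None:
            if not self.allow_none:
                raise ValueError("None is not allowed for this argument")
            return None
        v = int(v)
        if self.min_value is not None and v < self.min_value:
            raise ValueError(f"{v} is below min_value {self.min_value}")
        if self.max_value is not None and v > self.max_value:
            raise ValueError(f"{v} is above max_value {self.max_value}")
        return v

    def value(self):
        return self._value

batch_size = BoundedInt(32, min_value=1)   # accepted
# BoundedInt(0, min_value=1) would raise ValueError
```

Validating at construction (and on every assignment) means an out-of-range value fails fast at parse time instead of surfacing deep inside training code.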

Example Workflow

# config.yaml
learning_rate: 0.01
batch_size: 64
num_epochs: 20
optimizer: sgd
use_gpu: false

# train.py (run: python train.py --config_path config.yaml)
conf = TrainConf.parse_command_line()
print(conf.optimizer.value())  # "sgd"

Roadmap

Stage 1 — Config files & strings ✅

  • Parse from JSON, TOML, YAML strings
  • Parse from config files
  • Convert and save configs

Stage 2 — GUI settings ✅

  • Popup browser page to set configs interactively
  • Export configs after editing

Stage 3 — Command line flags 🚧

  • Parse configs directly from CLI flags
    (e.g. --optimizer.params.learning_rate 0.01 --batch_size 64)

License

Apache License 2.0

Download files

Download the file for your platform.

Source Distribution

hyperargs-0.1.2.tar.gz (2.0 MB)

Uploaded Source

Built Distribution


hyperargs-0.1.2-py3-none-any.whl (17.3 kB)

Uploaded Python 3

File details

Details for the file hyperargs-0.1.2.tar.gz.

File metadata

  • Download URL: hyperargs-0.1.2.tar.gz
  • Upload date:
  • Size: 2.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for hyperargs-0.1.2.tar.gz
Algorithm Hash digest
SHA256 b1ecd9201b16fc4bcfa079e02a3450907500d2f72c230b617837e0f3a10ae5ee
MD5 90aaa8fd1c7aad451cd4c66cc9c02888
BLAKE2b-256 379d14e042d42a244bb2264c0bf559a0905541aa0a0a8e0c768ec489195bf09e


Provenance

The following attestation bundles were made for hyperargs-0.1.2.tar.gz:

Publisher: python-publish.yml on TYTTYTTYT/HyperArgs

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file hyperargs-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: hyperargs-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 17.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for hyperargs-0.1.2-py3-none-any.whl
Algorithm Hash digest
SHA256 03ce6ee3c45d7c1858fd9f75c3afaeec074d60deeecf6e50f28ccdb8d4152ee2
MD5 91a7bd47d66c9a8727874348599593b1
BLAKE2b-256 663fb4dbf0c0382cb8118bf0d61df7fcec79e69894e411fb9c6cc4c2e86394fd


Provenance

The following attestation bundles were made for hyperargs-0.1.2-py3-none-any.whl:

Publisher: python-publish.yml on TYTTYTTYT/HyperArgs

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
