dcargs

Portable, reusable, strongly typed CLIs from dataclass definitions.

dcargs
is a tool for generating portable, reusable, and strongly typed CLI
interfaces from dataclass definitions.
We expose one function, `parse(Type[T]) -> T`, which takes a dataclass type and
instantiates it via an argparse-style CLI interface:
```python
import dataclasses

import dcargs


@dataclasses.dataclass
class Args:
    field1: str
    field2: int


args = dcargs.parse(Args)
```
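To illustrate the underlying idea (this is a rough sketch using only the standard library, not dcargs's actual implementation), a parser can be built by walking the dataclass fields, registering one flag per field, and instantiating the dataclass from the parsed namespace:

```python
import argparse
import dataclasses


@dataclasses.dataclass
class Args:
    field1: str
    field2: int


def parse_from_dataclass(cls, argv=None):
    parser = argparse.ArgumentParser()
    for field in dataclasses.fields(cls):
        # Each field becomes a required flag; the field's annotation
        # converts the raw string argument to the right type.
        parser.add_argument(f"--{field.name}", type=field.type, required=True)
    namespace = parser.parse_args(argv)
    return cls(**vars(namespace))


args = parse_from_dataclass(Args, ["--field1", "hello", "--field2", "3"])
print(args)  # Args(field1='hello', field2=3)
```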
The `parse` function supports dataclasses containing:

- Native types: `str`, `int`, `float`
- Boolean flags
- Enums (via `enum.Enum`)
- Optional types
- Literal types (populates `choices`)
- Sequence and list types
- Forward references (including in unions)
- Automatic helptext generation
- Nested dataclasses
    - Simple nesting
    - Unions over child structures (subparsers)
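For illustration, a dataclass combining several of these supported annotation kinds might look like the following (the class and field names here are invented; the flags each field would map to follow the conventions shown in the example below):

```python
import dataclasses
import enum
from typing import List, Literal, Optional


class Color(enum.Enum):
    RED = enum.auto()
    BLUE = enum.auto()


@dataclasses.dataclass
class Config:
    verbose: bool                   # Boolean flag.
    color: Color                    # Enum; values become choices.
    mode: Literal["train", "eval"]  # Literal populates choices.
    tags: List[str] = dataclasses.field(default_factory=list)  # Sequence type.
    checkpoint: Optional[str] = None  # Optional type.


config = Config(verbose=True, color=Color.RED, mode="train")
```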
dcargs is very similar to datargs and simple-parsing; a comparison is coming soon!
Example
The following code:
"""An argument parsing example.
Note that there are multiple possible ways to document dataclass attributes, all
of which are supported by the automatic helptext generator.
"""
import dataclasses
import enum
import dcargs
class OptimizerType(enum.Enum):
ADAM = enum.auto()
SGD = enum.auto()
@dataclasses.dataclass
class OptimizerConfig:
# Variant of SGD to use.
type: OptimizerType
# Learning rate to use.
learning_rate: float = 3e-4
# Coefficient for L2 regularization.
weight_decay: float = 1e-2
@dataclasses.dataclass
class ExperimentConfig:
experiment_name: str # Experiment name to use.
optimizer: OptimizerConfig
seed: int = 0
"""Random seed. This is helpful for making sure that our experiments are
all reproducible!"""
config = dcargs.parse(ExperimentConfig)
print(config)
Generates the following argument parser:
```
$ python example.py --help
usage: example.py [-h] --experiment-name EXPERIMENT_NAME --optimizer-type {ADAM,SGD} [--optimizer-learning-rate OPTIMIZER_LEARNING_RATE]
                  [--optimizer-weight-decay OPTIMIZER_WEIGHT_DECAY] [--seed SEED]

An argument parsing example.

Note that there are multiple possible ways to document dataclass attributes, all
of which are supported by the automatic helptext generator.

optional arguments:
  -h, --help            show this help message and exit
  --optimizer-learning-rate OPTIMIZER_LEARNING_RATE
                        Learning rate to use. (float, default: 0.0003)
  --optimizer-weight-decay OPTIMIZER_WEIGHT_DECAY
                        Coefficient for L2 regularization. (float, default: 0.01)
  --seed SEED           Random seed. This is helpful for making sure that our experiments are
                        all reproducible! (int, default: 0)

required arguments:
  --experiment-name EXPERIMENT_NAME
                        Experiment name to use. (str)
  --optimizer-type {ADAM,SGD}
                        Variant of SGD to use. (str)
```
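The flag names in the helptext above follow a simple convention: nested fields are prefixed with their parent field's name, and underscores are rendered as dashes. A small sketch of that naming scheme (this mirrors the generated flags; it is not dcargs's internal code):

```python
def flag_name(*parts: str) -> str:
    # Join nested field names with dashes and convert
    # snake_case to kebab-case.
    return "--" + "-".join(p.replace("_", "-") for p in parts)


print(flag_name("experiment_name"))             # --experiment-name
print(flag_name("optimizer", "learning_rate"))  # --optimizer-learning-rate
```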
Source Distribution: dcargs-0.0.1.tar.gz (8.2 kB)

Built Distribution: dcargs-0.0.1-py3-none-any.whl (8.8 kB)