Project description

Co-Rider


Tiny configuration library tailored for Deep Learning projects and the Ride library.

pip install corider

Organising configurations and arguments for Deep Learning projects

Keeping track of, merging, and exposing configurations as arguments can be cumbersome and introduces a lot of boilerplate code. This tiny library aims to introduce a configuration structure that will fit many Deep Learning projects.

A basic configuration is defined as follows:

from corider import Configs

c1 = Configs()
c1.add(
    name="learning_rate",
    type=float,
    default=2,
    strategy="loguniform",
    description="Learning rate for optimizer",
    choices=(1e-8, 1),
)
c1.add(
    name="optimizer",
    type=str,
    default="sgd",
    strategy="constant",
    description="Optimizer to use.",
    choices=["sgd", "adam"],
)

Argparse

Co-Rider is fully compatible with argparse and can both load and dump argparse configurations:

# argparse_example.py
from argparse import ArgumentParser
from corider import Configs

parser = ArgumentParser(add_help=True)
parser.add_argument(
    "--defined_with_argparse",
    default=42,
    choices=(42, 1337),
    type=int,
    help="Nonsensical parameter defined for demo purposes.",
)

c2 = Configs.from_argument_parser(parser)

c2.add(
    name="defined_with_corider",
    type=str,
    default="lit",
    description="Another parameter for demo purposes",
    choices=["lit", "woke"],
)

new_parser = c2.add_argparse_args(ArgumentParser(add_help=True))

args = new_parser.parse_args()

# Do something with the args

Use from shell as usual:

$ python argparse_example.py --help
usage: argparse_example.py [-h] [--defined_with_argparse {42,1337}]
                           [--defined_with_corider {lit,woke}]

optional arguments:
  -h, --help            show this help message and exit
  --defined_with_argparse {42,1337}
                        Nonsensical parameter defined for demo purposes.
                        (Default: 42)
  --defined_with_corider {lit,woke}
                        Another parameter for demo purposes (Default: lit)

Ray Tune

By now you may have wondered about the strategy parameter. This parameter tells hyperparameter optimizers which sampling strategy to employ during hyperparameter search.

Four strategies are available:

  • "constant": Parameter is not searchable and must be selected elsewhere, e.g. using argparse
  • "choice": Choose randomly from a list/set/tuple/range of parameters, e.g. ["lit", "woke"]
  • "uniform": Pick values at random from an interval, e.g. (0, 10)
  • "loguniform": Pick values in a log uniform manner, e.g. (1e-8, 1)

For now, an automatic export to Ray[Tune] is included, which can be used as follows:

from argparse import ArgumentParser
from ray import tune

# c is a Configs object, defined as in the examples above
# Configs which had strategy "constant" can be added as argparse args
parser = c.add_tune_argparse_args(ArgumentParser())
args = parser.parse_args()

# Other parameters are exported in a Tune-compatible format
tune_config = c.tune_config()

# Run search
analysis = tune.run(
    your_training_function,
    config=tune_config,
    ... # Other tune.run parameters
)
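
For context, your_training_function above is a placeholder: a trainable passed to tune.run receives one sampled configuration per trial as a dict. A minimal sketch, where the config keys and the reported metric are illustrative assumptions rather than part of Co-Rider:

from ray import tune

def your_training_function(config):
    # Each trial receives one sampled configuration,
    # e.g. {"dropout": 0.2, "weight_decay": 1e-4, ...}
    dropout = config["dropout"]
    weight_decay = config["weight_decay"]
    # ... train the model here, then report a metric back to Tune
    tune.report(loss=dropout + weight_decay)  # dummy metric for illustration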

Argument addition and subtraction

Co-Rider can add and subtract configs as needed:

c1 = ...  # As defined above (has: "learning_rate", "optimizer")

c2 = ...  # As defined above (has: "defined_with_argparse", "defined_with_corider")

c3 = Configs()
c3.add(
    name="learning_rate",  # Also defined in c1
    type=float,
    default=2,
    strategy="loguniform",
    description="Learning rate for optimizer",
    choices=(1e-8, 1),
)

# Has: "optimizer," "defined_with_argparse", "defined_with_corider"
c4 = c1 + c2 - c3  
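
The merged configuration behaves like any other Configs object; for instance, it can be exposed through argparse with the same add_argparse_args call shown earlier (a short sketch):

from argparse import ArgumentParser

parser = c4.add_argparse_args(ArgumentParser(add_help=True))
args = parser.parse_args()  # exposes optimizer, defined_with_argparse and defined_with_corider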

Load configuration from file

A configuration can be loaded from either .yaml or .json formatted files:

# example_conf.yaml
dropout:
  type: float
  strategy: choice
  choices: [0.0, 0.1, 0.2, 0.3, 0.4]
learning_rate:
  type: float
  strategy: loguniform
  choices: [0.01, 0.5]
weight_decay:
  type: float
  strategy: loguniform
  choices: [0.000001, 0.001]
c = Configs.from_file("example_conf.yaml")
