Project description

TorchConfig

TorchConfig is a Python package that simplifies configuring PyTorch.

Suppose you want to test several optimizers to find which one works best with your model. Here is one way you could do that:

if CONFIG["optimizer_name"] == "SGD":
    optimizer = optim.SGD(
        net.parameters(),
        lr=CONFIG["optimizer_lr"],
        momentum=CONFIG["optimizer_momentum"],
        dampening=CONFIG["optimizer_dampening"],
        weight_decay=CONFIG["optimizer_weight_decay"],
        nesterov=CONFIG["optimizer_nesterov"],
    )
# ... cases for other optimizers omitted ...
elif CONFIG["optimizer_name"] == "Adam":
    optimizer = optim.Adam(
        net.parameters(),
        lr=CONFIG["optimizer_lr"],
        betas=CONFIG["optimizer_betas"],
        eps=CONFIG["optimizer_eps"],
        weight_decay=CONFIG["optimizer_weight_decay"],
        amsgrad=CONFIG["optimizer_amsgrad"],
    )

With TorchConfig, this is just one line!

optimizer = torchconfig.get_optimizer_from_dict(net.parameters(), CONFIG)
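
For this call to work, CONFIG needs to hold the optimizer settings in the dictionary format shown under How to Use below. A minimal sketch of such a dictionary; the "momentum" key is an illustrative assumption about what gets forwarded to optim.SGD:

CONFIG = {
    "name": "SGD",    # which torch.optim class to construct
    "lr": 0.1,        # learning rate passed to the constructor
    "momentum": 0.9,  # assumed pass-through keyword argument
}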

Installation

pip install torchconfig

How to Use

You can construct any optimizer or lr_scheduler by passing its name and parameters either as dictionary entries or as keyword arguments.

optimizer_config = {"name": "SGD", "lr": 0.1}
optimizer = torchconfig.get_optimizer_from_args(net.parameters(), name="SGD", lr=0.1)
# or
optimizer = torchconfig.get_optimizer_from_args(net.parameters(), **optimizer_config)
# or
optimizer = torchconfig.get_optimizer_from_dict(net.parameters(), optimizer_config)

lr_scheduler_config = {"name": "CyclicLR", "base_lr": 0.01, "max_lr": 1}
lr_scheduler = torchconfig.get_lr_scheduler_from_args(optimizer, name="CyclicLR", base_lr=0.01, max_lr=1)
# or
lr_scheduler = torchconfig.get_lr_scheduler_from_args(optimizer, **lr_scheduler_config)
# or
lr_scheduler = torchconfig.get_lr_scheduler_from_dict(optimizer, lr_scheduler_config)
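
Putting the pieces together, here is a minimal end-to-end sketch. The torchconfig calls are the ones documented above; the nested CONFIG layout, the toy model, the training loop, and the extra "momentum" key are illustrative assumptions.

import torch
import torch.nn as nn
import torchconfig

# Illustrative nested config; only the {"name": ..., <kwargs>} format comes from the examples above.
CONFIG = {
    "optimizer": {"name": "SGD", "lr": 0.1, "momentum": 0.9},
    "lr_scheduler": {"name": "CyclicLR", "base_lr": 0.01, "max_lr": 1},
}

net = nn.Linear(10, 2)  # toy model standing in for a real network

optimizer = torchconfig.get_optimizer_from_dict(net.parameters(), CONFIG["optimizer"])
lr_scheduler = torchconfig.get_lr_scheduler_from_dict(optimizer, CONFIG["lr_scheduler"])

for step in range(3):  # toy training loop
    optimizer.zero_grad()
    loss = net(torch.randn(8, 10)).sum()
    loss.backward()
    optimizer.step()
    lr_scheduler.step()

Trying a different optimizer or scheduler then only means editing CONFIG, which is the point of the library.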

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torchconfig-0.1.3.tar.gz (3.1 kB)

Uploaded Source

Built Distribution

torchconfig-0.1.3-py3-none-any.whl (3.9 kB)

Uploaded Python 3

File details

Details for the file torchconfig-0.1.3.tar.gz.

File metadata

  • Download URL: torchconfig-0.1.3.tar.gz
  • Upload date:
  • Size: 3.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.4.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for torchconfig-0.1.3.tar.gz

  • SHA256: d2a9706a425e8bc6107545f6c4c723572eedce262454fdc3612e0ee9c6d33d66
  • MD5: 5581521c866f4c0c5cca4611a576b475
  • BLAKE2b-256: f995871685ec4d41a8307cc3b694116409febc8aef2d586571b8b47aa9dd7911

See more details on using hashes here.
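
If you want to check a download against the digests listed above before installing, here is a small sketch using only the standard library; the SHA256 value and filename are the ones shown above, and the local path is an assumption:

import hashlib

# Expected SHA256 for torchconfig-0.1.3.tar.gz, copied from the hash list above.
EXPECTED_SHA256 = "d2a9706a425e8bc6107545f6c4c723572eedce262454fdc3612e0ee9c6d33d66"

with open("torchconfig-0.1.3.tar.gz", "rb") as f:  # assumes the sdist was downloaded to the current directory
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else "hash mismatch")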

File details

Details for the file torchconfig-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: torchconfig-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 3.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.4.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for torchconfig-0.1.3-py3-none-any.whl

  • SHA256: 624173b4e5b3a0a2c8c15bb86d1eb102303664ac2cc7e82c7bc82de7eb70faee
  • MD5: 948e5df2b7b3ed3685e83a3a8b83d034
  • BLAKE2b-256: 7987d5ff3fc8b1c768a80a62461b8a00b0920becdaa3d6c96f875593ded35c97

See more details on using hashes here.
