
Automatically create a config of hyper-parameters from global variables

Project description




AutoHparams

Automatically create an hparams dictionary of hyper-parameters from your global variables!


View Demo · Report Bug

About The Project

Have you ever run a machine learning training job for hours, only to realize days later that you forgot to save some hyper-parameters? Then you have to pray your memory is good, or re-run everything.

AutoHparams is a tool that dumps every variable in the global scope of your code into a dictionary that you can save using your favorite tool. This helps avoid the vast majority of missed-hyper-parameter situations.

Getting Started

Using AutoHparams is a one-liner.

Install AutoHparams with pip:

pip install autohparams

Import it in your code by adding this line:

from autohparams import get_auto_hparams

To get an hparams dictionary, just do:

hparams = get_auto_hparams(globals())

Advanced tip
By default, get_auto_hparams ignores variables whose names start with an underscore (_). This makes it convenient to exclude variables you don't want in the hparams.
For example:

lr = 0.001  # we want to report learning rate
bs = 64     # we want to report batch size
_gpu = 0    # we don't want to report the gpu used
hparams = get_auto_hparams(globals())
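
Assuming only the plain, non-underscore variables are captured (exactly which other types get filtered out is an implementation detail of AutoHparams), the dictionary would then contain:

print(hparams)  # {'lr': 0.001, 'bs': 64} (note: _gpu was skipped)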

Usage

You can now use it in any framework you want, for example:
Tensorboard

import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

with tf.summary.create_file_writer('logs/hparam_tuning').as_default():
  hp.hparams(hparams)

MLflow

import mlflow

with mlflow.start_run():
  mlflow.log_params(hparams)

Weights & Biases (wandb)

import wandb

wandb.init(config=hparams)

Comet.ml

from comet_ml import Experiment

experiment = Experiment()
experiment.log_parameters(hparams)

Neptune.ai

import neptune.new as neptune

run = neptune.init()

run['parameters'] = hparams

Pytorch Lightning

import pytorch_lightning as pl

trainer = pl.Trainer(logger=...)
trainer.logger.log_hyperparams(hparams)

Guild AI

from guild import ipy as guild

# Sketch: run a training function (here a hypothetical `train`)
# with the hparams passed as Guild flag values
guild.run(train, **hparams)

Polyaxon

from polyaxon import tracking

tracking.init()
tracking.log_inputs(**hparams)

ClearML

from clearml import Task

task = Task.init()
task.set_parameters(hparams)

Kubeflow

from kubeflow.metadata import metadata

store = metadata.Store()
store.log_metadata(hparams)
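
Plain JSON file

If you don't use any tracking framework, you can always dump the dictionary to disk with the standard library (a minimal sketch, assuming every captured value is JSON-serializable):

import json

with open('hparams.json', 'w') as f:
    json.dump(hparams, f, indent=2)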

Even faster import

If you like Python wizardry, you can also import and use autohparams itself as a function:

import autohparams
config = autohparams(globals())

It doesn't get any easier than that!
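
For the curious: a common way to make a module callable in Python (a sketch of the general trick, not necessarily what AutoHparams does internally) is to swap the module's class for a ModuleType subclass that defines __call__, typically in the package's __init__.py:

import sys
from types import ModuleType

class _CallableModule(ModuleType):
    def __call__(self, global_vars):
        # delegate to the module-level get_auto_hparams function
        return self.get_auto_hparams(global_vars)

# replace this module's class so that `autohparams(...)` works
sys.modules[__name__].__class__ = _CallableModule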


Contributing

Contributions are welcome!

Roadmap/todo

Non-code contributions:

Task: Adding documentation
Importance: 4/5
Difficulty: 1/5
Contributor on it: NOBODY
Description: Write basic tutorials with real-life scenarios, and write a wiki to help other contributors better understand the inner workings of the library.

For every todo, just click on the link to find the discussion where I describe how I would do it.
See the discussions for a full list of proposed features (and known issues).


How to contribute

Contributing is an awesome way to learn, inspire, and help others. Any contributions you make are greatly appreciated, even if it's just about styling and best practices.

If you have a suggestion that would make this project better, please fork the repo and create a pull request.
Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

Authors

This library was created by Nicolas MICAUX.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

autohparams-0.1.0.tar.gz (5.3 kB)

File details

Details for the file autohparams-0.1.0.tar.gz.

File metadata

  • Download URL: autohparams-0.1.0.tar.gz
  • Upload date:
  • Size: 5.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.6

File hashes

Hashes for autohparams-0.1.0.tar.gz

SHA256: b2d4940fb83c9962743bfaaa1f0a11de57a4946f02b195375cd0c8687194bf79
MD5: 2c807945aff45215076ca0ad6007c5f2
BLAKE2b-256: 48be1c3c5adea917b19091f529053285e76e93c1746e7f2ba72b07fb42efc439

See more details on using hashes here.
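
If you want to check a downloaded file against these digests yourself, a quick sketch using Python's standard library:

import hashlib

expected = 'b2d4940fb83c9962743bfaaa1f0a11de57a4946f02b195375cd0c8687194bf79'

with open('autohparams-0.1.0.tar.gz', 'rb') as f:
    actual = hashlib.sha256(f.read()).hexdigest()

assert actual == expected, 'hash mismatch!'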
