
A practice example package

Project description

Training scripts

Dataset-independent

  • train.py: train one model (e.g., beta-vae, IWAE, bivae) on one specific hyperparameter config

    • E.g., train BiVAE on osmnx_roads data from the following cities, with images rendered in the specified background colors (--bgcolors):
    nohup python train.py --model_name="bivae" \
    --latent_dim=10 --hidden_dims 32 64 128 256 --adv_dim 32 32 32 --adv_weight 1.0 \
    --data_root="/data/hayley-old/osmnx_data/images" \
    --data_name="osmnx_roads" \
    --cities 'la' 'charlotte' 'vegas' 'boston' 'paris' \
         'amsterdam' 'shanghai' 'seoul' 'chicago' 'manhattan' \
         'berlin' 'montreal' 'rome' \
    --bgcolors "k" "r" "g" "b" "y" --n_styles=5 \
    --zooms 14 \
    --gpu_id=2 --max_epochs=300   --terminate_on_nan=True  \
    -lr 3e-4 -bs 32 \
    --log_root="/data/hayley-old/Tenanbaum2000/lightning_logs/2021-05-18/" &
    
    • E.g., train BiVAE on Rotated MNIST, optionally on a specified subset (given as a filepath to an .npy file containing indices into the original MNIST training set); a numpy sketch of creating such a file follows this list
    ## Specify which indices of the MNIST training set to use -- comparable to DIVA's experiments
    ## Change the 0 in the filename to any value in 0,...,9
    nohup python train.py --model_name="bivae" \
    --latent_dim=128 --hidden_dims 32 64 64 64 --adv_dim 32 32 32 \
    --data_name="multi_rotated_mnist" --angles -45 0 45 --n_styles=3 \
    --selected_inds_fp='/data/hayley-old/Tenanbaum2000/data/Rotated-MNIST/supervised_inds_0.npy' \
    --gpu_id=2
    
    • E.g., train BiVAE on multiple styles of maptiles from the specified cities
    # Train BiVAE on Multi Maptiles
    nohup python train.py --model_name="bivae" \
    --latent_dim=10 --hidden_dims 32 64 128 256 --adv_dim 32 32 32 --adv_weight 15.0 \
    --data_name="multi_maptiles" \
    --cities la paris \
    --styles CartoVoyagerNoLabels StamenTonerBackground --n_styles=3 \
    --zooms 14 \
    --gpu_id=2 --max_epochs=400   --terminate_on_nan=True  \
    -lr 3e-4 -bs 32 \
    --log_root="/data/hayley-old/Tenanbaum2000/lightning_logs/2021-01-23/" &
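
    The --selected_inds_fp argument above expects an .npy file of integer indices into the original MNIST training set. As a rough illustration (not code from this package), such a file could be produced with plain numpy; the filename and subset size below are only examples:

    # Illustrative only: build a supervised-indices file like supervised_inds_0.npy
    # by saving an integer array of indices into the 60,000-image MNIST training set.
    import numpy as np

    rng = np.random.default_rng(0)
    selected_inds = rng.choice(60_000, size=1000, replace=False)  # example subset size
    np.save("supervised_inds_0.npy", selected_inds)  # example filename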
    

Hyperparameter tuning using Ray Tune

  • tune_asha.py: use Tune's AsyncHyperBandScheduler to search the hyperparameter space more efficiently. Use --tune_metric to specify the value passed as tune.run's metric argument, e.g., --tune_metric loss to schedule trials on the reported loss (see the sketch after this list).
  • tune_asha_with_beta_scheduler.py:
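
  As a rough sketch of the pattern tune_asha.py describes (not the script itself), Ray Tune's AsyncHyperBandScheduler can be combined with tune.run, where the value given by --tune_metric (e.g. "loss") is passed as the metric argument. The toy trainable and search space below are hypothetical:

    # Hypothetical example of wiring AsyncHyperBandScheduler into tune.run;
    # the trainable and config are stand-ins, not this package's code.
    from ray import tune
    from ray.tune.schedulers import AsyncHyperBandScheduler

    def trainable(config):
        # Stand-in for one training run; report the metric Tune schedules on.
        for step in range(10):
            loss = (config["lr"] - 3e-4) ** 2 + 1.0 / (step + 1)
            tune.report(loss=loss)

    analysis = tune.run(
        trainable,
        config={"lr": tune.loguniform(1e-5, 1e-2)},
        metric="loss",                      # what --tune_metric would supply
        mode="min",
        scheduler=AsyncHyperBandScheduler(),
        num_samples=8,
    )
    print(analysis.best_config)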

Dataset-specific

Rotated MNIST

  • tune_asha_mnists.py

osmnx_roads

  • tune_asha_osmnx_roads.py

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

reprlearn-0.0.1.tar.gz (65.3 kB)

Uploaded Source

Built Distribution

reprlearn-0.0.1-py3-none-any.whl (107.1 kB)

Uploaded Python 3

File details

Details for the file reprlearn-0.0.1.tar.gz.

File metadata

  • Download URL: reprlearn-0.0.1.tar.gz
  • Upload date:
  • Size: 65.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.6.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.8.3

File hashes

Hashes for reprlearn-0.0.1.tar.gz

  • SHA256: 897b07faab14676e28a64de1acca8615a3a150cc89e86c7bbd7d0bf68baae648
  • MD5: b6e88d98a5f41a0e06fc4811d9173c07
  • BLAKE2b-256: 615addbfdf47de7f24580bcd0984dcd82ae3aef1aecceb9ab0a28e0fcfb8eae6

See more details on using hashes here.
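
For reference, a downloaded file can be checked against the SHA256 digest listed above using Python's standard hashlib; a minimal example (the file is assumed to be in the current directory):

    # Verify the sdist's SHA256 against the digest listed above.
    import hashlib

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    expected = "897b07faab14676e28a64de1acca8615a3a150cc89e86c7bbd7d0bf68baae648"
    print(sha256_of("reprlearn-0.0.1.tar.gz") == expected)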

File details

Details for the file reprlearn-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: reprlearn-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 107.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.6.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.8.3

File hashes

Hashes for reprlearn-0.0.1-py3-none-any.whl

  • SHA256: 424504eb5ae9266c7e0037f25b5a26d834b23ffe4bdd5db547007f5f815b1b1a
  • MD5: 101b93da9acb073f337df34ca471d458
  • BLAKE2b-256: 8255ca61e8846bee044c2dcb3b3f588c086bdab22bca46c65c52c8f132ecf115

See more details on using hashes here.
