Deep learning framework for fast and clean research development with PyTorch

Kerosene

Kerosene is a high-level deep learning framework for fast and clean research development with PyTorch - see the docs for more details. Kerosene lets you focus on your model and data by providing clean and readable code for training, visualizing, and debugging your architecture, without forcing you to implement a rigid interface for your model.

Out of The Box Features

  • Basic training logic and user-defined trainers
  • Fine-grained event system with multiple handlers
  • Support for multiple metrics and criteria
  • Automatic configuration parsing and model instantiation
  • Automatic mixed-precision training (via Apex) and data-parallel training
  • Automatic Visdom logging
  • Integrated Ignite metrics and PyTorch criteria

MNIST Example

Here is a simple example that shows how easy and clean it is to train a simple network. In just a few lines of code, the model is trained with mixed precision and you get Visdom and console logging automatically. See the full example here: MNIST-Kerosene

import logging

import torchvision
from torch.utils.data import DataLoader
from torchvision.transforms import Compose, Normalize, ToTensor

# The Kerosene imports (YamlConfigurationParser, VisdomLogger, VisdomConfiguration,
# ModelTrainerFactory, RunConfiguration, Event and the event handlers), as well as
# SimpleNet and SimpleTrainer, are defined in the full MNIST-Kerosene example.

if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    CONFIG_FILE_PATH = "config.yml"

    model_trainer_config, training_config = YamlConfigurationParser.parse(CONFIG_FILE_PATH)

    train_loader = DataLoader(torchvision.datasets.MNIST('./files/', train=True, download=True, transform=Compose(
        [ToTensor(), Normalize((0.1307,), (0.3081,))])), batch_size=training_config.batch_size_train, shuffle=True)

    test_loader = DataLoader(torchvision.datasets.MNIST('./files/', train=False, download=True, transform=Compose(
        [ToTensor(), Normalize((0.1307,), (0.3081,))])), batch_size=training_config.batch_size_valid, shuffle=True)

    visdom_logger = VisdomLogger(VisdomConfiguration.from_yml(CONFIG_FILE_PATH))

    # Initialize the model trainer
    model_trainer = ModelTrainerFactory(model=SimpleNet()).create(model_trainer_config)

    # Train with the training strategy
    SimpleTrainer("MNIST Trainer", train_loader, test_loader, None, model_trainer, RunConfiguration(use_amp=False)) \
        .with_event_handler(PlotMonitors(every=500, visdom_logger=visdom_logger), Event.ON_BATCH_END) \
        .with_event_handler(PlotAvgGradientPerLayer(every=500, visdom_logger=visdom_logger), Event.ON_TRAIN_BATCH_END) \
        .with_event_handler(PrintTrainingStatus(every=100), Event.ON_BATCH_END) \
        .train(training_config.nb_epochs)
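
The example reads three values from config.yml: batch_size_train, batch_size_valid, and nb_epochs. The exact schema expected by YamlConfigurationParser depends on the Kerosene version, so the fragment below is only a hypothetical sketch built around those three fields; consult the Kerosene docs for the real layout.

```yaml
# Hypothetical sketch -- only the three fields read by the example above are
# grounded in the source; the surrounding structure is an assumption.
training:
  batch_size_train: 64
  batch_size_valid: 1000
  nb_epochs: 10
```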

Events

Event                  Description
ON_TRAINING_BEGIN      At the beginning of the training phase
ON_TRAINING_END        At the end of the training phase
ON_VALID_BEGIN         At the beginning of the validation phase
ON_VALID_END           At the end of the validation phase
ON_TEST_BEGIN          At the beginning of the test phase
ON_TEST_END            At the end of the test phase
ON_EPOCH_BEGIN         At the beginning of each epoch (training, validation, test)
ON_EPOCH_END           At the end of each epoch (training, validation, test)
ON_TRAIN_EPOCH_BEGIN   At the beginning of each training epoch
ON_TRAIN_EPOCH_END     At the end of each training epoch
ON_VALID_EPOCH_BEGIN   At the beginning of each validation epoch
ON_VALID_EPOCH_END     At the end of each validation epoch
ON_TEST_EPOCH_BEGIN    At the beginning of each test epoch
ON_TEST_EPOCH_END      At the end of each test epoch
ON_BATCH_BEGIN         At the beginning of each batch (training, validation, test)
ON_BATCH_END           At the end of each batch (training, validation, test)
ON_TRAIN_BATCH_BEGIN   At the beginning of each training batch
ON_TRAIN_BATCH_END     At the end of each training batch
ON_VALID_BATCH_BEGIN   At the beginning of each validation batch
ON_VALID_BATCH_END     At the end of each validation batch
ON_TEST_BATCH_BEGIN    At the beginning of each test batch
ON_TEST_BATCH_END      At the end of each test batch
ON_FINALIZE            Before the end of the process
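
As a library-agnostic illustration of how a fine-grained event system of this shape can be wired (this is a minimal sketch, not Kerosene's actual implementation), handlers register per event and may fire only every N occurrences, as with the every=500 arguments in the MNIST example:

```python
from collections import defaultdict
from enum import Enum, auto


class Event(Enum):
    # A small subset of the events listed in the table above
    ON_TRAINING_BEGIN = auto()
    ON_EPOCH_BEGIN = auto()
    ON_BATCH_END = auto()
    ON_TRAINING_END = auto()


class Trainer:
    """Minimal dispatcher: handlers register per event, each with an
    `every` stride controlling how often it actually fires."""

    def __init__(self):
        self._handlers = defaultdict(list)
        self._counts = defaultdict(int)

    def with_event_handler(self, handler, event, every=1):
        self._handlers[event].append((handler, every))
        return self  # allow chaining, as in the MNIST example

    def _fire(self, event):
        self._counts[event] += 1
        for handler, every in self._handlers[event]:
            if self._counts[event] % every == 0:
                handler(event)

    def train(self, nb_epochs, nb_batches):
        self._fire(Event.ON_TRAINING_BEGIN)
        for _ in range(nb_epochs):
            self._fire(Event.ON_EPOCH_BEGIN)
            for _ in range(nb_batches):
                self._fire(Event.ON_BATCH_END)
        self._fire(Event.ON_TRAINING_END)


seen = []
Trainer().with_event_handler(lambda e: seen.append(e.name), Event.ON_BATCH_END, every=2) \
         .train(nb_epochs=1, nb_batches=4)
print(seen)  # the handler fires on batches 2 and 4 only
```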

Handlers

  • PrintTrainingStatus (Console)
  • PrintMonitors (Console)
  • PlotMonitors (Visdom)
  • PlotLosses (Visdom)
  • PlotMetrics (Visdom)
  • PlotCustomVariables (Visdom)
  • PlotLR (Visdom)
  • PlotAvgGradientPerLayer (Visdom)
  • Checkpoint
  • EarlyStopping
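
Checkpoint and EarlyStopping are the two non-logging handlers. As a hedged sketch of the underlying idea (the actual Kerosene handler's signature and parameters may differ), early stopping typically tracks a monitored value and stops training after `patience` consecutive checks without improvement:

```python
class EarlyStopping:
    """Generic early-stopping logic: signal a stop when the monitored loss has
    not improved by at least `min_delta` for `patience` consecutive checks.
    Illustrative sketch only, not Kerosene's actual handler."""

    def __init__(self, patience=3, min_delta=0.0):
        self._patience = patience
        self._min_delta = min_delta
        self._best = float("inf")
        self._bad_epochs = 0

    def should_stop(self, valid_loss):
        if valid_loss < self._best - self._min_delta:
            # Improvement: remember the new best and reset the counter
            self._best = valid_loss
            self._bad_epochs = 0
        else:
            self._bad_epochs += 1
        return self._bad_epochs >= self._patience


stopper = EarlyStopping(patience=2)
losses = [1.0, 0.8, 0.9, 0.85, 0.95]
stops = [stopper.should_stop(loss) for loss in losses]
print(stops)  # stops on the second consecutive non-improving epoch
```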

Contributing

How to contribute?

  • Create a branch per feature and/or bug fix
  • Get the code
  • Commit and push
  • Create a pull request

Branch naming

Feature branch

feature/ [Short feature description] [Issue number]

Bug branch

fix/ [Short fix description] [Issue number]

Commit syntax:

Adding code:

+ Added [Short Description] [Issue Number]

Deleting code:

- Deleted [Short Description] [Issue Number]

Modifying code:

* Changed [Short Description] [Issue Number]

Merging code:

Y Merged [Short Description] [Issue Number]

Icons made by Freepik from www.flaticon.com are licensed under CC BY 3.0.
