Project description
LolaML - track your ML experiments
THIS IS AN ALPHA. DON’T USE IT!
Track your machine learning experiments with LolaML, and never lose information or forget which parameters yielded which results. Lola creates a simple JSON representation of each run that contains all the information you logged. The JSON can easily be shared to collaborate with friends and colleagues. Lola strives to be non-magic and simple, but configurable.
Features:

- a simple logging interface
- a simple representation of the logged data
- works with any machine learning library
- automatically creates an artifact folder for each run
- automatically uploads artifacts to a remote bucket (if you want)
- simple Jupyter notebook dashboard (more to come…)
```python
import os

import lolaml as lola

# Use the Run context manager to start/end a run
with lola.Run(project="mnist", prefix_path="data/experiments") as run:
    # a unique id for the run
    print(run.run_id)
    # store all artifacts (model files, images, etc.) here
    print(run.path)  # -> data/experiments/<run_id>

    run.log_param("lr", 0.1)
    run.log_param("epochs", 10)
    run.log_tags("WIP", "RNN")

    # Create and train your model...
    run.log_metric("loss", loss, step=1)
    run.log_metric("train_acc", train_acc, step=1)
    run.log_metric("val_acc", val_acc, step=1)

    model.save(os.path.join(run.path, "model.pkl"))

# After a run there is a lola_run*.json file under run.path.
# It contains all the information you logged.
```
After the run there is a JSON file that looks something like this:
```json
{
    "project": "mnist",
    "run_id": "9a531da0-dc43-4dcc-8968-77fd480ff7ee",
    "status": "done",
    "path": "data/experiments/9a531da0-dc43-4dcc-8968-77fd480ff7ee",
    "user": "stefan",
    "start_time": "2019-02-16 12:49:32.782958",
    "end_time": "2019-02-16 12:49:32.814529",
    "metrics": [
        {
            "name": "loss",
            "value": 1.5,
            "step": 1,
            "ts": "2019-02-16 12:49:32.813750"
        },
        ...
    ],
    "params": {
        "lr": "0.1",
        "epochs": 10
    },
    "tags": ["WIP", "RNN"],
    "artifacts": {
        "data/experiments/9a531da0-dc43-4dcc-8968-77fd480ff7ee/lola_run_9a531da0-dc43-4dcc-8968-77fd480ff7ee.json": {},
        ...
    },
    "git": {
        "sha": "41cb4fb11b7e937c602c2282b9275200c88b8797",
        "status": "...",
        "diff": "..."
    },
    "call_info": {
        "__file__": "somefile.py",
        "argv": []
    }
}
```
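Because the run file is plain JSON, you can inspect it with nothing but the standard library. A minimal sketch (the run data below is trimmed down and uses illustrative values, not real output):

```python
import json

# A trimmed-down run file in the shape Lola writes (values illustrative)
run_json = """
{
  "project": "mnist",
  "params": {"lr": "0.1", "epochs": 10},
  "metrics": [
    {"name": "loss", "value": 1.5, "step": 1},
    {"name": "val_acc", "value": 0.9, "step": 1}
  ]
}
"""

run = json.loads(run_json)

# Collect the last logged value per metric name
final = {m["name"]: m["value"] for m in run["metrics"]}
print(run["project"], run["params"], final)
```

In a real project you would `json.load()` the `lola_run*.json` file from `run.path` instead of parsing an inline string.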
Lola can automatically upload all artifacts to a remote storage bucket for you:
```python
with lola.Run(
    remote_location="gs://somewhere",
    remote_credentials="service_account.json",
) as run:
    # train and log
    ...
# All artifacts are uploaded now
```
The remote location can also be configured via a .lola.toml file.
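The exact schema of .lola.toml isn't documented here; the sketch below assumes the keys simply mirror the `Run()` arguments shown above:

```toml
# .lola.toml — key names assumed to mirror the Run() arguments (unverified)
remote_location = "gs://somewhere"
remote_credentials = "service_account.json"
```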
Additionally, Lola offers some helpers to analyse your experiments:
TODO add image of dashboard
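Even without Lola's own dashboard helpers, the one-JSON-file-per-run layout makes it easy to compare runs yourself. A self-contained sketch (the directory layout and the `val_acc` metric name are assumptions; here two fake run files are written to a temp dir to stand in for real output):

```python
import glob
import json
import os
import tempfile

# Two trimmed-down run files standing in for real Lola output (values illustrative)
runs = [
    {"run_id": "run-a", "params": {"lr": "0.1"}, "metrics": [{"name": "val_acc", "value": 0.90}]},
    {"run_id": "run-b", "params": {"lr": "0.01"}, "metrics": [{"name": "val_acc", "value": 0.94}]},
]

root = tempfile.mkdtemp()
for run in runs:
    os.makedirs(os.path.join(root, run["run_id"]))
    with open(os.path.join(root, run["run_id"], f"lola_run_{run['run_id']}.json"), "w") as f:
        json.dump(run, f)

# Flatten every run into one row: run id, params, and each metric's value
rows = []
for path in glob.glob(os.path.join(root, "*", "lola_run_*.json")):
    with open(path) as f:
        run = json.load(f)
    row = {"run_id": run["run_id"], **run["params"]}
    for m in run["metrics"]:
        row[m["name"]] = m["value"]
    rows.append(row)

# Best validation accuracy first
rows.sort(key=lambda r: r["val_acc"], reverse=True)
print(rows[0]["run_id"])
```

The same loop pointed at your real `prefix_path` gives you a table of runs ready for pandas or a notebook.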
Setup
Requirements
Python 3.6+ (it probably works with other versions as well, but this has not been tested)
Installation
Install this library directly into an activated virtual environment:
$ pip install lolaml
or add it to your Poetry project:
$ poetry add lolaml
Misc
This project was generated with cookiecutter using jacebrowning/template-python. Thanks!
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file lolaml-0.0.2.tar.gz
File metadata
- Download URL: lolaml-0.0.2.tar.gz
- Upload date:
- Size: 13.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/0.12.11 CPython/3.6.8 Linux/4.15.0-45-generic
File hashes
Algorithm | Hash digest
---|---
SHA256 | beddfd860c657a8476a2aa7f6faece600d3f6ffe98526046dc6eb1a95a708cce
MD5 | 5910b47b2062d1586ebb7b3277a94cba
BLAKE2b-256 | 17290f2d2195bb89548c2f5c5c32df1a81bc1f098b9f7541f68dbd40571d7d89
Provenance
File details
Details for the file lolaml-0.0.2-py3-none-any.whl
File metadata
- Download URL: lolaml-0.0.2-py3-none-any.whl
- Upload date:
- Size: 28.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/0.12.11 CPython/3.6.8 Linux/4.15.0-45-generic
File hashes
Algorithm | Hash digest
---|---
SHA256 | d312207e52bfdd9a01b33b17d7b3a3ef68f5824a807af399634e7a74465724d2
MD5 | fe19f42905da4bcf2025f9c1685838a7
BLAKE2b-256 | c55080efa62baeca2f0a45f5f28307bb7dee38fb7f64235bb4cc802bff294296