Simple tools for logging experiments in algorithm engineering
AeMeasure
Provides a Python module for simply recording measurements of algorithm engineering experiments. A scheduler may be added later.
from aemeasure import Measurement, read_as_pd

with Measurement("my_database.json", capture_stdout="stdout", capture_stderr="stderr") as m:
    m["instance"] = "fancy_instance"
    m["size"] = 42  # e.g., the instance size
    m["algorithm"] = "fancy_algorithm"
    m["parameters"] = "asdgfdfgsdgf"
    solution = run_algorithm()
    m["solution"] = solution.to_json_dict()
    m["objective"] = 42
    m["lower_bound"] = 13
    m.save_metadata()

table = read_as_pd("my_database.json", ["instance", "size", "algorithm", "runtime"])
table.plot(x="size", y="runtime")
The following data can easily be saved:
- Runtime (enter and exit of Measurement)
- stdout/stderr
- Git Revision
- Timestamp of start
- Hostname
- Arguments
- Python-File
- Current working directory
Except for stdout and stderr, these values are saved automatically when calling m.save_metadata().
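For illustration, most of these fields correspond to what the Python standard library exposes directly. The sketch below shows how such metadata can be collected; the function and key names are illustrative, not AeMeasure's exact schema.

```python
import datetime
import os
import socket
import sys


def collect_metadata():
    # Sketch of the kind of metadata save_metadata() records
    # (illustrative key names, not AeMeasure's actual schema).
    return {
        "timestamp": datetime.datetime.now().isoformat(),
        "hostname": socket.gethostname(),
        "argv": sys.argv,
        "cwd": os.getcwd(),
    }
```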
The database is currently a simple JSON file, which is rather inefficient but works. Do not write to it from parallel processes, as concurrent writes can collide. Use a separate file per process when working in parallel!
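One simple way to obtain collision-free files is to derive the database name from the host and process ID. This helper is a hypothetical sketch, not part of AeMeasure:

```python
import os
import socket


def worker_database_path(base="my_database"):
    # One database file per host and process, so concurrent workers
    # never write to the same JSON file.
    return f"{base}.{socket.gethostname()}.{os.getpid()}.json"
```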
A parallel version is planned that sets up a local server and also supports multiple workstations. It should also make scheduling easy.
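Until then, results written to per-process files can be merged by hand. A sketch, assuming each database is a JSON file containing a list of measurement dicts (the helper name is hypothetical):

```python
import glob
import json


def merge_databases(pattern):
    # Concatenate the measurement lists from all JSON databases
    # whose filenames match the given glob pattern.
    rows = []
    for path in sorted(glob.glob(pattern)):
        with open(path) as f:
            rows.extend(json.load(f))
    return rows
```

The merged list of dicts can then be handed to pandas (e.g. pandas.DataFrame(rows)) for analysis.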
Hashes for aemeasure-0.0.17-py3-none-any.whl:

Algorithm | Hash digest
---|---
SHA256 | 3ed5c6f338a60212a54e03f39a29444312e4b7ed28edb7826b45f5cd6529f30f
MD5 | 640f62481a23574fd213b8e832545c11
BLAKE2b-256 | c8eb946b5a3e0600548d9704dd2ba8bb6e66f6644e9adb8a47460954ca5128b1