Project description

pyerm (Python Experiment Record Manager)

This project is an experiment record manager for Python based on the SQLite DBMS, which can help you efficiently save your experiment settings and results for later analysis.

In the current version, all operations will be performed locally.

Introduction

This project is used to save the settings and results of any experiment, which consists of three parts: method, data, and task.

Besides, the basic information and detailed information of the experiment will also be recorded.

All the data you want can be saved efficiently through the provided API without knowing the implementation details, but I suggest reading the table introduction before working with the records further.

Install Introduction

All you need to do to use the Python package is run the following command:

pip install pyerm

Workflow Introduction

Table Define & Init

Before starting the experiment, you need to initialize the tables you need with three init functions: data_init(), method_init(), task_init().

You need to input the name and the experiment parameters for the first two. These functions can automatically detect the data type, and they will create the table if it does not exist. If you want to define the SQL column types yourself, you can pass a param_def_dict to these functions, whose keys are column names and whose values are SQL column type definitions, like {"people": "TEXT DEFAULT NULL"}.
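Since pyerm is backed by SQLite, a param_def_dict boils down to a CREATE TABLE statement. The sketch below shows that mapping with the stdlib sqlite3 module; the table name `method_MyMethod` and the auto-generated id column name are illustrative assumptions, not pyerm's exact schema:

```python
import sqlite3

# A param_def_dict maps column names to SQL column definitions.
param_def_dict = {"people": "TEXT DEFAULT NULL", "lr": "REAL DEFAULT 0.01"}

# Roughly what an init function does under the hood (a sketch, not pyerm's
# actual implementation): build a CREATE TABLE IF NOT EXISTS statement with
# an automatic setting id plus the user-defined columns.
columns = ", ".join(f"{name} {sql_def}" for name, sql_def in param_def_dict.items())
sql = (
    "CREATE TABLE IF NOT EXISTS method_MyMethod "
    f"(method_id INTEGER PRIMARY KEY AUTOINCREMENT, {columns})"
)

conn = sqlite3.connect(":memory:")
conn.execute(sql)

# Inspect the resulting schema: the setting id column plus the user columns.
cols = [row[1] for row in conn.execute("PRAGMA table_info(method_MyMethod)")]
print(cols)  # ['method_id', 'people', 'lr']
conn.close()
```

Running it a second time is harmless because of IF NOT EXISTS, which matches the "create the table if it does not exist" behavior described above.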

Experiment

The experiment recorder mainly consists of four parts: experiment_start(), experiment_over(), experiment_failed(), and detail_update(). From the names of these functions, you can easily tell where and how to use them.

experiment_start() saves the basic experiment information before the experiment formally starts and sets the experiment status to running.

experiment_over() saves the experiment results after the experiment ends and sets the experiment status to over.

experiment_failed() saves the reason why the experiment failed and sets the experiment status to failed.

detail_update() saves the intermediate results. It's optional; if you never use it and don't manually set the definition dict, the detail table may not be created.
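The status lifecycle these functions manage can be sketched directly against SQLite. The table and column names below (experiment_list, status, failure_reason) follow the table descriptions later in this document but are assumptions, not pyerm's exact schema:

```python
import sqlite3

# A minimal sketch of the status transitions performed by
# experiment_start(), experiment_over(), and experiment_failed().
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS experiment_list ("
    "experiment_id INTEGER PRIMARY KEY AUTOINCREMENT, "
    "description TEXT, status TEXT, failure_reason TEXT DEFAULT NULL)"
)

# experiment_start(): record basic info, status becomes 'running'.
cur = conn.execute(
    "INSERT INTO experiment_list (description, status) VALUES (?, 'running')",
    ("baseline run",),
)
exp_id = cur.lastrowid

# experiment_over(): results go to the result table, status becomes 'over'.
conn.execute(
    "UPDATE experiment_list SET status='over' WHERE experiment_id=?", (exp_id,)
)

# experiment_failed() would instead record the reason and set status='failed':
# UPDATE experiment_list SET status='failed', failure_reason=? WHERE ...

status = conn.execute(
    "SELECT status FROM experiment_list WHERE experiment_id=?", (exp_id,)
).fetchone()[0]
print(status)  # over
conn.close()
```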

Table Introduction

Experiment Table

All experiments' basic information is recorded in the experiment_list table. It contains the description of the method; the method (with its setting id), data (with its setting id), and task; the start and end time of the experiment; the useful and total time cost; tags; experimenters; the failure reason; and the experiment status. Each experiment is identified by its experiment id.

Method Table

Each Method Table is identified by its corresponding method name; each method is assigned its own table that saves its different parameter settings, such as method-specific hyper-parameters.

The only necessary column of the method table is the method setting id, which is set automatically; the other columns are defined by users.

Data Table

Each Data Table is identified by its corresponding data name; each dataset is assigned its own table that saves its different parameter settings, such as data-specific preprocessing parameters.

The only necessary column of the data table is the data setting id, which is set automatically; the other columns are defined by users.

Result Table

Each Result Table is identified by its corresponding task name; each task is assigned its own table that saves its experiment results, such as accuracy for classification or normalized mutual information for clustering.

Besides, this table offers several columns for saving images for later visualization.

The only necessary column of the result table is the experiment id; the other columns are defined by users.

Detail Table

Each Detail Table is identified by its corresponding method name; different methods are related to different detail tables. During an experiment, you may need to record some intermediate results, which can be saved in this table.

The only necessary columns of the detail table are the detail id (which can be set automatically) and the experiment id; the other columns are defined by users.
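Because all of these tables live in one local SQLite file, you can also inspect the records directly with the stdlib sqlite3 module. The sketch below uses an in-memory stand-in with assumed table and column names; in practice you would connect to pyerm's actual database file and check its real schema first:

```python
import sqlite3

# Stand-in for pyerm's database file; replace ":memory:" with the path
# to the actual .db file. Table/column names here are assumptions
# modeled on the table descriptions above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE experiment_list (experiment_id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO experiment_list VALUES (?, ?)",
    [(1, "over"), (2, "failed"), (3, "over")],
)

# List every table in the database (method, data, result, and detail
# tables would all show up here), then pull the finished experiments.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
finished = [r[0] for r in conn.execute(
    "SELECT experiment_id FROM experiment_list WHERE status='over'")]
print(tables, finished)  # ['experiment_list'] [1, 3]
conn.close()
```

Querying by experiment id this way is how the tables join up: the result and detail tables both carry the experiment id, which links them back to experiment_list.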

Future Plan

  • Some Scripts for Better Usage
  • Experiment Summary Report Generation
  • Web UI Visualization & Commonly Used Analysis Functions

Contact

My email is yx_shao@qq.com. If you have any questions or advice, please contact me.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pyerm-0.1.5.tar.gz (9.7 kB)

Uploaded Source

Built Distribution

pyerm-0.1.5-py3-none-any.whl (12.5 kB)

Uploaded Python 3

File details

Details for the file pyerm-0.1.5.tar.gz.

File metadata

  • Download URL: pyerm-0.1.5.tar.gz
  • Upload date:
  • Size: 9.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.2

File hashes

Hashes for pyerm-0.1.5.tar.gz

  • SHA256: 65f907844ddf6437be9c8c00b67565ac912ba9b1c41a931c1fc88ca19f5d0a4b
  • MD5: b6fa81c599dd1ff91a70d4938d65b8ce
  • BLAKE2b-256: 5c88486535d089152e9cd67aeae3eabfed57111f6fe9a4f241e0376a21167400


File details

Details for the file pyerm-0.1.5-py3-none-any.whl.

File metadata

  • Download URL: pyerm-0.1.5-py3-none-any.whl
  • Upload date:
  • Size: 12.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.2

File hashes

Hashes for pyerm-0.1.5-py3-none-any.whl

  • SHA256: 5e85c73c2c25b1024e52f0afbea6aba4405c19491e63f77f749034feb3e3efd4
  • MD5: a40ef82f87aa3f419088978ad62a6044
  • BLAKE2b-256: 98e95a056feaca12fdfce83fcb1e419e8e0f719fa155e2dd3eb54c7301ee4cee

