# paramflow
A parameter and configuration management library motivated by training machine learning models
and managing configuration for applications that require profiles and layered parameters.
paramflow is designed for flexibility and ease of use, enabling seamless parameter merging
from multiple sources. It also auto-generates a command-line argument parser and allows for
easy parameter overrides.
## Features
- Layered configuration: Merge parameters from files, environment variables, and command-line arguments.
- Immutable dictionary: Provides a read-only dictionary with attribute-style access.
- Profile support: Manage multiple sets of parameters. Layer the chosen profile on top of the default profile.
- Layered meta-parameters: paramflow loads its own configuration using the same layered approach.
- Type conversion: convert types during merging, using target parameters as a reference for the conversions.
- Argument-parser generation: use parameters defined in files as a reference for generating an `argparse` parser.
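The immutable, attribute-access dictionary from the feature list can be pictured with a minimal sketch. This is illustrative only; paramflow's actual class may differ:

```python
# Minimal sketch of a read-only dict with attribute-style access
# (illustrative; not paramflow's actual implementation).
class FrozenAttrDict:
    def __init__(self, data):
        # Copy so later mutations of the source dict don't leak in.
        object.__setattr__(self, '_data', dict(data))

    def __getattr__(self, name):
        try:
            return self._data[name]
        except KeyError:
            raise AttributeError(name) from None

    def __getitem__(self, key):
        return self._data[key]

    def __setattr__(self, name, value):
        raise TypeError('parameters are read-only')

params = FrozenAttrDict({'lr': 0.00025, 'batch_size': 32})
print(params.lr)  # attribute-style access -> 0.00025
```

Any attempt to assign, e.g. `params.lr = 0.1`, raises `TypeError`, which is the point of keeping merged parameters immutable.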
## Usage

Install:

```shell
pip install paramflow
```

Install with `.env` support:

```shell
pip install paramflow[dotenv]
```

```python
import paramflow as pf

params = pf.load(source='dqn_params.toml')
print(params.lr)
```
## Meta-parameter Layering
Meta-parameter layering controls how paramflow.load reads its own configuration.
Layering order:

1. `paramflow.load` arguments.
2. Environment variables (default prefix `P_`).
3. Command-line arguments (via `argparse`).
Activate a profile using command-line arguments:

```shell
python print_params.py --profile dqn-adam
```

Activate a profile using an environment variable:

```shell
P_PROFILE=dqn-adam python print_params.py
```
## Parameter Layering
Parameter layering merges parameters from multiple sources.
Layering order:

1. Configuration files (`.toml`, `.yaml`, `.ini`, `.json`).
2. `.env` file.
3. Environment variables (default prefix `P_`).
4. Command-line arguments (via `argparse`).
The layering order can be customized via the `source` argument to `paramflow.load`:

```python
params = pf.load(source=['params.toml', 'env', '.env', 'args'])
```
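Command-line and environment values arrive as strings; the type-conversion feature uses the file-defined value's type as the target. A minimal sketch of that idea (illustrative, not paramflow's code):

```python
def convert_like(reference, raw):
    # Cast a string override to the type of the reference (file) value.
    # bool must be checked before int, since bool is a subclass of int.
    if isinstance(reference, bool):
        return raw.lower() in ('1', 'true', 'yes')
    return type(reference)(raw)

print(convert_like(0.00025, '0.0002'))  # 0.0002 (float)
print(convert_like(32, '64'))           # 64 (int)
print(convert_like(True, 'false'))      # False
```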
Overwrite a parameter value from the command line:

```shell
python print_params.py --profile dqn-adam --lr 0.0002
```
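Conceptually, each later source in the layering order overrides keys from earlier ones. A simplified sketch of that merge (not paramflow's internals):

```python
# Simplified view of layered merging: later layers win, key by key.
def merge_layers(*layers):
    merged = {}
    for layer in layers:
        merged.update(layer)
    return merged

file_params = {'lr': 0.00025, 'batch_size': 32}   # from params.toml
env_params  = {'lr': 0.0001}                      # e.g. P_LR=0.0001
cli_params  = {'lr': 0.0002}                      # e.g. --lr 0.0002

params = merge_layers(file_params, env_params, cli_params)
print(params)  # {'lr': 0.0002, 'batch_size': 32}
```

Here the command-line value wins for `lr`, while `batch_size` falls through from the file layer.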
## ML hyper-parameter profiles
`params.toml`:

```toml
[default]
learning_rate = 0.00025
batch_size = 32
optimizer_class = 'torch.optim.RMSprop'
optimizer_kwargs = { momentum = 0.95 }
random_seed = 13

[adam]
learning_rate = 1e-4
optimizer_class = 'torch.optim.Adam'
optimizer_kwargs = {}
```
Activating the `adam` profile:

```shell
python app.py --profile adam
```

overrides the default learning rate with `1e-4`, the default optimizer class with `torch.optim.Adam`, and the default optimizer arguments with an empty dict.
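The effect of that merge can be shown with plain dicts (a sketch of the result, not of paramflow's code):

```python
# Activating the 'adam' profile: its keys shadow [default]'s keys.
default = {
    'learning_rate': 0.00025,
    'batch_size': 32,
    'optimizer_class': 'torch.optim.RMSprop',
    'optimizer_kwargs': {'momentum': 0.95},
    'random_seed': 13,
}
adam = {
    'learning_rate': 1e-4,
    'optimizer_class': 'torch.optim.Adam',
    'optimizer_kwargs': {},
}
active = {**default, **adam}
print(active['learning_rate'])   # 0.0001
print(active['batch_size'])      # 32 (inherited from default)
```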
## Development stage profiles
Profiles can be used to manage software development stages.
`params.toml`:

```toml
[default]
debug = true
database_url = "mysql://user:pass@localhost:3306/myapp"

[dev]
database_url = "mysql://user:pass@dev.app.example.com:3306/myapp"

[prod]
debug = false
database_url = "mysql://user:pass@app.example.com:3306/myapp"
```
Activate the `dev` profile:

```shell
export P_PROFILE=dev
python app.py
```
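The environment-driven selection above amounts to reading a profile name from the environment and layering that profile over `[default]`. A hedged sketch of the pattern (the `P_PROFILE` variable matches paramflow's default prefix; the function is illustrative, not the library's API):

```python
import os

# Stage profiles as plain dicts; [default] is always the base layer.
PROFILES = {
    'default': {'debug': True,
                'database_url': 'mysql://user:pass@localhost:3306/myapp'},
    'dev':     {'database_url': 'mysql://user:pass@dev.app.example.com:3306/myapp'},
    'prod':    {'debug': False,
                'database_url': 'mysql://user:pass@app.example.com:3306/myapp'},
}

def resolve_profile(profiles, env=None):
    # Read the profile name from the environment, then layer the
    # chosen profile over the default profile.
    if env is None:
        env = os.environ
    name = env.get('P_PROFILE', 'default')
    return {**profiles['default'], **profiles.get(name, {})}

print(resolve_profile(PROFILES, {'P_PROFILE': 'dev'})['database_url'])
```

Note that `dev` only overrides `database_url`, so `debug` stays `true` there, while `prod` overrides both keys.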