
misc tools for configs, logs


hao

Utilities for configurations, logging and more.

install

pip install hao

precondition

The folder that contains any of the following files (searched in this order) is treated as the project root path.

  • requirements.txt
  • VERSION
  • conf
  • setup.py
  • .idea
  • .git

If your project structure does NOT conform to this, it will not work as expected.
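
For illustration, here is a minimal sketch of how such root detection could work. The names ROOT_MARKERS and find_project_root are hypothetical and are not hao's actual implementation:

import os

# hypothetical marker list, mirroring the files documented above
ROOT_MARKERS = ('requirements.txt', 'VERSION', 'conf', 'setup.py', '.idea', '.git')

def find_project_root(start: str = '.') -> str:
    # walk upwards from `start` until a folder containing any marker is found
    path = os.path.abspath(start)
    while True:
        if any(os.path.exists(os.path.join(path, marker)) for marker in ROOT_MARKERS):
            return path
        parent = os.path.dirname(path)
        if parent == path:          # reached the filesystem root without a match
            raise FileNotFoundError('project root not found')
        path = parent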

features

config

It tries to load a YAML config file from the conf folder:

.                               # project root
├── conf
│   ├── config-{env}.yml        # if `export env=abc`, loads this file; raises an error if not found
│   ├── config-{hostname}.yml   # otherwise tries this file, then falls back to the default `config.yml`
│   └── config.yml              # the default config file that should always exist
├── requirements.txt            # every project should have this file
├── VERSION                     # hao.versions.get_version() will try to read this file
├── .git

Config files are resolved in the following order:

env = os.environ.get("env")
if env is not None:
    try_to_load(f'config-{env}.yml', fallback='config.yml')                   # echo $env
else:
    try_to_load(f'config-{socket.gethostname()}.yml', fallback='config.yml')  # echo hostname

Say you have the following content in your config file:

# config.yml
es:
  default:
    host: 172.23.3.3
    port: 9200
    indices:
      - news
      - papers

Then get the configured values in your code:

import hao
es_host = hao.config.get('es.default.host')          # str
es_port = hao.config.get('es.default.port')          # int
indices = hao.config.get('es.default.indices')       # list
...
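
Conceptually this is a dotted-key lookup over the parsed YAML. The helper below is a hypothetical sketch of that idea, not hao's actual code:

import yaml

def get_by_path(data: dict, key: str, default=None):
    # resolve a dotted key like 'es.default.host' against a nested dict
    node = data
    for part in key.split('.'):
        if not isinstance(node, dict) or part not in node:
            return default
        node = node[part]
    return node

with open('conf/config.yml') as f:
    conf = yaml.safe_load(f)

assert get_by_path(conf, 'es.default.port') == 9200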

logs

Set per-logger levels to filter log output.

e.g.

# config.yml
logging:
  __main__: DEBUG
  transformers: WARNING
  lightning: INFO
  pytorch_lightning: INFO
  elasticsearch: WARNING
  tests: DEBUG
  root: INFO                        # root level
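
These entries map onto the standard logging module: each key names a logger whose level gets set. A sketch under that assumption (not hao's exact code):

import logging

# levels as they would come out of the `logging` section of config.yml
levels = {
    '__main__': 'DEBUG',
    'transformers': 'WARNING',
    'root': 'INFO',
}

for name, level in levels.items():
    logger = logging.getLogger(None if name == 'root' else name)  # None -> root logger
    logger.setLevel(getattr(logging, level))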

Settings for the logger:

# config.yml
logger:
  format: "%(asctime)s %(levelname)-7s %(name)s:%(lineno)-4d - %(message)s"  # this is the built-in format
  file:                         # using time-based-rotating file logger
    dir: ~/.logs/spanner/       # log parent folder
    enabled: false              # depends on `logger.file.dir`
    rotate:
      count: 3                  # keep n rotate log files
      when: d                   # rotate log files every `d` (day)
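
The rotate settings correspond to the arguments of logging.handlers.TimedRotatingFileHandler from the standard library. A sketch of how they could be wired up (file name and paths here are illustrative):

import logging
import logging.handlers
import os

log_dir = os.path.expanduser('~/.logs/spanner/')
os.makedirs(log_dir, exist_ok=True)

handler = logging.handlers.TimedRotatingFileHandler(
    filename=os.path.join(log_dir, 'app.log'),  # hypothetical log file name
    when='d',                                   # rotate every day
    backupCount=3,                              # keep 3 rotated files
)
handler.setFormatter(logging.Formatter(
    '%(asctime)s %(levelname)-7s %(name)s:%(lineno)-4d - %(message)s'
))
logging.getLogger().addHandler(handler)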

Declare and use the logger:

import hao
LOGGER = hao.logs.get_logger(__name__)

LOGGER.debug('message')
LOGGER.info('message')
LOGGER.warning('message')
LOGGER.error('message')
LOGGER.exception(err)

namespaces

import hao
from hao.namespaces import from_args, attr

@from_args
class ProcessConf(object):
    file_in = attr(str, required=True, help="file path to process")
    file_out = attr(str, required=True, help="file path to save")
    tokenizer = attr(str, required=True, choices=('wordpiece', 'bpe'))


from argparse import Namespace
from pytorch_lightning import Trainer
@from_args(adds=Trainer.add_argparse_args)
class TrainConf(Namespace):
    root_path_checkpoints = attr(str, default=hao.paths.get_path('data/checkpoints/'))
    dataset_train = attr(str, default='train.txt')
    dataset_val = attr(str, default='val.txt')
    dataset_test = attr(str, default='test.txt')
    batch_size = attr(int, default=128, key='train.batch_size')                          # key means try to load from config.yml by the key
    task = attr(str, choices=('ner', 'nmt'), default='ner')
    seed = attr(int)
    epochs = attr(int, default=5)

Where attr is a wrapper around argparse's add_argument().

Usage 1: override the default value from the command line

python -m your_module --task=nmt

Usage 2: override the default value from the constructor

train_conf = TrainConf(task='nmt')

Value lookup order:

  • command line
  • constructor
  • config yml if key specified in attr
  • default if specified in attr
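
A sketch of that precedence for a single attribute. The resolve helper below is hypothetical, just to make the lookup order explicit:

def resolve(name, cli_args, ctor_kwargs, config, key=None, default=None):
    # resolve an attr value: command line > constructor > config.yml key > default
    if cli_args.get(name) is not None:
        return cli_args[name]
    if name in ctor_kwargs:
        return ctor_kwargs[name]
    if key is not None and config.get(key) is not None:
        return config.get(key)
    return default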
