Logger for multiprocessing applications
When using the multiprocessing module, logging becomes less useful: sub-processes should log to individual files/streams, or there is the risk of records becoming garbled when several processes write to the same handler.
This simple module implements a Handler that, when set on the root Logger, tunnels the records to the main process so that they are handled correctly there.
It is currently tested on Linux with Python 2.7 and 3.6+. PyPy3 used to hang on the tests, so it was not recommended, but PyPy appears to be working recently. Only POSIX systems are supported, and only Linux is tested; it does not work on Windows.
This library was originally taken verbatim from a StackOverflow post and extracted into a module so that the code wouldn't have to be copied into every project. Since then, several improvements have been contributed.
Before you start logging, but after you configure the logging framework (maybe with logging.basicConfig(...)), do the following:

```python
import multiprocessing_logging

multiprocessing_logging.install_mp_handler()
```

and that's it.
When using a Pool, make sure install_mp_handler is called before the Pool is instantiated, for example:

```python
import logging
from multiprocessing import Pool

from multiprocessing_logging import install_mp_handler

logging.basicConfig(...)
install_mp_handler()
pool = Pool(...)
```
A known consequence of this approach is that there is a low probability of the application hanging when creating new processes. As a palliative, don't continuously create new processes; instead, create a Pool once and reuse it.