
Multi-process-safe log file handler with both time- and size-based rotation; benchmarks roughly 100x faster than concurrent_log_handler.

Project description

nb_log_file_handler


nb_log_file_handler is a multi-process-safe FileHandler that rotates by both time and size, and its performance far exceeds concurrent_log_handler.ConcurrentRotatingFileHandler.

Installation

pip install nb_log_file_handler

How nb_log_file_handler works

On Windows, nb_log_file_handler buffers log records and writes them to the file in batches every 0.1 seconds; an atexit hook flushes any records still buffered when the program is about to exit. On Linux, file I/O is fast to begin with, and forked child processes do not run atexit hooks, so each record is written to the file immediately.
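The batching strategy described above can be sketched as follows. This is a minimal illustration, not the actual nb_log_file_handler source: the class name `BufferedFileHandler`, the attribute names, and the demo file name are all invented for this sketch.

```python
import atexit
import logging
import threading
import time

class BufferedFileHandler(logging.Handler):
    """Buffers records in memory and appends them to the file in batches."""
    def __init__(self, filename, flush_interval=0.1):
        super().__init__()
        self._filename = filename
        self._interval = flush_interval
        self._buffer = []
        self._buf_lock = threading.Lock()
        # background thread flushes the accumulated batch periodically
        threading.Thread(target=self._flush_loop, daemon=True).start()
        # drain whatever is still buffered when the program exits
        atexit.register(self.flush)

    def emit(self, record):
        with self._buf_lock:
            self._buffer.append(self.format(record) + '\n')

    def _flush_loop(self):
        while True:
            time.sleep(self._interval)
            self.flush()

    def flush(self):
        with self._buf_lock:
            batch, self._buffer = self._buffer, []
        if batch:
            # one OS-level append per batch instead of one write per record
            with open(self._filename, 'a') as f:
                f.writelines(batch)

logger = logging.getLogger('buffered_demo')
logger.addHandler(BufferedFileHandler('buffered_demo.log'))
logger.setLevel(logging.DEBUG)
for i in range(100):
    logger.info('message %d', i)
logger.handlers[0].flush()  # force the final batch out for the demo
```

The win is that 100 records cost a handful of `open`/`writelines` calls rather than 100 separate writes, which matters on Windows where per-write file I/O is comparatively expensive.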

nb_log_file_handler's performance far exceeds concurrent_log_handler.ConcurrentRotatingFileHandler, as the benchmarks below show.

1. Using nb_log_file_handler

Usage is the same as the standard FileHandler; just import NbLogFileHandler:

import multiprocessing
import logging
import time
from nb_log_file_handler import NbLogFileHandler

logger = logging.getLogger('hello')

fh = NbLogFileHandler(file_name='xx3.log',log_path='./',max_bytes=1000*1000,back_count=3)

logger.addHandler(fh)
# logger.addHandler(logging.StreamHandler())
logger.setLevel(logging.DEBUG)



def f():
    for i in range(10000):
        logger.info(f'{i}aaaaa'*20)

if __name__ == '__main__':
    ps = []
    for j in range(10):
        p = multiprocessing.Process(target=f)
        ps.append(p)
        p.start()
    for p in ps:
        p.join()

2. Comparing file handlers that rotate by size/time

To reliably trigger multi-process size-based rotation in the tests, every handler below sets maxBytes to 1000*1000 bytes, i.e. rotation at 1 MB.

2.1 The built-in logging.handlers.RotatingFileHandler

logging.handlers.RotatingFileHandler is completely unusable for multi-process size-based rotation: it errors constantly during rollover.

import multiprocessing
import logging.handlers
import time

logger = logging.getLogger('hello')

fh = logging.handlers.RotatingFileHandler('xx4.log',maxBytes=1000*1000,backupCount=3)

logger.addHandler(fh)
# logger.addHandler(logging.StreamHandler())
logger.setLevel(logging.DEBUG)



def f():
    for i in range(10000):
        logger.info(f'{i}aaaaa'*20)

if __name__ == '__main__':
    t1 = time.time()
    ps = []
    for j in range(10):
        p = multiprocessing.Process(target=f)
        ps.append(p)
        p.start()
    for p in ps:
        p.join()
    print(time.time()-t1)

Running this code with the built-in logging.handlers.RotatingFileHandler produces a flood of errors: when process A renames the log file during rollover, process B knows nothing about it. The error looks like this:

PermissionError: [WinError 32] The process cannot access the file because it is being used by another process.: 'D:\\codes\\nb_log_file_handler\\tests_nb_log_file_handler\\xx4.log' -> 'D:\\codes\\nb_log_file_handler\\tests_nb_log_file_handler\\xx4.log.1'

So when multiple processes write to the same log file with rotation enabled, logging's built-in RotatingFileHandler cannot be used; a third-party file handler is required.
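The root of the race can be shown in a few lines. This sketch simulates the two processes with two file handles in one process; the file names are invented. On Windows, the rename itself fails with the WinError 32 shown above because "process B" still holds the file open; on POSIX the rename silently succeeds, which causes a different problem, demonstrated here:

```python
import os
import tempfile

d = tempfile.mkdtemp()
live = os.path.join(d, 'xx.log')

f_b = open(live, 'a')          # "process B" holds the log file open
f_b.write('B before rollover\n')
f_b.flush()

# "process A" rolls over: rename xx.log -> xx.log.1, reopen a fresh xx.log.
# This rename is the call that raises WinError 32 on Windows.
os.replace(live, live + '.1')
f_a = open(live, 'a')

# B is unaware of the rollover and keeps writing through its old handle,
# so its later record lands in the rotated backup, not the live log file:
f_b.write('B after rollover\n')
f_b.flush()
f_b.close()
f_a.close()

print(open(live + '.1').read())   # both of B's lines are in xx.log.1
print(os.path.getsize(live))     # the fresh xx.log that A opened: 0 bytes
```

Either way, rename-based rollover is only safe when every writing process coordinates around it, which is exactly what the third-party handlers add.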

2.2 The well-known multi-process-safe third-party package concurrent_log_handler

concurrent_log_handler.ConcurrentRotatingFileHandler

import multiprocessing
import logging
import time
from concurrent_log_handler import ConcurrentRotatingFileHandler

logger = logging.getLogger('hello')

fh = ConcurrentRotatingFileHandler('xx2.log',maxBytes=1000*1000,backupCount=3)

logger.addHandler(fh)
# logger.addHandler(logging.StreamHandler())
logger.setLevel(logging.DEBUG)



def f():
    for i in range(10000):
        logger.info(f'{i}aaaaa'*20)

if __name__ == '__main__':
    t1 = time.time()
    ps = []
    for j in range(10):
        p = multiprocessing.Process(target=f)
        ps.append(p)
        p.start()
    for p in ps:
        p.join()
    print(time.time()-t1)

On Windows, concurrent_log_handler's performance is unbearable: 10 processes writing 10,000 records each takes 263 seconds. On Linux its performance is acceptable.

2.3 nb_log_file_handler.NbLogFileHandler: multi-process-safe rotation by both time and size, far faster than concurrent_log_handler

import multiprocessing
import logging
import time
from nb_log_file_handler import NbLogFileHandler

logger = logging.getLogger('hello')

fh = NbLogFileHandler(file_name='xx3.log',log_path='./',max_bytes=1000*1000,back_count=3)

logger.addHandler(fh)
# logger.addHandler(logging.StreamHandler())
logger.setLevel(logging.DEBUG)



def f():
    for i in range(10000):
        logger.info(f'{i}aaaaa'*20)

if __name__ == '__main__':
    t1 = time.time()
    ps = []
    for j in range(10):
        p = multiprocessing.Process(target=f)
        ps.append(p)
        p.start()
    for p in ps:
        p.join()
    print(time.time()-t1)

nb_log_file_handler.NbLogFileHandler needs only 1.3 seconds for 10 processes writing 10,000 records each, massively outperforming the third-party concurrent_log_handler package (263 s vs 1.3 s, roughly 200x on this Windows benchmark).
