liblogging

Utilities for logging and sending logs.

pip install liblogging

🌟Features

Unified log format

Standardizes the logging format for the current agent; you can also extend the default format yourself. The fields currently recorded and their keys are:

{
    "create_time": "timestamp, matching the MySQL datetime(3) column format by default",
    "level": "like INFO, ERROR, WARNING",
    # trace_id is stored in a context variable
    "trace_id": "trace_id for tracing the call chain across services",
    "line_info": "{record.filename}:{record.lineno}:{record.funcName}",
    "message": message,
    # a context variable distinguishes sources, making it easy to collect from different services, e.g. Chat, Welcome, Planning
    "message_source": context.get("message_source", "chat_log"),
    # controls the log type for easier filtering, e.g. tool, llm, turn
    "message_type": message_type,
    # add other fields as needed
    **extra_message
}

All of the above log information is recorded as a JSON string, which makes it easy to store and process downstream.
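A minimal sketch of how such a record might be assembled into a JSON string (the `build_record` helper and the hard-coded `line_info` value are illustrative, not the library's API; field names follow the table above):

```python
import json
from datetime import datetime

def build_record(level, message, message_type, trace_id,
                 message_source="chat_log", **extra_message):
    """Assemble a unified log record as a JSON string (illustrative only)."""
    record = {
        # millisecond precision, matching a MySQL datetime(3) column
        "create_time": datetime.now().strftime("%Y-%m-%d %H:%M:%S.%f")[:-3],
        "level": level,
        "trace_id": trace_id,
        "line_info": "example.py:10:main",  # normally filled from the log record
        "message": message,
        "message_source": message_source,
        "message_type": message_type,
        **extra_message,  # caller-supplied extra fields
    }
    return json.dumps(record, ensure_ascii=False)

line = build_record("INFO", "hello", "llm", trace_id="abc-123", user_id=42)
print(line)
```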

Configure context variables once instead of passing them explicitly on every call

A decorator specifies which global context variables to configure; this only needs to be done once at the entry point of the whole program/service.

Note that the configured global context variables are updated by matching the parameter names of the decorated function, so defining the function's parameters with a BaseModel is recommended.

Main program/service: service1.py
from pydantic import BaseModel

from liblogging import log_request, logger


class Request(BaseModel):
    name: str
    trace_id: str

# Configuring trace_id as a global context variable at the entry point assigns it from the matching
# function argument; later logger.info calls elsewhere in this service read the variable and record it.
# Default values are also supported: message_source defaults to "demo" here, so subsequent
# logger calls record message_source as "demo".
@log_request("trace_id", message_source="demo")
def your_service_entry(request: Request):
    logger.info("Processing request")
Other modules in the same service (e.g. function1.py) can call logger.info() directly; trace_id and message_source are both recorded:
from liblogging import logger

def test(name):
    logger.info(f"Testing {name}")
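The decorator pattern above can be sketched with `contextvars` — a simplified stand-in for `log_request`, not the library's actual implementation:

```python
import contextvars
import functools

# context variable read by downstream logging calls (simplified stand-in)
_log_context = contextvars.ContextVar("log_context", default={})

def log_request(*fields, **defaults):
    """Copy named entry-point arguments (plus defaults) into the logging context."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            ctx = dict(defaults)
            # match requested fields against attributes of the call's arguments
            for arg in list(args) + list(kwargs.values()):
                for field in fields:
                    if hasattr(arg, field):
                        ctx[field] = getattr(arg, field)
            _log_context.set(ctx)
            return fn(*args, **kwargs)
        return wrapper
    return decorator

class Request:
    def __init__(self, name, trace_id):
        self.name = name
        self.trace_id = trace_id

@log_request("trace_id", message_source="demo")
def your_service_entry(request):
    # any later logger call could read _log_context.get() and attach these fields
    return _log_context.get()

print(your_service_entry(Request("n1", "t-001")))
# {'message_source': 'demo', 'trace_id': 't-001'}
```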

Redirect and send to a message queue

Using the built-in Kafka integration as an example, logs recorded in the unified format above can be forwarded to Kafka.

Kafka config file format:

{
    "{cluster_name}": {
        "{env_name}": {
            "bootstrap_servers": "server1, server2, server3",
            "username": "username",
            "password": "******",
            "topic": "your topic",
            "...": "..."
        }
    }
}
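A sketch of how a config in this shape could be loaded and resolved to one cluster/environment block (the `load_kafka_conf` helper and the `my_cluster` names are illustrative, not the library's API; the `CHAT_ENV` fallback matches the note below):

```python
import json
import os
import tempfile

def load_kafka_conf(path, cluster, env=None):
    """Return the config block for a cluster/env; env falls back to CHAT_ENV (default "dev")."""
    env = env or os.environ.get("CHAT_ENV", "dev")
    with open(path) as f:
        conf = json.load(f)
    return conf[cluster][env]

# write a sample config and read it back
sample = {"my_cluster": {"dev": {"bootstrap_servers": "s1,s2", "topic": "logs"}}}
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(sample, f)
    path = f.name

print(load_kafka_conf(path, "my_cluster", env="dev")["topic"])  # logs
```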

Usage:

python service 2>&1 | tee {log_file_path} | liblogging_collector --config-path {your_kafka_path} --ssl-cafile {your_ssl_cafile_path} --send-kafka

tee {log_file_path} additionally writes your program's output (stdout + stderr) to a file (optional).

See log_collector.py for the source code of liblogging_collector.

If env_name is not specified, it defaults to os.environ.get("CHAT_ENV", "dev").
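Since the piped stream mixes unified-format records with plain text (e.g. tracebacks), the collector has to pick out the JSON lines before forwarding them. A minimal filtering sketch, with the Kafka send replaced by a returned list (the `collect` helper is illustrative, not the collector's actual code):

```python
import json

def collect(lines):
    """Keep only lines that parse as unified-format JSON records (illustrative)."""
    records = []
    for line in lines:
        try:
            rec = json.loads(line)
        except json.JSONDecodeError:
            continue  # plain-text noise is skipped, not sent
        if isinstance(rec, dict) and "message" in rec:
            records.append(rec)  # a real collector would produce to Kafka here
    return records

mixed = [
    '{"level": "INFO", "message": "ok", "trace_id": "t-1"}',
    "Traceback (most recent call last):",  # non-JSON noise
]
print(len(collect(mixed)))  # 1
```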

📋Example

For examples of adding extra record fields, and of using the library together with libentry, see example

💡Tips

  1. If using Kafka to send messages, install with pip install "liblogging[collector]".
