liblogging

Utilities for logging and sending logs.

pip install liblogging

🌟Feature

Unified log format

Unifies the logging format used by the current agent; you can also extend the default format to your needs. The recorded fields and their keys are as follows:

{
    "create_time": "timestamp, consistent with the MySQL column type datetime(3) by default",
    "level": "like INFO, ERROR, WARNING",
    # trace_id is kept in a context variable
    "trace_id": "trace_id for tracing the call chain across services",
    "line_info": "{record.filename}:{record.lineno}:{record.funcName}",
    "message": message,
    # a context variable distinguishes message sources, making it easy to ingest
    # logs from different services, e.g. Chat, Welcome, Planning
    "message_source": context.get("message_source", "chat_log"),
    # controls the log type to make filtering easier, e.g. tool, llm, turn
    "message_type": message_type,
    # add other fields here as needed
    **extra_message
}

All of the above log information is recorded as a JSON string, which makes storage and downstream processing straightforward.
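As an illustration, a single record in this format serializes to one JSON line. The field values below are hypothetical, not produced by liblogging itself:

```python
import json
from datetime import datetime, timezone

# A hypothetical record in the unified format, serialized as one JSON line.
record = {
    "create_time": datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S.%f")[:-3],
    "level": "INFO",
    "trace_id": "9f1c2a",
    "line_info": "service1.py:12:your_service_entry",
    "message": "Processing request",
    "message_source": "chat_log",
    "message_type": "turn",
}
print(json.dumps(record, ensure_ascii=False))
```

One record per line keeps the log file greppable and easy to bulk-load into a database.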

Configure context variables once, without passing them explicitly on every call

A decorator specifies which global context variables to configure; you only need to apply it once, at the entry point of your program/service.

Note that the configured global context variables are updated by matching the parameter names of the decorated function, so defining function parameters with a BaseModel is recommended.

Main program/service: service1.py
from pydantic import BaseModel

from liblogging import log_request, logger


class Request(BaseModel):
    name: str
    trace_id: str

# Configuring trace_id as a global context variable at the service entry assigns it
# from the matching function argument; later logger.info calls elsewhere in this
# service read the variable and record it automatically.
# Default values are also supported: message_source is set to "demo" here, so
# subsequent logger calls record message_source as "demo".
@log_request("trace_id", message_source="demo")
def your_service_entry(request: Request):
    logger.info("Processing request")
Other modules in this service, e.g. function1.py, can call logger.info() directly; trace_id and message_source are recorded automatically:
from liblogging import logger

def test(name):
    logger.info(f"Testing {name}")

Redirect and send logs to a message queue

Taking the built-in Kafka integration as an example, logs recorded in the unified format above can be forwarded to Kafka.

Kafka config file format:

{
    "{cluster_name}": {
        "{env_name}": {
            "bootstrap_servers": "server1, server2, server3",
            "username": "username",
            "password": "******",
            "topic": "your topic",
            "...": "..."
        }
    }
}

Usage:

python service 2>&1 | tee {log_file_path} | liblogging_collector --config-path {your_kafka_path}  --ssl-cafile {your_ssl_cafile_path} --send-kafka

tee {log_file_path} redirects your program's output (stdout + stderr) to a file (optional).

See log_collector.py for the source code of liblogging_collector.

If env_name is not specified, it defaults to os.environ.get("CHAT_ENV", "dev").

📋Example

See example for adding extra record fields and for usage together with libentry.

💡Tips

  1. If using Kafka to send messages, please use pip install liblogging[collector].
  2. For data persistence, it is recommended to write all log messages into the message column; maintaining a single column saves storage. Record anything you need to query later as a dict, e.g. logger.info({"key": "value"}), so it stays easy to look up.
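Following tip 2, logs persisted as JSON lines with dict-shaped messages can later be filtered with a few lines of plain Python (a generic sketch, not part of liblogging):

```python
import json

# Two hypothetical persisted log lines in the unified format.
log_lines = [
    '{"level": "INFO", "message": {"key": "value"}, "message_type": "tool"}',
    '{"level": "ERROR", "message": "boom", "message_type": "llm"}',
]

# Keep only records of a given message_type; dict-shaped messages stay queryable.
tool_logs = [rec for rec in map(json.loads, log_lines)
             if rec["message_type"] == "tool"]
print(tool_logs[0]["message"]["key"])  # value
```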
