dlogger by drawiks


📝 dlogger

Python 3.8+ · MIT license

dlogger — simple logger for personal projects

(─‿‿─)

     ____    __
    / __ \  / /   ____   ____ _ ____ _ ___   _____
   / / / / / /   / __ \ / __ `// __ `// _ \ / ___/
  / /_/ / / /___/ /_/ // /_/ // /_/ //  __// /
 /_____/ /_____/\____/ \__, / \__, / \___//_/
                      /____/ /____/

📦 installation

pip install dlogger-drawiks

📑 quick start

from dlogger import logger

logger.info("hello, world!")
logger.error("something went wrong")

with configuration:

from dlogger import logger

logger.configure(
    level="INFO",
    log_file="app.log",
    rotation="10MB",
    retention="7 days",
    compression=True
)

logger.debug("this won't be shown")
logger.info("but this will")

🧩 features

  • 🎨 TrueColor output — HEX/RGB support powered by dcolor
  • 🚀 high performance — buffered writes and call-context caching
  • 🧵 thread safety — stability in multithreaded applications thanks to locks
  • 💾 write guarantee — buffers are flushed automatically on normal program termination
  • 📁 smart rotation — by size (10MB, 1GB) or time (1 day, 12 hours)
  • 🗑️ auto cleanup — scheduled deletion of old files (retention="30 days")
  • 📦 compression — automatic archiving of old logs to .gz
  • 🛠️ minimal dependencies — only dcolor
  • 🛡️ reliability — protection against memory leaks, data loss and deadlocks
  • 🏗️ modular architecture — extensible via handlers, formatters and filters
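The write guarantee above is the kind of behavior usually built on an exit hook that flushes whatever is still buffered. A minimal sketch of that pattern (illustrative only — class and parameter names here are made up, not dlogger's internals):

```python
import atexit


class BufferedWriter:
    """Buffers log lines in memory and writes them to disk in batches."""

    def __init__(self, path, batch_size=100):
        self.path = path
        self.batch_size = batch_size
        self.buffer = []
        # flush anything still buffered when the interpreter exits normally
        atexit.register(self.flush)

    def write(self, line):
        self.buffer.append(line)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            with open(self.path, "a", encoding="utf-8") as f:
                f.write("\n".join(self.buffer) + "\n")
            self.buffer.clear()
```

batching amortizes the cost of file I/O, while the `atexit` hook covers the case where the program ends before the batch threshold is reached.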

📖 usage

log levels

logger.configure(level="INFO")  # TRACE, DEBUG, INFO, SUCCESS, WARNING, ERROR, CRITICAL

size-based rotation

logger.configure(
    log_file="app.log",
    rotation="10MB"  # or "500KB", "1GB"
)

once the file reaches 10MB, it is rotated to app.log.20260216_143022 and a new app.log is started
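Parsing a size spec like "10MB" and building the timestamped rotated name are straightforward; a rough sketch of how that could look (function names are hypothetical, not dlogger's API):

```python
import re
from datetime import datetime

_UNITS = {"KB": 1024, "MB": 1024**2, "GB": 1024**3}


def parse_size(spec: str) -> int:
    """Convert a '10MB'-style string into a byte count."""
    m = re.fullmatch(r"(\d+)\s*(KB|MB|GB)", spec.strip(), re.IGNORECASE)
    if not m:
        raise ValueError(f"bad size spec: {spec!r}")
    return int(m.group(1)) * _UNITS[m.group(2).upper()]


def rotated_name(path: str, now: datetime) -> str:
    """app.log -> app.log.20260216_143022 (timestamped suffix)."""
    return f"{path}.{now:%Y%m%d_%H%M%S}"
```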

time-based rotation

logger.configure(
    log_file="app.log",
    rotation="1 day"  # or "12 hours", "1 week"
)

log retention

logger.configure(
    log_file="app.log",
    retention="7 days"  # or "2 weeks", "1 month"
)

logs older than 7 days will be deleted automatically
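Retention cleanup like this typically scans the log directory and removes rotated files whose modification time is past the cutoff. A sketch of the general technique (this is an assumption about the approach, not dlogger's actual code):

```python
import os
import time


def cleanup(directory: str, prefix: str, max_age_days: float) -> list:
    """Delete rotated logs older than max_age_days; return removed names."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for entry in os.scandir(directory):
        if entry.name.startswith(prefix) and entry.is_file():
            if entry.stat().st_mtime < cutoff:
                os.remove(entry.path)
                removed.append(entry.name)
    return removed
```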

compression

logger.configure(
    log_file="app.log",
    rotation="10MB",
    compression=True  # old logs → .gz
)
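Compressing a rotated log to .gz is a one-pass copy through the stdlib gzip module; roughly something like this happens under the hood (a sketch, not dlogger's implementation):

```python
import gzip
import os
import shutil


def compress_log(path: str) -> str:
    """Gzip a rotated log and delete the original: app.log.20260216 -> app.log.20260216.gz."""
    gz_path = path + ".gz"
    with open(path, "rb") as src, gzip.open(gz_path, "wb") as dst:
        shutil.copyfileobj(src, dst)
    os.remove(path)
    return gz_path
```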

full configuration

logger.configure(
    level="INFO",               # minimum log level
    log_file="logs/app.log",    # path to log file
    show_path=True,             # show module:function: in each record
    rotation="10MB",            # size-based rotation
    retention="7 days",         # keep logs for 7 days
    compression=True,           # compress old logs
    time_format="%H:%M:%S"      # time format, e.g. 14:30:22
)

💡 examples

simple logging

from dlogger import logger

logger.info("server started on port 8000")
logger.warning("memory usage at 80%")
logger.error("failed to connect to database")

with file

from dlogger import logger

logger.configure(
    level="DEBUG",
    log_file="app.log"
)

logger.debug("starting request processing")
logger.info("request processed successfully")

for production

from dlogger import logger

logger.configure(
    level="INFO",
    log_file="logs/production.log",
    rotation="50MB",
    retention="30 days",
    compression=True,
    time_format="%Y-%m-%d %H:%M:%S"
)

logger.info("application started")
logger.error("critical error in payments module")

extensibility (handlers, formatters, filters)

from dlogger import dLogger, ConsoleHandler, FileHandler, LevelFilter

# create your own logger
my_logger = dLogger()

# add handlers
my_logger.add_handler(ConsoleHandler(level="DEBUG"))
my_logger.add_handler(FileHandler("app.log", rotation="10MB"))

# or use the default logger and add/remove handlers
from dlogger import logger
logger.remove_handler(logger.handlers[0])  # remove console handler
logger.add_handler(FileHandler("debug.log", level="DEBUG"))

multiple loggers (get_logger)

from dlogger import get_logger

# like logging.getLogger()
app = get_logger("myapp")
module = get_logger("myapp.module")

# child logger inherits handlers and level from parent
app.info("message from app")
module.info("message from module")

multiple loggers (dLogger)

from dlogger import dLogger

# independent loggers for different modules
app_logger = dLogger().configure(level="INFO", log_file="app.log")
db_logger = dLogger().configure(level="DEBUG", log_file="db.log")

app_logger.info("application started")
db_logger.debug("database query executed")

filters (KeywordFilter, ModuleFilter)

from dlogger import logger, KeywordFilter, ModuleFilter, FileHandler

# exclude passwords and tokens from logs
handler = FileHandler("app.log")
handler.add_filter(KeywordFilter(exclude=["password", "token", "secret"]))
logger.add_handler(handler)

# log only specific modules
handler2 = FileHandler("debug.log")
handler2.add_filter(ModuleFilter(modules=["database:", "api:"]))
logger.add_handler(handler2)

exception logging

from dlogger import logger

# automatic - uses sys.exc_info()
try:
    result = 1 / 0
except Exception:
    logger.exception("division by zero")

# with explicit exception
try:
    result = 1 / 0
except ZeroDivisionError as e:
    logger.exception("error", exc=e)

custom context

from dlogger import logger

# pass custom context
logger.info("message", context="my.module:function:")

# for external library integration
logger.debug("debug from library", context="library.module:handler:")

🖥️ uvicorn integration

quick way

from dlogger import logger, uvicorn_config
from uvicorn.config import Config
from uvicorn.server import Server

config = Config(
    "app:app",
    host="0.0.0.0",
    port=8000,
    log_config=uvicorn_config(logger)
)
server = Server(config=config)

from config file

Create dlogger.conf:

[loggers]
keys=root,repos,routers,utils

[logger_root]
level=DEBUG
log_file=app.log

[logger_repos]
level=INFO
log_file=repos.log
rotation=10MB
retention=7 days

[logger_routers]
level=WARNING

Then:

from dlogger import load
from uvicorn.config import Config
from uvicorn.server import Server

config = Config("app:app", log_config=load())
server = Server(config=config)

supported parameters in dlogger.conf:

  • level - log level
  • log_file - path to log file
  • rotation - rotation (10MB, 1GB, 1 day, 12 hours)
  • retention - retention (7 days, 1 month)
  • compression - compression (true/false)
  • time_format - time format
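Since dlogger.conf follows INI syntax, the stdlib configparser can read it directly; a rough sketch of what loading such a file could look like (the function here is hypothetical, not dlogger's `load()`):

```python
import configparser


def read_dlogger_conf(path="dlogger.conf"):
    """Parse a dlogger.conf-style INI file into {logger_name: options}."""
    cp = configparser.ConfigParser()
    cp.read(path)
    names = [n.strip() for n in cp.get("loggers", "keys").split(",")]
    return {
        name: dict(cp[f"logger_{name}"])
        for name in names
        if cp.has_section(f"logger_{name}")
    }
```

note that a name listed under `keys` with no matching `[logger_...]` section is simply skipped here.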

📝 log format

console:

2026-02-17 14:09:13 | INFO     | src.bot:run: - init

file:

2026-02-17 14:09:13 | INFO     | src.main:run: init
2026-02-17 14:09:13 | ERROR    | src.main:run: error
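The file format above is regular enough to parse with a single regex, which is handy for log analysis scripts. A sketch for the file layout shown (assumes exactly the `time | LEVEL | context message` shape above; the console variant with its extra dash would need a small tweak):

```python
import re

# matches lines like: "2026-02-17 14:09:13 | INFO     | src.main:run: init"
LINE_RE = re.compile(
    r"(?P<time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) \| "
    r"(?P<level>\w+)\s*\| "
    r"(?P<context>\S+) (?P<message>.*)"
)
```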

📜 license

MIT
