xlog

Logging module for the x17 ecosystem.

A flexible and structured logging library for Python with support for multiple outputs, formats, and async processing.

Features

  • Structured Events - Log with context, tags, metrics, and metadata
  • Multiple Formats - JSON, Text, ColorJSON, ColorText
  • Multiple Outputs - Console, files, custom sinks
  • Async Processing - Non-blocking event handling
  • Type Safe - Full protocol support for extensibility
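
The "Structured Events" bullet means each log line carries machine-readable fields alongside the message. As a stdlib-only sketch (not the xlog API itself; the field names follow the Quick Start below, and the "auth" tag is just an illustrative value), such an event serializes to a single JSON line:

```python
import json

# A structured event: a message plus context, metrics, and tags that
# downstream tools can parse without regexes.
event = {
    "message": "User login",
    "level": "INFO",
    "context": {"user_id": "123", "ip": "192.168.1.1"},
    "metrics": {"duration_ms": "45"},
    "tags": ["auth"],
}
line = json.dumps(event, sort_keys=True)
print(line)
```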

Installation

From PyPI:

pip install lib_x17_log

Or, for development, from a local checkout:

pip install -e .

Quick Start

from xlog import LogStream, FileGroup

# Simple console logging
stream = LogStream(name="app", level="INFO")
stream.log("Application started")

# Log with context
stream.log(
    "User login",
    level="INFO",
    context={"user_id": "123", "ip": "192.168.1.1"}
)

# Add file output
file_group = FileGroup(path="./logs", name="app", async_=True)
stream.add_group(file_group)
stream.log("This goes to console and file")
file_group.close()

Core Components

Events

Create structured log events:

from xlog import Log, Procs

# Log event
event = Log(
    message="Request processed",
    level="INFO",
    code=200,
    context={"user": "alice"},
    metrics={"duration_ms": "45"}
)

# Subprocess event
import subprocess
result = subprocess.run(["echo", "hello"], capture_output=True)
proc_event = Procs(proc=result)

Streams

Distribute events to multiple destinations:

from xlog import LogStream, LogGroup, FileGroup

# Create stream with multiple outputs
memory_group = LogGroup(name="memory", store=True)
file_group = FileGroup(path="./logs", name="app")

stream = LogStream(
    name="app",
    level="INFO",
    groups=[memory_group, file_group]
)

Formatters

Choose output format:

from xlog import Json, Text, ColorJson, ColorText, LogStream

# JSON format
stream = LogStream(name="app", format=Json(indent=2))

# Colored text for console
stream = LogStream(name="app", format=ColorText())
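
To see what the choice of formatter changes, here is a stdlib-only illustration (not the xlog formatters themselves) of the same event rendered once as JSON and once as a plain text line:

```python
import json

event = {"level": "INFO", "message": "Application started"}

# What a Json(indent=2)-style formatter emits: structured, parseable.
as_json = json.dumps(event, indent=2)

# What a Text-style formatter emits: compact, human-oriented.
as_text = f"[{event['level']}] {event['message']}"
print(as_text)
```

The Color variants add ANSI escape codes on top of these layouts, so they are best suited to console output.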

Groups (Sinks)

Process and store events:

from xlog import LogGroup, FileGroup

# In-memory storage
group = LogGroup(name="memory", store=True, async_=False)

# File storage with async processing
group = FileGroup(
    path="./logs",
    name="app",
    async_=True,
    max_queue=1000
)
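
Conceptually, an async sink with a bounded queue lets the caller return immediately while a background worker does the slow I/O. A stdlib-only sketch of that pattern (this is an illustration of the idea behind `async_=True` and `max_queue`, not the xlog internals):

```python
import queue
import threading

events = queue.Queue(maxsize=1000)  # analogous to max_queue=1000
processed = []

def worker():
    # Drain the queue until a None sentinel arrives.
    while True:
        item = events.get()
        if item is None:
            break
        processed.append(item)  # stand-in for writing to a file
        events.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

for i in range(3):
    events.put(f"event-{i}")  # returns quickly; the worker handles the I/O

events.put(None)  # signal shutdown, like FileGroup.close()
t.join()
print(processed)
```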

Real-World Examples

Web Application Logging

from xlog import LogStream, FileGroup, ColorText

access_log = FileGroup(path="./logs", name="access", async_=True)
error_log = FileGroup(path="./logs", name="error", async_=True)

access_stream = LogStream(
    name="access",
    level="INFO",
    format=ColorText(),
    groups=[access_log]
)

access_stream.log(
    "GET /api/users 200",
    level="INFO",
    code=200,
    context={"user_id": "123", "ip": "192.168.1.1"},
    metrics={"response_time_ms": "45"}
)

Data Pipeline Logging

from xlog import LogStream, FileGroup

pipeline = LogStream(
    name="etl-pipeline",
    groups=[FileGroup(path="./logs", name="pipeline")]
)

pipeline.log("Extract started", context={"source": "database"})
pipeline.log("Transform completed", metrics={"records": "1000"})
pipeline.log("Load finished", context={"target": "warehouse"})

Microservice Tracing

from xlog import LogStream
import uuid

stream = LogStream(name="order-service", level="INFO")

correlation_id = str(uuid.uuid4())

stream.log(
    "Processing order",
    context={
        "correlation_id": correlation_id,
        "order_id": "ORD-123"
    }
)

stream.log(
    "Calling payment-service",
    context={
        "correlation_id": correlation_id,
        "target_service": "payment-service"
    }
)
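
To continue the trace across service boundaries, a common convention (an assumption here, not something xlog mandates) is to forward the correlation ID in an HTTP header so the downstream service can attach the same ID to its own log context:

```python
import uuid

correlation_id = str(uuid.uuid4())

# "X-Correlation-ID" is a widely used header name for this purpose;
# any agreed-upon header works as long as both services use it.
headers = {"X-Correlation-ID": correlation_id}
```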

More Examples

See the examples/ directory for complete working examples:

  • example_basic_logging.py - Getting started
  • example_file_logging.py - File output
  • example_formatters.py - All formatters
  • example_web_application.py - HTTP logging
  • example_data_pipeline.py - ETL pipeline
  • example_microservice.py - Distributed tracing
  • example_monitoring.py - System monitoring

Documentation

Each module includes comprehensive docstrings:

from xlog import LogStream
help(LogStream)

License

MIT
