
Logging module for the x17 ecosystem


xlog

A flexible and structured logging library for Python with support for multiple outputs, formats, and async processing.

Features

  • Structured Events - Log with context, tags, metrics, and metadata
  • Multiple Formats - JSON, Text, ColorJSON, ColorText
  • Multiple Outputs - Console, files, custom sinks
  • Async Processing - Non-blocking event handling
  • Type Safe - Full protocol support for extensibility

Installation

From PyPI (the published distribution is lib_x17_log):

pip install lib_x17_log

Or, as an editable install from a source checkout:

pip install -e .

Quick Start

from xlog import LogStream, FileGroup

# Simple console logging
stream = LogStream(name="app", level="INFO")
stream.log("Application started")

# Log with context
stream.log(
    "User login",
    level="INFO",
    context={"user_id": "123", "ip": "192.168.1.1"}
)

# Add file output
file_group = FileGroup(path="./logs", name="app", async_=True)
stream.add_group(file_group)
stream.log("This goes to console and file")
file_group.close()

Core Components

Events

Create structured log events:

from xlog import Log, Procs

# Log event
event = Log(
    message="Request processed",
    level="INFO",
    code=200,
    context={"user": "alice"},
    metrics={"duration_ms": "45"}
)

# Subprocess event
import subprocess
result = subprocess.run(["echo", "hello"], capture_output=True)
proc_event = Procs(proc=result)
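
Whatever xlog's internal representation, a structured event like the one above is essentially a dict of message, level, code, context, and metrics that serializes to a single JSON line. A stdlib-only sketch of that shape (the make_event helper is hypothetical, not part of xlog):

```python
import json
from datetime import datetime, timezone

def make_event(message, level="INFO", code=None, context=None, metrics=None):
    """Build a structured event as a plain dict (illustrative only)."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "message": message,
        "level": level,
        "code": code,
        "context": context or {},
        "metrics": metrics or {},
    }

event = make_event(
    "Request processed",
    code=200,
    context={"user": "alice"},
    metrics={"duration_ms": "45"},
)
line = json.dumps(event)  # one JSON object per line, ready for any sink
```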

Streams

Distribute events to multiple destinations:

from xlog import LogStream, LogGroup, FileGroup

# Create stream with multiple outputs
memory_group = LogGroup(name="memory", store=True)
file_group = FileGroup(path="./logs", name="app")

stream = LogStream(
    name="app",
    level="INFO",
    groups=[memory_group, file_group]
)

Formatters

Choose output format:

from xlog import Json, Text, ColorJson, ColorText, LogStream

# JSON format
stream = LogStream(name="app", format=Json(indent=2))

# Colored text for console
stream = LogStream(name="app", format=ColorText())
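
A colored formatter like ColorText presumably works by wrapping parts of each line in ANSI escape codes. A minimal sketch of the idea (color_line is illustrative, not xlog's API):

```python
# ANSI escape codes for common terminal colors (illustrative only)
COLORS = {"INFO": "\033[32m", "WARNING": "\033[33m", "ERROR": "\033[31m"}
RESET = "\033[0m"

def color_line(level, message):
    """Prefix the level with its color code, resetting afterward."""
    color = COLORS.get(level)
    if color is None:
        return f"[{level}] {message}"  # unknown levels stay uncolored
    return f"{color}[{level}]{RESET} {message}"

print(color_line("ERROR", "disk full"))
```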

Groups (Sinks)

Process and store events:

from xlog import LogGroup, FileGroup

# In-memory storage
group = LogGroup(name="memory", store=True, async_=False)

# File storage with async processing
group = FileGroup(
    path="./logs",
    name="app",
    async_=True,
    max_queue=1000
)
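
The async_=True and max_queue options suggest a bounded producer-consumer queue: callers enqueue events and a background worker drains them to storage. A rough stdlib illustration of that pattern (AsyncSink is a made-up class, not xlog's implementation):

```python
import queue
import threading

class AsyncSink:
    """Bounded-queue sink: callers enqueue, a worker thread drains."""

    def __init__(self, max_queue=1000):
        self._q = queue.Queue(maxsize=max_queue)
        self._out = []  # stands in for a file handle
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def emit(self, event):
        # Blocks the caller only when the queue is full (backpressure).
        self._q.put(event)

    def _drain(self):
        while True:
            event = self._q.get()
            if event is None:  # sentinel: stop draining
                break
            self._out.append(event)

    def close(self):
        # Enqueue the sentinel, then wait for the worker to flush everything.
        self._q.put(None)
        self._worker.join()

sink = AsyncSink(max_queue=10)
sink.emit("first event")
sink.emit("second event")
sink.close()  # flushes the queue before returning
```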

Real-World Examples

Web Application Logging

from xlog import LogStream, FileGroup, ColorText

access_log = FileGroup(path="./logs", name="access", async_=True)
error_log = FileGroup(path="./logs", name="error", async_=True)

access_stream = LogStream(
    name="access",
    level="INFO",
    format=ColorText(),
    groups=[access_log]
)

access_stream.log(
    "GET /api/users 200",
    level="INFO",
    code=200,
    context={"user_id": "123", "ip": "192.168.1.1"},
    metrics={"response_time_ms": "45"}
)

Data Pipeline Logging

from xlog import LogStream, FileGroup

pipeline = LogStream(
    name="etl-pipeline",
    groups=[FileGroup(path="./logs", name="pipeline")]
)

pipeline.log("Extract started", context={"source": "database"})
pipeline.log("Transform completed", metrics={"records": "1000"})
pipeline.log("Load finished", context={"target": "warehouse"})

Microservice Tracing

from xlog import LogStream
import uuid

stream = LogStream(name="order-service", level="INFO")

correlation_id = str(uuid.uuid4())

stream.log(
    "Processing order",
    context={
        "correlation_id": correlation_id,
        "order_id": "ORD-123"
    }
)

stream.log(
    "Calling payment-service",
    context={
        "correlation_id": correlation_id,
        "target_service": "payment-service"
    }
)
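
Threading the same correlation_id through every call works, but the stdlib's contextvars can carry it implicitly so call sites don't have to repeat it. A sketch independent of xlog (with_correlation is a hypothetical helper):

```python
import uuid
from contextvars import ContextVar

# Stash the correlation id once per request/task; read it when building context.
correlation_id: ContextVar[str] = ContextVar("correlation_id", default="")

def with_correlation(context=None):
    """Merge the current correlation id into an event's context dict."""
    ctx = dict(context or {})
    ctx.setdefault("correlation_id", correlation_id.get())
    return ctx

correlation_id.set(str(uuid.uuid4()))
ctx = with_correlation({"order_id": "ORD-123"})
```

Passing ctx as the context= argument keeps every log call in the request tagged with the same id.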

More Examples

See the examples/ directory for complete working examples:

  • example_basic_logging.py - Getting started
  • example_file_logging.py - File output
  • example_formatters.py - All formatters
  • example_web_application.py - HTTP logging
  • example_data_pipeline.py - ETL pipeline
  • example_microservice.py - Distributed tracing
  • example_monitoring.py - System monitoring

Documentation

Each module includes comprehensive docstrings:

from xlog import LogStream
help(LogStream)

License

MIT
