xlog

Logging module for the x17 ecosystem.

A flexible and structured logging library for Python with support for multiple outputs, formats, and async processing.

Features

  • Structured Events - Log with context, tags, metrics, and metadata
  • Multiple Formats - JSON and plain text, each with a colorized variant (Json, Text, ColorJson, ColorText)
  • Multiple Outputs - Console, files, custom sinks
  • Async Processing - Non-blocking event handling
  • Type Safe - Full protocol support for extensibility

Installation

From PyPI:

pip install lib_x17_log

From a source checkout:

pip install -e .

Quick Start

from xlog import LogStream, FileGroup

# Simple console logging
stream = LogStream(name="app", level="INFO")
stream.log("Application started")

# Log with context
stream.log(
    "User login",
    level="INFO",
    context={"user_id": "123", "ip": "192.168.1.1"}
)

# Add file output
file_group = FileGroup(path="./logs", name="app", async_=True)
stream.add_group(file_group)
stream.log("This goes to console and file")
file_group.close()
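If an exception is raised between add_group() and close(), the async file sink may never be flushed. contextlib.closing guards against that for any object with a close() method, which FileGroup has per the snippet above. A sketch with a stub standing in for the sink:

```python
from contextlib import closing

class StubGroup:
    """Stand-in for FileGroup: all closing() needs is a close() method."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

group = StubGroup()
with closing(group):
    pass  # stream.add_group(group); stream.log(...) would go here
assert group.closed  # close() ran even though the body did nothing

# The same shape with the real sink from the Quick Start:
# with closing(FileGroup(path="./logs", name="app", async_=True)) as fg:
#     stream.add_group(fg)
#     stream.log("This goes to console and file")
```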

Core Components

Events

Create structured log events:

from xlog import Log, Procs

# Log event
event = Log(
    message="Request processed",
    level="INFO",
    code=200,
    context={"user": "alice"},
    metrics={"duration_ms": "45"}
)

# Subprocess event
import subprocess
result = subprocess.run(["echo", "hello"], capture_output=True)
proc_event = Procs(proc=result)
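When logging subprocess results as plain Log events, a common convention is to derive the level from the exit status. The mapping below is an illustration layered on top of the constructor shown above, not part of the xlog API:

```python
def level_for(returncode):
    """Map a subprocess exit status to a log level (illustrative convention)."""
    return "INFO" if returncode == 0 else "ERROR"

# Combined with the snippet above:
# result = subprocess.run(["echo", "hello"], capture_output=True)
# event = Log(
#     message="command finished",
#     level=level_for(result.returncode),
#     code=result.returncode,
# )
```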

Streams

Distribute events to multiple destinations:

from xlog import LogStream, LogGroup, FileGroup

# Create stream with multiple outputs
memory_group = LogGroup(name="memory", store=True)
file_group = FileGroup(path="./logs", name="app")

stream = LogStream(
    name="app",
    level="INFO",
    groups=[memory_group, file_group]
)

Formatters

Choose an output format:

from xlog import Json, Text, ColorJson, ColorText, LogStream

# JSON format
stream = LogStream(name="app", format=Json(indent=2))

# Colored text for console
stream = LogStream(name="app", format=ColorText())

Groups (Sinks)

Process and store events:

from xlog import LogGroup, FileGroup

# In-memory storage
group = LogGroup(name="memory", store=True, async_=False)

# File storage with async processing
group = FileGroup(
    path="./logs",
    name="app",
    async_=True,
    max_queue=1000
)
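max_queue bounds the in-memory buffer between loggers and the async writer. The README does not state whether FileGroup blocks or drops events when the bound is reached, so the stand-alone sketch below shows only the generic bounded-queue mechanism such a limit implies:

```python
import queue

q = queue.Queue(maxsize=2)  # stand-in for max_queue=2
q.put_nowait("event-1")
q.put_nowait("event-2")

try:
    q.put_nowait("event-3")  # buffer full: producer must block, drop, or fail
    overflowed = False
except queue.Full:
    overflowed = True
```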

Real-World Examples

Web Application Logging

from xlog import LogStream, FileGroup, ColorText

access_log = FileGroup(path="./logs", name="access", async_=True)
error_log = FileGroup(path="./logs", name="error", async_=True)

access_stream = LogStream(
    name="access",
    level="INFO",
    format=ColorText(),
    groups=[access_log]
)
error_stream = LogStream(
    name="error",
    level="ERROR",
    groups=[error_log]
)

access_stream.log(
    "GET /api/users 200",
    level="INFO",
    code=200,
    context={"user_id": "123", "ip": "192.168.1.1"},
    metrics={"response_time_ms": "45"}
)

error_stream.log(
    "GET /api/orders 500",
    level="ERROR",
    code=500,
    context={"user_id": "123", "ip": "192.168.1.1"}
)

Data Pipeline Logging

from xlog import LogStream, FileGroup

pipeline = LogStream(
    name="etl-pipeline",
    groups=[FileGroup(path="./logs", name="pipeline")]
)

pipeline.log("Extract started", context={"source": "database"})
pipeline.log("Transform completed", metrics={"records": "1000"})
pipeline.log("Load finished", context={"target": "warehouse"})
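Since metrics are plain dict entries, per-step timings can be collected with a small helper. timed() below is a hypothetical convenience built only on the stdlib, not part of xlog:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(metrics, key="duration_ms"):
    """Record elapsed wall-clock milliseconds into a metrics dict."""
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed = (time.perf_counter() - start) * 1000
        metrics[key] = str(round(elapsed))

metrics = {}
with timed(metrics):
    sum(range(100_000))  # stand-in for a transform step

# pipeline.log("Transform completed", metrics=metrics)
```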

Microservice Tracing

from xlog import LogStream
import uuid

stream = LogStream(name="order-service", level="INFO")

correlation_id = str(uuid.uuid4())

stream.log(
    "Processing order",
    context={
        "correlation_id": correlation_id,
        "order_id": "ORD-123"
    }
)

stream.log(
    "Calling payment-service",
    context={
        "correlation_id": correlation_id,
        "target_service": "payment-service"
    }
)
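Repeating the correlation id by hand in every call invites drift; a tiny helper that merges it into each context keeps call sites short. with_trace() is hypothetical, built only on the dict-based context shown above:

```python
import uuid

def with_trace(correlation_id, **context):
    """Merge a correlation id into a per-call context dict."""
    return {"correlation_id": correlation_id, **context}

correlation_id = str(uuid.uuid4())
ctx = with_trace(correlation_id, order_id="ORD-123")

# stream.log("Processing order", context=ctx)
# stream.log("Calling payment-service",
#            context=with_trace(correlation_id, target_service="payment-service"))
```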

More Examples

See the examples/ directory for complete working examples:

  • example_basic_logging.py - Getting started
  • example_file_logging.py - File output
  • example_formatters.py - All formatters
  • example_web_application.py - HTTP logging
  • example_data_pipeline.py - ETL pipeline
  • example_microservice.py - Distributed tracing
  • example_monitoring.py - System monitoring

Documentation

Each module includes comprehensive docstrings:

from xlog import LogStream
help(LogStream)

License

MIT

