
Logging module for the x17 ecosystem

Project description

xlog

A flexible and structured logging library for Python with support for multiple outputs, formats, and async processing.

Features

  • Structured Events - Log with context, tags, metrics, and metadata
  • Multiple Formats - JSON, Text, ColorJSON, ColorText
  • Multiple Outputs - Console, files, custom sinks
  • Async Processing - Non-blocking event handling
  • Type Safe - Full protocol support for extensibility
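The "Async Processing" feature follows a standard non-blocking pattern: the caller enqueues an event and returns immediately, while a background worker drains the queue and does the slow I/O. The sketch below illustrates that pattern with the standard library only; it is not xlog's implementation, and the `AsyncSink` class and `max_queue` bound here are illustrative (chosen to mirror the `max_queue` parameter that appears in the Groups section).

```python
import queue
import threading

class AsyncSink:
    """Illustrative non-blocking sink: callers enqueue, a worker thread writes."""

    def __init__(self, max_queue=1000):
        self._queue = queue.Queue(maxsize=max_queue)
        self._records = []  # stand-in for a file or socket
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def log(self, message):
        # Returns immediately; the worker handles the slow part.
        self._queue.put(message)

    def _drain(self):
        while True:
            item = self._queue.get()
            if item is None:  # sentinel: shut down
                break
            self._records.append(item)

    def close(self):
        # Flush remaining items, then stop the worker.
        self._queue.put(None)
        self._worker.join()

sink = AsyncSink()
sink.log("Application started")
sink.close()
print(sink._records)  # ['Application started']
```

A bounded queue (`maxsize`) is what gives `max_queue`-style parameters their meaning: when the queue is full, producers block rather than exhausting memory.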

Installation

pip install lib_x17_log

# or, from a local checkout, for development:
pip install -e .

Quick Start

from xlog import LogStream, FileGroup

# Simple console logging
stream = LogStream(name="app", level="INFO")
stream.log("Application started")

# Log with context
stream.log(
    "User login",
    level="INFO",
    context={"user_id": "123", "ip": "192.168.1.1"}
)

# Add file output
file_group = FileGroup(path="./logs", name="app", async_=True)
stream.add_group(file_group)
stream.log("This goes to console and file")
file_group.close()

Core Components

Events

Create structured log events:

from xlog import Log, Procs

# Log event
event = Log(
    message="Request processed",
    level="INFO",
    code=200,
    context={"user": "alice"},
    metrics={"duration_ms": "45"}
)

# Subprocess event
import subprocess
result = subprocess.run(["echo", "hello"], capture_output=True)
proc_event = Procs(proc=result)

Streams

Distribute events to multiple destinations:

from xlog import LogStream, LogGroup, FileGroup

# Create stream with multiple outputs
memory_group = LogGroup(name="memory", store=True)
file_group = FileGroup(path="./logs", name="app")

stream = LogStream(
    name="app",
    level="INFO",
    groups=[memory_group, file_group]
)

Formatters

Choose output format:

from xlog import Json, Text, ColorJson, ColorText, LogStream

# JSON format
stream = LogStream(name="app", format=Json(indent=2))

# Colored text for console
stream = LogStream(name="app", format=ColorText())

Groups (Sinks)

Process and store events:

from xlog import LogGroup, FileGroup

# In-memory storage
group = LogGroup(name="memory", store=True, async_=False)

# File storage with async processing
group = FileGroup(
    path="./logs",
    name="app",
    async_=True,
    max_queue=1000
)

Real-World Examples

Web Application Logging

from xlog import LogStream, FileGroup, ColorText

access_log = FileGroup(path="./logs", name="access", async_=True)
error_log = FileGroup(path="./logs", name="error", async_=True)

access_stream = LogStream(
    name="access",
    level="INFO",
    format=ColorText(),
    groups=[access_log]
)

access_stream.log(
    "GET /api/users 200",
    level="INFO",
    code=200,
    context={"user_id": "123", "ip": "192.168.1.1"},
    metrics={"response_time_ms": "45"}
)

Data Pipeline Logging

from xlog import LogStream, FileGroup

pipeline = LogStream(
    name="etl-pipeline",
    groups=[FileGroup(path="./logs", name="pipeline")]
)

pipeline.log("Extract started", context={"source": "database"})
pipeline.log("Transform completed", metrics={"records": "1000"})
pipeline.log("Load finished", context={"target": "warehouse"})

Microservice Tracing

from xlog import LogStream
import uuid

stream = LogStream(name="order-service", level="INFO")

correlation_id = str(uuid.uuid4())

stream.log(
    "Processing order",
    context={
        "correlation_id": correlation_id,
        "order_id": "ORD-123"
    }
)

stream.log(
    "Calling payment-service",
    context={
        "correlation_id": correlation_id,
        "target_service": "payment-service"
    }
)
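Threading `correlation_id` through every call site by hand gets noisy. A common complement (a stdlib pattern, not an xlog feature; the helper name `with_correlation` is hypothetical) is to park the id in a `contextvars.ContextVar` and merge it into each log's context:

```python
import uuid
import contextvars

# Holds the correlation id for the current logical request.
correlation_id_var = contextvars.ContextVar("correlation_id", default=None)

def with_correlation(context):
    """Merge the ambient correlation id into a log context dict."""
    cid = correlation_id_var.get()
    return {**context, "correlation_id": cid} if cid else dict(context)

# At request entry:
correlation_id_var.set(str(uuid.uuid4()))

# At any call site, without passing the id explicitly:
ctx = with_correlation({"order_id": "ORD-123"})
print(ctx)  # {'order_id': 'ORD-123', 'correlation_id': '...'}
```

Because `ContextVar` values are scoped per async task and per thread, concurrent requests keep distinct correlation ids without any locking.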

More Examples

See the examples/ directory for complete working examples:

  • example_basic_logging.py - Getting started
  • example_file_logging.py - File output
  • example_formatters.py - All formatters
  • example_web_application.py - HTTP logging
  • example_data_pipeline.py - ETL pipeline
  • example_microservice.py - Distributed tracing
  • example_monitoring.py - System monitoring

Documentation

Each module includes comprehensive docstrings:

from xlog import LogStream
help(LogStream)

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

lib_x17_log-1.0.4.tar.gz (48.9 kB)

Built Distribution


lib_x17_log-1.0.4-py3-none-any.whl (68.4 kB)

File details

Details for the file lib_x17_log-1.0.4.tar.gz.

File metadata

  • Download URL: lib_x17_log-1.0.4.tar.gz
  • Size: 48.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for lib_x17_log-1.0.4.tar.gz:

  • SHA256: 6f429ea88d7ab347c836c8e50cfa43f7cb1a4400af943723ad64ab3dd47a8bd7
  • MD5: 6d60031d78ee12c1cc66fb9ac9e3b884
  • BLAKE2b-256: c385861637199ddc51f545fef28088f23a745d3050a913546b882b8aa68b58bd


File details

Details for the file lib_x17_log-1.0.4-py3-none-any.whl.

File metadata

  • Download URL: lib_x17_log-1.0.4-py3-none-any.whl
  • Size: 68.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for lib_x17_log-1.0.4-py3-none-any.whl:

  • SHA256: 8396c353355090088d7d12f3b8d303893f7371954d090c90ad9a3417c008342d
  • MD5: 21cfc309afbbf265e6627c91bac8ce8a
  • BLAKE2b-256: 2e43d5b8c8b4b402eff289b6d9217146f8050a106945ec025e845525feff2843

