A logging module for the x17 ecosystem

Project description

xlog

A flexible and structured logging library for Python with support for multiple outputs, formats, and async processing.

Features

  • Structured Events - Log with context, tags, metrics, and metadata
  • Multiple Formats - Json, Text, ColorJson, ColorText
  • Multiple Outputs - Console, files, custom sinks
  • Async Processing - Non-blocking event handling
  • Type Safe - Full protocol support for extensibility

Installation

pip install lib_x17_log

Or, for development from a local checkout:

pip install -e .

Quick Start

from xlog import LogStream, FileGroup

# Simple console logging
stream = LogStream(name="app", level="INFO")
stream.log("Application started")

# Log with context
stream.log(
    "User login",
    level="INFO",
    context={"user_id": "123", "ip": "192.168.1.1"}
)

# Add file output
file_group = FileGroup(path="./logs", name="app", async_=True)
stream.add_group(file_group)
stream.log("This goes to console and file")
file_group.close()

Core Components

Events

Create structured log events:

from xlog import Log, Procs

# Log event
event = Log(
    message="Request processed",
    level="INFO",
    code=200,
    context={"user": "alice"},
    metrics={"duration_ms": "45"}
)

# Subprocess event
import subprocess
result = subprocess.run(["echo", "hello"], capture_output=True)
proc_event = Procs(proc=result)
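To see what structured events buy you over interpolated strings, here is a minimal stdlib-only sketch of an event serialized as a JSON line. The dict schema below mirrors the fields shown above but is illustrative; it is not necessarily xlog's actual wire format.

```python
import json

# Illustrative plain-dict stand-in for a structured event.
# Field names mirror the Log(...) example above; the exact
# schema xlog emits may differ.
event = {
    "message": "Request processed",
    "level": "INFO",
    "code": 200,
    "context": {"user": "alice"},
    "metrics": {"duration_ms": "45"},
}

line = json.dumps(event, sort_keys=True)

# Structured fields can be queried mechanically, unlike free-form text:
parsed = json.loads(line)
assert parsed["context"]["user"] == "alice"
```

Because every field survives serialization intact, downstream tools can filter on `level` or aggregate `metrics` without regex parsing.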

Streams

Distribute events to multiple destinations:

from xlog import LogStream, LogGroup, FileGroup

# Create stream with multiple outputs
memory_group = LogGroup(name="memory", store=True)
file_group = FileGroup(path="./logs", name="app")

stream = LogStream(
    name="app",
    level="INFO",
    groups=[memory_group, file_group]
)
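The fan-out idea behind a stream with multiple groups can be pictured with plain callables: format the event once, then hand it to every sink. This is a sketch of the general pattern, not xlog's internals; the names `dispatch` and `sinks` are hypothetical.

```python
from typing import Callable

def dispatch(event: dict, sinks: list[Callable[[dict], None]]) -> None:
    """Hand one event to every registered sink, in order."""
    for sink in sinks:
        sink(event)

memory: list[dict] = []        # cf. LogGroup(store=True)
console_lines: list[str] = []  # cf. a console sink

dispatch(
    {"message": "Application started", "level": "INFO"},
    [
        memory.append,
        lambda e: console_lines.append(f"{e['level']}: {e['message']}"),
    ],
)
```

Each sink decides independently what to do with the event, which is why a single log call above can reach both an in-memory group and a file group.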

Formatters

Choose output format:

from xlog import Json, Text, ColorJson, ColorText, LogStream

# JSON format
stream = LogStream(name="app", format=Json(indent=2))

# Colored text for console
stream = LogStream(name="app", format=ColorText())

Groups (Sinks)

Process and store events:

from xlog import LogGroup, FileGroup

# In-memory storage
group = LogGroup(name="memory", store=True, async_=False)

# File storage with async processing
group = FileGroup(
    path="./logs",
    name="app",
    async_=True,
    max_queue=1000
)
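The async processing that `async_=True` and `max_queue` suggest typically follows a producer/consumer pattern: log calls enqueue events, a worker thread drains the queue, and the bounded queue applies backpressure. Below is a stdlib-only sketch of that pattern; it is not xlog's actual implementation.

```python
import queue
import threading

# Bounded queue, analogous to max_queue=1000 above.
events: "queue.Queue[dict | None]" = queue.Queue(maxsize=1000)
written: list[dict] = []

def worker() -> None:
    """Drain events until a None sentinel arrives."""
    while True:
        item = events.get()
        if item is None:      # sentinel: shut down cleanly
            break
        written.append(item)  # a real sink would write to disk here

t = threading.Thread(target=worker, daemon=True)
t.start()

events.put({"message": "hello", "level": "INFO"})  # non-blocking for the caller
events.put(None)                                   # cf. group.close()
t.join()
```

The sentinel-then-join shutdown is why explicitly closing an async group matters: it guarantees queued events are flushed before the process exits.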

Real-World Examples

Web Application Logging

from xlog import LogStream, FileGroup, ColorText

access_log = FileGroup(path="./logs", name="access", async_=True)
error_log = FileGroup(path="./logs", name="error", async_=True)

access_stream = LogStream(
    name="access",
    level="INFO",
    format=ColorText(),
    groups=[access_log]
)

access_stream.log(
    "GET /api/users 200",
    level="INFO",
    code=200,
    context={"user_id": "123", "ip": "192.168.1.1"},
    metrics={"response_time_ms": "45"}
)

Data Pipeline Logging

from xlog import LogStream, FileGroup

pipeline = LogStream(
    name="etl-pipeline",
    groups=[FileGroup(path="./logs", name="pipeline")]
)

pipeline.log("Extract started", context={"source": "database"})
pipeline.log("Transform completed", metrics={"records": "1000"})
pipeline.log("Load finished", context={"target": "warehouse"})
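The metrics dicts above are plain string-valued mappings, so producing them is up to the caller. Here is a small, hypothetical helper (not part of xlog) that times a pipeline stage and returns a metrics dict in the same style:

```python
import time

def timed_stage(fn):
    """Run fn() and return (result, metrics) with duration in ms."""
    start = time.perf_counter()
    result = fn()
    duration_ms = (time.perf_counter() - start) * 1000
    # String values mirror the metrics examples in this README.
    return result, {"duration_ms": f"{duration_ms:.0f}"}

records, metrics = timed_stage(lambda: list(range(1000)))
# e.g. pipeline.log("Transform completed",
#                   metrics={**metrics, "records": str(len(records))})
```

Keeping the timing logic in one helper means every stage reports durations in the same unit and format.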

Microservice Tracing

from xlog import LogStream
import uuid

stream = LogStream(name="order-service", level="INFO")

correlation_id = str(uuid.uuid4())

stream.log(
    "Processing order",
    context={
        "correlation_id": correlation_id,
        "order_id": "ORD-123"
    }
)

stream.log(
    "Calling payment-service",
    context={
        "correlation_id": correlation_id,
        "target_service": "payment-service"
    }
)
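Passing `correlation_id` to every log call by hand gets tedious in deep call stacks. A common companion pattern (standard Python, not an xlog feature) is to stash it in a `contextvars.ContextVar` so any function on the request path can read it:

```python
import contextvars
import uuid

# One ContextVar per process; each request/task sets its own value.
correlation_id: contextvars.ContextVar[str] = contextvars.ContextVar("correlation_id")

def handle_order() -> dict:
    # Deep in the call stack: build log context without an
    # explicit correlation_id parameter.
    return {"correlation_id": correlation_id.get(), "order_id": "ORD-123"}

token = correlation_id.set(str(uuid.uuid4()))
ctx = handle_order()
correlation_id.reset(token)  # restore previous value when the request ends
```

Because `ContextVar` values are isolated per async task and per thread, concurrent requests don't clobber each other's IDs.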

More Examples

See the examples/ directory for complete working examples:

  • example_basic_logging.py - Getting started
  • example_file_logging.py - File output
  • example_formatters.py - All formatters
  • example_web_application.py - HTTP logging
  • example_data_pipeline.py - ETL pipeline
  • example_microservice.py - Distributed tracing
  • example_monitoring.py - System monitoring

Documentation

Each module includes comprehensive docstrings:

from xlog import LogStream
help(LogStream)

License

MIT

Download files

Download the file for your platform.

Source Distribution

lib_x17_log-1.0.5.tar.gz (50.1 kB)

Built Distribution

lib_x17_log-1.0.5-py3-none-any.whl (70.1 kB)

File details

Details for the file lib_x17_log-1.0.5.tar.gz.

File metadata

  • Size: 50.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for lib_x17_log-1.0.5.tar.gz
Algorithm Hash digest
SHA256 25aa50e5141e9cf26abe5337705605de8b6ebbf99afb994381d4ac8a9b483325
MD5 ac93273017e32b46db87c26a27ade271
BLAKE2b-256 662b79b8ed67be924ea4274237eb0fa95ab5ed8b69c177d0facc47de2fa5ce59


File details

Details for the file lib_x17_log-1.0.5-py3-none-any.whl.

File metadata

  • Size: 70.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for lib_x17_log-1.0.5-py3-none-any.whl
Algorithm Hash digest
SHA256 3bed92c070cf7e9d4bf0b9c0372de559caa0548a4b89765ee2c0c2088aa2fdbf
MD5 ba983fa9d354d349593a1c795aa18426
BLAKE2b-256 12213d2c43f4d10a04a7ef8f5ed5beda2ba109d16d0640d8e4d3a6991039905b

