
LogFlux Python SDK

Official Python SDK for LogFlux.io -- end-to-end encrypted log ingestion with zero-knowledge architecture.

Features

  • End-to-end encryption: AES-256-GCM + RSA key exchange -- your server never sees plaintext
  • All entry types: Logs, Metrics, Events, Traces, Audit, Telemetry
  • Multipart/mixed transport: Binary ciphertext over HTTP (no base64 overhead)
  • Background workers: Non-blocking async queue with daemon threads
  • Breadcrumbs: Automatic trail for error context
  • Distributed tracing: Span creation, child spans, header propagation
  • Sampling: Configurable sample rate (audit entries exempt)
  • BeforeSend hooks: Drop or modify entries before sending
  • Zero dependencies beyond the cryptography package (the de facto standard Python crypto library)
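The sampling rule above can be illustrated conceptually. This is a self-contained sketch of the documented behavior (sampled delivery with an audit exemption), not the SDK's internal code:

```python
import random

def should_send(entry_type: str, sample_rate: float, rng: random.Random) -> bool:
    """Audit entries are always sent; other entries pass with probability sample_rate."""
    if entry_type == "audit":
        return True  # compliance guarantee: never sampled out
    return rng.random() < sample_rate
```

With sample_rate=1.0 everything is delivered; with 0.1 roughly one in ten non-audit entries is kept, while audit entries always go through.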

Installation

pip install logflux-sdk

Quick Start

from logflux import LogFlux, Options

# Initialize
LogFlux.init(Options(
    api_key="eu-lf_your_api_key_here",
    node="my-app",
    environment="production",
))

# Log messages
LogFlux.info("Server started", {"port": "8080"})
LogFlux.warn("High memory usage", {"percent": "92"})

# Metrics
LogFlux.counter("requests_total", 1, {"method": "GET"})
LogFlux.gauge("memory_mb", 512.0)

# Events
LogFlux.event("user.signup", {"plan": "pro"})

# Audit (never sampled -- compliance guarantee)
LogFlux.audit("delete", "admin@example.com", "user", "u-123")

# Error capture with breadcrumbs
try:
    raise ValueError("something went wrong")
except Exception as e:
    LogFlux.capture_error(e, {"request_id": "abc"})

# Cleanup
LogFlux.close()

Module-Level API

import logflux
from logflux import Options

logflux.init(Options(api_key="eu-lf_..."))
logflux.info("hello")
logflux.close()

Environment Variables

The SDK can be configured entirely via environment variables:

export LOGFLUX_API_KEY="eu-lf_your_key"
export LOGFLUX_ENVIRONMENT="production"
export LOGFLUX_NODE="worker-1"

from logflux import LogFlux
LogFlux.init_from_env()
Variable                    Description                 Default
LOGFLUX_API_KEY             API key (required)          --
LOGFLUX_ENVIRONMENT         Environment name            ""
LOGFLUX_NODE                Node/host identifier        hostname
LOGFLUX_LOG_GROUP           Log group                   None
LOGFLUX_QUEUE_SIZE          Queue capacity              1000
LOGFLUX_FLUSH_INTERVAL      Flush interval (seconds)    5
LOGFLUX_BATCH_SIZE          Entries per batch           100
LOGFLUX_WORKER_COUNT        Background workers          2
LOGFLUX_MAX_RETRIES         Max retries                 3
LOGFLUX_HTTP_TIMEOUT        HTTP timeout (seconds)      30
LOGFLUX_FAILSAFE_MODE       Swallow errors              true
LOGFLUX_ENABLE_COMPRESSION  Gzip before encrypt         true
LOGFLUX_DEBUG               Debug logging               false
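Presumably init_from_env() resolves these variables against the defaults in the table. The following is a self-contained sketch of that resolution logic (the dict keys and the function itself are illustrative, not the SDK's actual internals):

```python
import socket

def load_config(env: dict) -> dict:
    """Resolve LogFlux settings from an environment mapping, applying the table's defaults."""
    return {
        "api_key": env["LOGFLUX_API_KEY"],  # required: raises KeyError if missing
        "environment": env.get("LOGFLUX_ENVIRONMENT", ""),
        "node": env.get("LOGFLUX_NODE", socket.gethostname()),
        "log_group": env.get("LOGFLUX_LOG_GROUP"),  # defaults to None
        "queue_size": int(env.get("LOGFLUX_QUEUE_SIZE", "1000")),
        "flush_interval": float(env.get("LOGFLUX_FLUSH_INTERVAL", "5")),
        "batch_size": int(env.get("LOGFLUX_BATCH_SIZE", "100")),
        "worker_count": int(env.get("LOGFLUX_WORKER_COUNT", "2")),
        "max_retries": int(env.get("LOGFLUX_MAX_RETRIES", "3")),
        "http_timeout": float(env.get("LOGFLUX_HTTP_TIMEOUT", "30")),
        "failsafe_mode": env.get("LOGFLUX_FAILSAFE_MODE", "true") == "true",
        "enable_compression": env.get("LOGFLUX_ENABLE_COMPRESSION", "true") == "true",
        "debug": env.get("LOGFLUX_DEBUG", "false") == "true",
    }
```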

Distributed Tracing

from logflux import LogFlux

# Start a root span
span = LogFlux.start_span("http.request", "GET /api/users")
span.set_attribute("http.method", "GET")

# Create child spans
db_span = span.start_child("db.query", "SELECT * FROM users")
db_span.set_attribute("db.type", "postgres")
LogFlux.send_span(db_span)

span.set_status("ok")
LogFlux.send_span(span)

Header Propagation

# Server receives headers from upstream
span = LogFlux.continue_from_headers(
    request.headers,
    "api.handle",
    "Process request",
)
# Pass trace context downstream
downstream_headers = {"X-LogFlux-Trace": span.to_trace_header()}
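The README does not document the wire format of the X-LogFlux-Trace header. As a conceptual illustration only, here is a sketch of building and parsing such a header under an assumed "trace_id-span_id" layout (the format, function names, and field separator are all assumptions):

```python
def make_trace_header(trace_id: str, span_id: str) -> str:
    # Hypothetical wire format: "<trace_id>-<span_id>". The real header
    # layout is not specified in this README.
    return f"{trace_id}-{span_id}"

def parse_trace_header(value: str) -> tuple:
    """Split the hypothetical header back into (trace_id, span_id)."""
    trace_id, _, span_id = value.partition("-")
    return trace_id, span_id
```

In the real SDK you would use span.to_trace_header() and LogFlux.continue_from_headers() as shown above rather than hand-rolling the format.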

Breadcrumbs

LogFlux.add_breadcrumb("navigation", "User opened settings")
LogFlux.add_breadcrumb("http", "POST /api/save", {"status": "200"})

# Breadcrumbs are automatically included in capture_error()
try:
    dangerous_operation()
except Exception as e:
    LogFlux.capture_error(e)  # includes breadcrumb trail
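Conceptually, a breadcrumb trail is a bounded buffer: the newest entries are kept and the oldest are dropped once capacity is reached. A minimal self-contained illustration of that idea (the class, its capacity, and the entry shape are illustrative, not the SDK's implementation):

```python
from collections import deque

class BreadcrumbTrail:
    """Bounded breadcrumb buffer: the oldest entries drop when capacity is exceeded."""

    def __init__(self, max_crumbs=20):
        self._crumbs = deque(maxlen=max_crumbs)

    def add(self, category, message, data=None):
        self._crumbs.append({"category": category, "message": message, "data": data or {}})

    def snapshot(self):
        """Return the current trail, oldest first, for attaching to an error report."""
        return list(self._crumbs)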

Scopes

def handle_request(request):
    def configure_scope(scope):
        scope.set_user(request.user_id)
        scope.set_request("GET", request.path, request.id)
        scope.set_attribute("tenant", request.tenant)

    LogFlux.with_scope(configure_scope)

BeforeSend Hooks

from logflux import Options

def filter_sensitive(entry):
    # Return None to drop the entry
    if "password" in entry.get("message", ""):
        return None
    return entry

LogFlux.init(Options(
    api_key="eu-lf_...",
    before_send=filter_sensitive,
))
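Besides dropping entries, a before_send hook can modify them in place, e.g. redacting sensitive attribute values instead of discarding the whole entry. A sketch of such a hook (the "attributes" key on the entry dict is an assumption about the entry shape):

```python
SENSITIVE_KEYS = {"password", "token", "secret"}

def scrub_attributes(entry: dict) -> dict:
    """Mask sensitive attribute values rather than dropping the entry."""
    attrs = entry.get("attributes", {})
    clean = {k: ("[REDACTED]" if k in SENSITIVE_KEYS else v) for k, v in attrs.items()}
    return {**entry, "attributes": clean}
```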

Django Middleware Example

# myapp/middleware.py
from logflux import LogFlux
import time

class LogFluxMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        start = time.time()
        span = LogFlux.start_span("http.request", f"{request.method} {request.path}")

        response = self.get_response(request)

        duration = (time.time() - start) * 1000
        span.set_attribute("http.status", str(response.status_code))
        span.set_attribute("http.duration_ms", f"{duration:.1f}")
        LogFlux.send_span(span)

        return response

Flask Example

from flask import Flask, g, request
from logflux import LogFlux
import time

app = Flask(__name__)

@app.before_request
def before_request():
    g.span = LogFlux.start_span("http.request", f"{request.method} {request.path}")
    g.start_time = time.time()

@app.after_request
def after_request(response):
    if hasattr(g, "span"):
        duration = (time.time() - g.start_time) * 1000
        g.span.set_attribute("http.status", str(response.status_code))
        g.span.set_attribute("http.duration_ms", f"{duration:.1f}")
        LogFlux.send_span(g.span)
    return response

FastAPI Example

from fastapi import FastAPI, Request
from logflux import LogFlux
import time

app = FastAPI()

@app.middleware("http")
async def logflux_middleware(request: Request, call_next):
    span = LogFlux.start_span("http.request", f"{request.method} {request.url.path}")
    start = time.time()

    response = await call_next(request)

    duration = (time.time() - start) * 1000
    span.set_attribute("http.status", str(response.status_code))
    span.set_attribute("http.duration_ms", f"{duration:.1f}")
    LogFlux.send_span(span)

    return response

Entry Types

Type       Code  Method                  Description
Log        1     info(), error(), etc.   Application logs (all severity levels)
Metric     2     counter(), gauge()      Numeric measurements
Trace      3     send_span()             Distributed tracing spans
Event      4     event()                 Business/application events
Audit      5     audit()                 Compliance audit trail (Object Lock)
Telemetry  6     telemetry()             IoT/device sensor readings

API Key Format

API keys follow the format <region>-lf_<key>:

  • eu-lf_abc123 (Europe)
  • us-lf_xyz789 (United States)

Valid regions: eu, us, ca, au, ap
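A quick client-side sanity check for this format could look like the following. Note the allowed character set for the key body is an assumption; only the region prefixes and the -lf_ separator are documented above:

```python
import re

# Regions come from the list above; the [A-Za-z0-9]+ key body is an assumption.
KEY_RE = re.compile(r"^(eu|us|ca|au|ap)-lf_[A-Za-z0-9]+$")

def is_valid_api_key(key: str) -> bool:
    """Return True if the key matches the documented <region>-lf_<key> format."""
    return KEY_RE.fullmatch(key) is not None
```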

Development

# Build and test (Docker only)
make docker-build
make test
make build

# Interactive shell
make shell

License

Elastic License 2.0 (ELv2)

Copyright 2026 LogFlux.io
