
LogFlux Python SDK


Official Python SDK for LogFlux.io -- end-to-end encrypted log ingestion with zero-knowledge architecture.

Features

  • End-to-end encryption: AES-256-GCM + RSA key exchange -- your server never sees plaintext
  • All entry types: Logs, Metrics, Events, Traces, Audit, Telemetry
  • Multipart/mixed transport: Binary ciphertext over HTTP (no base64 overhead)
  • Background workers: Non-blocking async queue with daemon threads
  • Breadcrumbs: Automatic trail for error context
  • Distributed tracing: Span creation, child spans, header propagation
  • Sampling: Configurable sample rate (audit entries exempt)
  • BeforeSend hooks: Drop or modify entries before sending
  • Zero dependencies beyond cryptography (the industry-standard encryption library)

Installation

pip install logflux-sdk

Quick Start

from logflux import LogFlux, Options

# Initialize
LogFlux.init(Options(
    api_key="eu-lf_your_api_key_here",
    node="my-app",
    environment="production",
))

# Log messages
LogFlux.info("Server started", {"port": "8080"})
LogFlux.warn("High memory usage", {"percent": "92"})

# Metrics
LogFlux.counter("requests_total", 1, {"method": "GET"})
LogFlux.gauge("memory_mb", 512.0)

# Events
LogFlux.event("user.signup", {"plan": "pro"})

# Audit (never sampled -- compliance guarantee)
LogFlux.audit("delete", "admin@example.com", "user", "u-123")

# Error capture with breadcrumbs
try:
    raise ValueError("something went wrong")
except Exception as e:
    LogFlux.capture_error(e, {"request_id": "abc"})

# Cleanup
LogFlux.close()

Module-Level API

import logflux
from logflux import Options

logflux.init(Options(api_key="eu-lf_..."))
logflux.info("hello")
logflux.close()

Environment Variables

The SDK can be configured entirely via environment variables:

export LOGFLUX_API_KEY="eu-lf_your_key"
export LOGFLUX_ENVIRONMENT="production"
export LOGFLUX_NODE="worker-1"

from logflux import LogFlux
LogFlux.init_from_env()

Variable                     Description               Default
LOGFLUX_API_KEY              API key (required)        --
LOGFLUX_ENVIRONMENT          Environment name          ""
LOGFLUX_NODE                 Node/host identifier      hostname
LOGFLUX_LOG_GROUP            Log group                 None
LOGFLUX_QUEUE_SIZE           Queue capacity            1000
LOGFLUX_FLUSH_INTERVAL       Flush interval (seconds)  5
LOGFLUX_BATCH_SIZE           Entries per batch         100
LOGFLUX_WORKER_COUNT         Background workers        2
LOGFLUX_MAX_RETRIES          Max retries               3
LOGFLUX_HTTP_TIMEOUT         HTTP timeout (seconds)    30
LOGFLUX_FAILSAFE_MODE        Swallow errors            true
LOGFLUX_ENABLE_COMPRESSION   Gzip before encrypt       true
LOGFLUX_DEBUG                Debug logging             false
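
The defaults above can be mirrored in a small helper when you want to inspect the effective configuration yourself. This is a hypothetical sketch only -- LogFlux.init_from_env() performs its own parsing internally:

```python
import os

# Hypothetical helper mirroring the documented LOGFLUX_* defaults.
# The SDK reads these itself in LogFlux.init_from_env(); this sketch
# only shows what would be picked up from the environment.
def read_logflux_env(env=os.environ):
    def as_int(name, default):
        return int(env.get(name, default))

    def as_bool(name, default):
        return env.get(name, default).lower() in ("1", "true", "yes")

    return {
        "api_key": env.get("LOGFLUX_API_KEY"),        # required, no default
        "environment": env.get("LOGFLUX_ENVIRONMENT", ""),
        "node": env.get("LOGFLUX_NODE", ""),          # SDK falls back to hostname
        "log_group": env.get("LOGFLUX_LOG_GROUP"),    # default None
        "queue_size": as_int("LOGFLUX_QUEUE_SIZE", "1000"),
        "flush_interval": as_int("LOGFLUX_FLUSH_INTERVAL", "5"),
        "batch_size": as_int("LOGFLUX_BATCH_SIZE", "100"),
        "worker_count": as_int("LOGFLUX_WORKER_COUNT", "2"),
        "max_retries": as_int("LOGFLUX_MAX_RETRIES", "3"),
        "http_timeout": as_int("LOGFLUX_HTTP_TIMEOUT", "30"),
        "failsafe_mode": as_bool("LOGFLUX_FAILSAFE_MODE", "true"),
        "enable_compression": as_bool("LOGFLUX_ENABLE_COMPRESSION", "true"),
        "debug": as_bool("LOGFLUX_DEBUG", "false"),
    }
```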

Distributed Tracing

from logflux import LogFlux

# Start a root span
span = LogFlux.start_span("http.request", "GET /api/users")
span.set_attribute("http.method", "GET")

# Create child spans
db_span = span.start_child("db.query", "SELECT * FROM users")
db_span.set_attribute("db.type", "postgres")
LogFlux.send_span(db_span)

span.set_status("ok")
LogFlux.send_span(span)

Header Propagation

# Server receives headers from upstream
span = LogFlux.continue_from_headers(
    request.headers,
    "api.handle",
    "Process request",
)
# Pass trace context downstream
downstream_headers = {"X-LogFlux-Trace": span.to_trace_header()}
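
When calling a downstream service, the trace header usually needs to be merged into whatever headers the request already carries. A small pure helper keeps that logic testable (the helper itself is illustrative, not part of the SDK; only to_trace_header() comes from LogFlux):

```python
def with_trace_header(headers, trace_header_value):
    """Return a copy of `headers` with the LogFlux trace header attached.

    `trace_header_value` would come from span.to_trace_header(); this
    helper is an illustrative sketch, not an SDK function.
    """
    merged = dict(headers or {})
    merged["X-LogFlux-Trace"] = trace_header_value
    return merged
```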

Breadcrumbs

LogFlux.add_breadcrumb("navigation", "User opened settings")
LogFlux.add_breadcrumb("http", "POST /api/save", {"status": "200"})

# Breadcrumbs are automatically included in capture_error()
try:
    dangerous_operation()
except Exception as e:
    LogFlux.capture_error(e)  # includes breadcrumb trail

Scopes

def handle_request(request):
    def configure_scope(scope):
        scope.set_user(request.user_id)
        scope.set_request("GET", request.path, request.id)
        scope.set_attribute("tenant", request.tenant)

    LogFlux.with_scope(configure_scope)

BeforeSend Hooks

from logflux import Options

def filter_sensitive(entry):
    # Return None to drop the entry
    if "password" in entry.get("message", ""):
        return None
    return entry

LogFlux.init(Options(
    api_key="eu-lf_...",
    before_send=filter_sensitive,
))
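
A hook can also redact fields rather than drop the whole entry. This sketch follows the documented contract (return None to drop, return the entry to send) but assumes entries are plain dicts with a "metadata" mapping -- that field name is illustrative:

```python
SENSITIVE_KEYS = {"password", "token", "secret", "api_key"}

def redact_sensitive(entry):
    # Drop the entry entirely if the message itself leaks a credential.
    if "password" in entry.get("message", ""):
        return None
    # Otherwise redact sensitive metadata values in place.
    metadata = entry.get("metadata")  # illustrative field name
    if isinstance(metadata, dict):
        for key in metadata:
            if key.lower() in SENSITIVE_KEYS:
                metadata[key] = "[REDACTED]"
    return entry
```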

Django Middleware Example

# myapp/middleware.py
from logflux import LogFlux
import time

class LogFluxMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        start = time.time()
        span = LogFlux.start_span("http.request", f"{request.method} {request.path}")

        response = self.get_response(request)

        duration = (time.time() - start) * 1000
        span.set_attribute("http.status", str(response.status_code))
        span.set_attribute("http.duration_ms", f"{duration:.1f}")
        LogFlux.send_span(span)

        return response

Flask Example

from flask import Flask, g, request
from logflux import LogFlux
import time

app = Flask(__name__)

@app.before_request
def before_request():
    g.span = LogFlux.start_span("http.request", f"{request.method} {request.path}")
    g.start_time = time.time()

@app.after_request
def after_request(response):
    if hasattr(g, "span"):
        duration = (time.time() - g.start_time) * 1000
        g.span.set_attribute("http.status", str(response.status_code))
        g.span.set_attribute("http.duration_ms", f"{duration:.1f}")
        LogFlux.send_span(g.span)
    return response

FastAPI Example

from fastapi import FastAPI, Request
from logflux import LogFlux
import time

app = FastAPI()

@app.middleware("http")
async def logflux_middleware(request: Request, call_next):
    span = LogFlux.start_span("http.request", f"{request.method} {request.url.path}")
    start = time.time()

    response = await call_next(request)

    duration = (time.time() - start) * 1000
    span.set_attribute("http.status", str(response.status_code))
    span.set_attribute("http.duration_ms", f"{duration:.1f}")
    LogFlux.send_span(span)

    return response

Entry Types

Type       Code  Method                 Description
Log        1     info(), error(), etc.  Application logs (all severity levels)
Metric     2     counter(), gauge()     Numeric measurements
Trace      3     send_span()            Distributed tracing spans
Event      4     event()                Business/application events
Audit      5     audit()                Compliance audit trail (Object Lock)
Telemetry  6     telemetry()            IoT/device sensor readings
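
The numeric codes above are handy when inspecting decrypted payloads by hand; they can be captured as a small lookup (a convenience sketch -- check the SDK for official constants before relying on these names):

```python
# Entry type codes from the table above (sketch; the SDK may export
# its own constants for these).
ENTRY_TYPES = {
    1: "log",
    2: "metric",
    3: "trace",
    4: "event",
    5: "audit",
    6: "telemetry",
}

def entry_type_name(code: int) -> str:
    return ENTRY_TYPES.get(code, "unknown")
```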

API Key Format

API keys follow the format <region>-lf_<key>:

  • eu-lf_abc123 (Europe)
  • us-lf_xyz789 (United States)

Valid regions: eu, us, ca, au, ap
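
If you want to fail fast on a malformed key before calling init(), the documented format can be checked locally. This validator is a sketch based only on the format above (it assumes an alphanumeric key body; the server performs the authoritative validation):

```python
import re

VALID_REGIONS = ("eu", "us", "ca", "au", "ap")
# Assumes the key body is alphanumeric -- the docs only specify
# the <region>-lf_<key> shape.
_KEY_RE = re.compile(r"^(%s)-lf_[A-Za-z0-9]+$" % "|".join(VALID_REGIONS))

def looks_like_api_key(key: str) -> bool:
    """Check the documented <region>-lf_<key> shape (sketch only)."""
    return bool(_KEY_RE.match(key))
```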

Development

# Build and test (Docker only)
make docker-build
make test
make build

# Interactive shell
make shell

License

Elastic License 2.0 (ELv2)

Copyright 2026 LogFlux.io

Download files

Download the file for your platform.

Source Distribution

logflux_sdk-3.0.1.tar.gz (38.9 kB)

Built Distribution


logflux_sdk-3.0.1-py3-none-any.whl (34.3 kB)

File details

Details for the file logflux_sdk-3.0.1.tar.gz.

File metadata

  • Download URL: logflux_sdk-3.0.1.tar.gz
  • Upload date:
  • Size: 38.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

Hashes for logflux_sdk-3.0.1.tar.gz

Algorithm    Hash digest
SHA256       8e90fcf437ca06580407e6a4adb1600874f25365e5c305a5a90f2b92ce4f079c
MD5          9dcf1a4d39ccb7da244d3db9e38709ec
BLAKE2b-256  7c7b7a6796f3120c317596449ba4f43ea01c818cad8948bfe4efd9747f70af3a


File details

Details for the file logflux_sdk-3.0.1-py3-none-any.whl.

File metadata

  • Download URL: logflux_sdk-3.0.1-py3-none-any.whl
  • Upload date:
  • Size: 34.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

Hashes for logflux_sdk-3.0.1-py3-none-any.whl

Algorithm    Hash digest
SHA256       a53ceb79123dcd816bf3c97be8a064a77861d91fd593def3b32c3fa69fbbc8e7
MD5          1e5070a1208592ccade83fe5059519c7
BLAKE2b-256  8d53ed94c2c517a2f1fa0ff23febae745ffb1bdf52af887a9ad5471675a5d4bd

