LogTide Python SDK
Official Python SDK for LogTide — self-hosted log management with async client, logging integration, batching, retry, circuit breaker, and middleware.
Features
- Sync & async clients — `LogTideClient` (requests) and `AsyncLogTideClient` (aiohttp)
- stdlib `logging` integration — drop-in `LogTideHandler` for existing logging setups
- Automatic batching with configurable size and interval
- Retry logic with exponential backoff
- Circuit breaker pattern for fault tolerance
- Payload limits — field truncation, base64 removal, field exclusion, max entry size
- Max buffer size with silent drop policy to prevent memory leaks
- Query API for searching and filtering logs
- Live tail with Server-Sent Events (SSE)
- Trace ID context for distributed tracing
- Global metadata added to all logs
- Structured exception serialization with parsed stack frames
- Internal metrics (logs sent, errors, latency, circuit breaker trips)
- Flask, Django, FastAPI & Starlette middleware for auto-logging HTTP requests
- Full Python 3.8+ support with type hints
Requirements
- Python 3.8 or higher
Installation
```bash
pip install logtide-sdk
```
Optional Dependencies
```bash
# Async client (AsyncLogTideClient)
pip install logtide-sdk[async]

# Flask middleware
pip install logtide-sdk[flask]

# Django middleware
pip install logtide-sdk[django]

# FastAPI middleware
pip install logtide-sdk[fastapi]

# Starlette middleware (standalone, without FastAPI)
pip install logtide-sdk[starlette]

# Install all extras
pip install logtide-sdk[async,flask,django,fastapi,starlette]
```
Quick Start
```python
from logtide_sdk import LogTideClient, ClientOptions

client = LogTideClient(
    ClientOptions(
        api_url='http://localhost:8080',
        api_key='lp_your_api_key_here',
    )
)

client.info('api-gateway', 'Server started', {'port': 3000})
client.error('database', 'Connection failed', Exception('Timeout'))

# Graceful shutdown (also registered automatically via atexit)
client.close()
```
Configuration Options
Basic Options
| Option | Type | Default | Description |
|---|---|---|---|
| `api_url` | `str` | required | Base URL of your LogTide instance |
| `api_key` | `str` | required | Project API key (starts with `lp_`) |
| `batch_size` | `int` | `100` | Logs per batch before an immediate flush |
| `flush_interval` | `int` | `5000` | Auto-flush interval in ms |
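To illustrate how `batch_size` and a periodic flush interact, here is a minimal, hypothetical sketch of a batching buffer — an illustration of the pattern only, not the SDK's internal implementation. The buffer flushes immediately once it reaches `batch_size`; in the real client, a background timer additionally flushes whatever has accumulated every `flush_interval` ms.

```python
class BatchBuffer:
    """Minimal sketch of size-triggered batching (illustration only)."""

    def __init__(self, batch_size=100, sender=print):
        self.batch_size = batch_size
        self.sender = sender  # callable that ships a list of entries
        self.buffer = []

    def add(self, entry):
        self.buffer.append(entry)
        if len(self.buffer) >= self.batch_size:
            self.flush()  # immediate flush when the batch is full

    def flush(self):
        if self.buffer:
            self.sender(self.buffer)
            self.buffer = []


# A background timer would call flush() every flush_interval ms in the real client.
batches = []
buf = BatchBuffer(batch_size=3, sender=batches.append)
for i in range(7):
    buf.add({'n': i})
buf.flush()  # final partial batch of 1 entry
```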
Advanced Options
| Option | Type | Default | Description |
|---|---|---|---|
| `max_buffer_size` | `int` | `10000` | Max buffered logs; excess are silently dropped |
| `max_retries` | `int` | `3` | Max retry attempts on send failure |
| `retry_delay_ms` | `int` | `1000` | Initial retry delay (doubles each attempt) |
| `circuit_breaker_threshold` | `int` | `5` | Consecutive failures before opening circuit |
| `circuit_breaker_reset_ms` | `int` | `30000` | Time before testing a half-open circuit |
| `debug` | `bool` | `False` | Print debug output to console |
| `global_metadata` | `dict` | `{}` | Metadata merged into every log entry |
| `auto_trace_id` | `bool` | `False` | Auto-generate a UUID trace ID per log |
| `payload_limits` | `PayloadLimitsOptions` | see below | Safeguards against oversized payloads |
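The retry and circuit-breaker options above can be pictured with a small state-machine sketch. This is a generic illustration of the pattern, not the SDK's internal code: after `circuit_breaker_threshold` consecutive failures the circuit opens and sends are skipped; once `circuit_breaker_reset_ms` has elapsed, a single probe is allowed (half-open), and a success closes the circuit again. Retry delays start at `retry_delay_ms` and double on each attempt.

```python
import time


class CircuitBreaker:
    """Sketch of the CLOSED -> OPEN -> HALF_OPEN cycle (illustration only)."""

    def __init__(self, threshold=5, reset_ms=30000, clock=time.monotonic):
        self.threshold = threshold
        self.reset_s = reset_ms / 1000.0
        self.clock = clock  # injectable clock makes the sketch testable
        self.failures = 0
        self.opened_at = None
        self.state = 'CLOSED'

    def allow_request(self):
        if self.state == 'OPEN':
            if self.clock() - self.opened_at >= self.reset_s:
                self.state = 'HALF_OPEN'  # allow a single probe request
                return True
            return False
        return True

    def record_success(self):
        self.failures = 0
        self.state = 'CLOSED'

    def record_failure(self):
        self.failures += 1
        if self.state == 'HALF_OPEN' or self.failures >= self.threshold:
            self.state = 'OPEN'
            self.opened_at = self.clock()


def backoff_delays(retry_delay_ms=1000, max_retries=3):
    """Delay doubles each attempt: with the defaults, 1000, 2000, 4000 ms."""
    return [retry_delay_ms * (2 ** i) for i in range(max_retries)]
```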
Payload Limits
`PayloadLimitsOptions` prevents HTTP 413 errors from oversized entries.
| Field | Default | Description |
|---|---|---|
| `max_field_size` | `10 * 1024` (10 KB) | Max length of any single string field |
| `max_log_size` | `100 * 1024` (100 KB) | Max total serialized entry size |
| `exclude_fields` | `[]` | Field names replaced with `"[EXCLUDED]"` |
| `truncation_marker` | `"...[TRUNCATED]"` | Appended to truncated strings |
```python
from logtide_sdk import LogTideClient, ClientOptions, PayloadLimitsOptions

client = LogTideClient(
    ClientOptions(
        api_url='http://localhost:8080',
        api_key='lp_your_api_key_here',
        payload_limits=PayloadLimitsOptions(
            max_field_size=5 * 1024,
            exclude_fields=['password', 'token'],
        ),
    )
)
```
Base64-encoded strings (data URIs or long base64 blobs) are automatically replaced with "[BASE64 DATA REMOVED]".
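A rough sketch of what these safeguards do to a metadata dict — a hypothetical helper written for illustration, not the SDK's actual code: over-long string fields are truncated and marked, and excluded field names are masked.

```python
def apply_payload_limits(metadata, max_field_size=10 * 1024,
                         exclude_fields=(), marker='...[TRUNCATED]'):
    """Sketch of field truncation and exclusion (illustration only)."""
    out = {}
    for key, value in metadata.items():
        if key in exclude_fields:
            out[key] = '[EXCLUDED]'  # sensitive field masked by name
        elif isinstance(value, str) and len(value) > max_field_size:
            out[key] = value[:max_field_size] + marker  # truncate and mark
        else:
            out[key] = value
    return out


limited = apply_payload_limits(
    {'password': 'hunter2', 'note': 'x' * 20, 'n': 1},
    max_field_size=10,
    exclude_fields=['password'],
)
# limited['password'] is '[EXCLUDED]'; limited['note'] ends with '...[TRUNCATED]'
```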
Sync Client
Logging Methods
```python
client.debug('service', 'Debug message')
client.info('service', 'Info message', {'userId': 123})
client.warn('service', 'Warning message')
client.error('service', 'Error message', {'custom': 'data'})
client.critical('service', 'Critical message')
```
Exception Auto-Serialization
Pass an `Exception` directly to `error()` or `critical()` — it is serialized automatically:
```python
try:
    raise RuntimeError('Database timeout')
except Exception as e:
    client.error('database', 'Query failed', e)
```
Generated metadata:
```json
{
  "exception": {
    "type": "RuntimeError",
    "message": "Database timeout",
    "language": "python",
    "stacktrace": [
      {"file": "app.py", "function": "run_query", "line": 42}
    ],
    "raw": "Traceback (most recent call last):\n ..."
  }
}
```
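A structure like the one above can be produced with the stdlib `traceback` module. The helper below is a sketch of how such serialization might look, not necessarily the SDK's exact logic:

```python
import traceback


def serialize_exception(exc):
    """Sketch: turn an exception into a structured dict with stack frames."""
    tb = traceback.TracebackException.from_exception(exc)
    return {
        'exception': {
            'type': type(exc).__name__,
            'message': str(exc),
            'language': 'python',
            'stacktrace': [
                # one entry per frame, innermost last
                {'file': f.filename, 'function': f.name, 'line': f.lineno}
                for f in tb.stack
            ],
            'raw': ''.join(tb.format()),
        }
    }


try:
    raise RuntimeError('Database timeout')
except RuntimeError as e:
    info = serialize_exception(e)
```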
Async Client
`AsyncLogTideClient` is the async equivalent, built on aiohttp. It is best used as an async context manager.
```bash
pip install logtide-sdk[async]
```
```python
import asyncio

from logtide_sdk import AsyncLogTideClient, ClientOptions

async def main():
    async with AsyncLogTideClient(ClientOptions(
        api_url='http://localhost:8080',
        api_key='lp_your_api_key_here',
    )) as client:
        await client.info('my-service', 'Hello from async!')
        await client.error('my-service', 'Something failed', Exception('oops'))

asyncio.run(main())
```
Manual lifecycle (without context manager):
```python
client = AsyncLogTideClient(options)
await client.start()  # starts background flush loop
try:
    await client.info('svc', 'message')
finally:
    await client.close()
```
All sync logging, query, stream, and metrics methods have async equivalents.
stdlib logging Integration
`LogTideHandler` is a standard `logging.Handler` — drop it into any existing logging setup.
```python
import logging

from logtide_sdk import LogTideClient, ClientOptions, LogTideHandler

client = LogTideClient(ClientOptions(
    api_url='http://localhost:8080',
    api_key='lp_your_api_key_here',
))

handler = LogTideHandler(client=client, service='my-service')
handler.setLevel(logging.WARNING)

logger = logging.getLogger(__name__)
logger.addHandler(handler)

# These are forwarded to LogTide automatically
logger.warning('Low disk space')
logger.error('Unhandled exception', exc_info=True)
```
Exception info is serialized with full structured stack frames when `exc_info=True` is used.
Trace ID Context
Manual Trace ID
```python
client.set_trace_id('request-123')
client.info('api', 'Request received')
client.info('db', 'Querying users')
client.info('api', 'Response sent')
client.set_trace_id(None)  # clear
```
Scoped Trace ID (Context Manager)
```python
with client.with_trace_id('request-456'):
    client.info('api', 'Processing in context')
    client.warn('cache', 'Cache miss')
# Trace ID automatically restored after block
```
Auto-Generated Trace ID
```python
with client.with_new_trace_id():
    client.info('worker', 'Background job started')
    client.info('worker', 'Job completed')
```
Query API
Basic Query
```python
from datetime import datetime, timedelta

from logtide_sdk import QueryOptions, LogLevel

result = client.query(
    QueryOptions(
        service='api-gateway',
        level=LogLevel.ERROR,
        from_time=datetime.now() - timedelta(hours=24),
        to_time=datetime.now(),
        limit=100,
        offset=0,
    )
)

print(f"Found {result.total} logs")
for log in result.logs:
    print(log)
```
Full-Text Search
```python
result = client.query(QueryOptions(q='timeout', limit=50))
```
Get Logs by Trace ID
```python
logs = client.get_by_trace_id('trace-123')
```
Aggregated Statistics
```python
from logtide_sdk import AggregatedStatsOptions

stats = client.get_aggregated_stats(
    AggregatedStatsOptions(
        from_time=datetime.now() - timedelta(days=7),
        to_time=datetime.now(),
        interval='1h',
    )
)

for service in stats.top_services:
    print(f"{service['service']}: {service['count']} logs")
```
Live Streaming (SSE)
`stream()` runs in a background daemon thread and returns immediately with a stop function.
```python
def handle_log(log):
    print(f"[{log['time']}] {log['level']}: {log['message']}")

stop = client.stream(
    on_log=handle_log,
    on_error=lambda e: print(f"Stream error: {e}"),
    filters={'service': 'api-gateway', 'level': 'error'},
)

# ... later, to stop:
stop()
```
Async streaming runs as a cancellable coroutine:
```python
task = asyncio.create_task(client.stream(on_log=handle_log))
# ... later:
task.cancel()
```
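The cancel pattern above is standard asyncio task handling. Here is a self-contained sketch using a stand-in coroutine in place of the real SSE stream (the `fake_stream` helper is hypothetical, for illustration only):

```python
import asyncio


async def fake_stream(on_log):
    """Stand-in for client.stream(): emits events until cancelled."""
    i = 0
    try:
        while True:
            on_log({'message': f'event {i}'})
            i += 1
            await asyncio.sleep(0.01)
    except asyncio.CancelledError:
        raise  # propagate so the task finishes in a cancelled state


async def main():
    events = []
    task = asyncio.create_task(fake_stream(events.append))
    await asyncio.sleep(0.05)  # let a few events arrive
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        pass  # expected: the stream task was cancelled
    return events


events = asyncio.run(main())
```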
Metrics
```python
metrics = client.get_metrics()
print(f"Logs sent: {metrics.logs_sent}")
print(f"Logs dropped: {metrics.logs_dropped}")
print(f"Errors: {metrics.errors}")
print(f"Retries: {metrics.retries}")
print(f"Avg latency: {metrics.avg_latency_ms:.1f}ms")
print(f"Circuit breaker trips: {metrics.circuit_breaker_trips}")

print(client.get_circuit_breaker_state())  # CLOSED | OPEN | HALF_OPEN

client.reset_metrics()
```
Middleware Integration
Flask
```python
from flask import Flask

from logtide_sdk import LogTideClient, ClientOptions
from logtide_sdk.middleware import LogTideFlaskMiddleware

app = Flask(__name__)

client = LogTideClient(ClientOptions(
    api_url='http://localhost:8080',
    api_key='lp_your_api_key_here',
))

LogTideFlaskMiddleware(
    app,
    client=client,
    service_name='flask-api',
    log_requests=True,
    log_responses=True,
    skip_paths=['/metrics'],
)
```
Django
```python
# settings.py
from logtide_sdk import LogTideClient, ClientOptions

LOGTIDE_CLIENT = LogTideClient(ClientOptions(
    api_url='http://localhost:8080',
    api_key='lp_your_api_key_here',
))
LOGTIDE_SERVICE_NAME = 'django-api'

MIDDLEWARE = [
    'logtide_sdk.middleware.LogTideDjangoMiddleware',
    # ...
]
```
FastAPI
```python
from fastapi import FastAPI

from logtide_sdk import LogTideClient, ClientOptions
from logtide_sdk.middleware import LogTideFastAPIMiddleware

app = FastAPI()

client = LogTideClient(ClientOptions(
    api_url='http://localhost:8080',
    api_key='lp_your_api_key_here',
))

app.add_middleware(LogTideFastAPIMiddleware, client=client, service_name='fastapi-api')
```
Starlette (standalone)
```bash
pip install logtide-sdk[starlette]
```
```python
from starlette.applications import Starlette

from logtide_sdk import LogTideClient, ClientOptions
from logtide_sdk.middleware import LogTideStarletteMiddleware

app = Starlette()

client = LogTideClient(ClientOptions(
    api_url='http://localhost:8080',
    api_key='lp_your_api_key_here',
))

app.add_middleware(LogTideStarletteMiddleware, client=client, service_name='starlette-api')
```
All middleware auto-logs requests, responses (with duration and status code), and errors (with serialized exception metadata). Health-check paths (`/health`, `/healthz`) are skipped by default.
Examples
See the examples/ directory for complete working examples:
- `basic.py` — simple usage
- `advanced.py` — all advanced features
Best Practices
Use Global Metadata
```python
import os

client = LogTideClient(ClientOptions(
    api_url='http://localhost:8080',
    api_key='lp_your_api_key_here',
    global_metadata={
        'env': os.getenv('APP_ENV', 'production'),
        'version': '2.0.0',
        'region': 'eu-west-1',
    },
))
```
Monitor Metrics in Production
```python
import threading
import time

def _monitor():
    while True:
        m = client.get_metrics()
        if m.logs_dropped > 0:
            print(f"WARNING: {m.logs_dropped} logs dropped")
        if m.circuit_breaker_trips > 0:
            print("ERROR: Circuit breaker tripped")
        time.sleep(60)

threading.Thread(target=_monitor, daemon=True).start()
```
Contributing
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
License
MIT License — see LICENSE for details.
Download files
Source Distribution
Built Distribution
File details
Details for the file logtide_sdk-0.8.4rc1.tar.gz.
File metadata
- Download URL: logtide_sdk-0.8.4rc1.tar.gz
- Upload date:
- Size: 30.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `c699df742a42b28f90ca0bd49110cf35ad939ef3565b5ee3e2f9bbd2c03b21c9` |
| MD5 | `820071fffc626e117b4b8ca00322a5ba` |
| BLAKE2b-256 | `793f86bcb35601209e5a86cdb4230dc114e24e3501cb31f6f6d55836b3ce210e` |
Provenance
The following attestation bundles were made for logtide_sdk-0.8.4rc1.tar.gz:

Publisher: release.yml on logtide-dev/logtide-python

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: logtide_sdk-0.8.4rc1.tar.gz
- Subject digest: c699df742a42b28f90ca0bd49110cf35ad939ef3565b5ee3e2f9bbd2c03b21c9
- Sigstore transparency entry: 1154676015
- Sigstore integration time:
- Permalink: logtide-dev/logtide-python@232ac3f2cc5cf2adc0f0080a7bdb39a31b1d66c3
- Branch / Tag: refs/tags/v0.8.4rc1
- Owner: https://github.com/logtide-dev
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@232ac3f2cc5cf2adc0f0080a7bdb39a31b1d66c3
- Trigger Event: push
File details
Details for the file logtide_sdk-0.8.4rc1-py3-none-any.whl.
File metadata
- Download URL: logtide_sdk-0.8.4rc1-py3-none-any.whl
- Upload date:
- Size: 29.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `af55b28cbef6d01eacc2ea477d79306ebfb5a8b055bebac1eedc64b7d09825be` |
| MD5 | `71a4c6424864eceb78d84f2b1dda2804` |
| BLAKE2b-256 | `931d1867543fd122b130bbfd8d0c775673a6a76a0dd4e87f9cf466d4be116adf` |
Provenance
The following attestation bundles were made for logtide_sdk-0.8.4rc1-py3-none-any.whl:

Publisher: release.yml on logtide-dev/logtide-python

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: logtide_sdk-0.8.4rc1-py3-none-any.whl
- Subject digest: af55b28cbef6d01eacc2ea477d79306ebfb5a8b055bebac1eedc64b7d09825be
- Sigstore transparency entry: 1154676017
- Sigstore integration time:
- Permalink: logtide-dev/logtide-python@232ac3f2cc5cf2adc0f0080a7bdb39a31b1d66c3
- Branch / Tag: refs/tags/v0.8.4rc1
- Owner: https://github.com/logtide-dev
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@232ac3f2cc5cf2adc0f0080a7bdb39a31b1d66c3
- Trigger Event: push