
A simple, user-friendly Python client for sending events and traces to Honeycomb, with connection management and decorator support for timing and custom fields.

Honeycomb Python Client

A simple, user-friendly wrapper for sending events to Honeycomb using libhoney, with built-in connection management to prevent file descriptor exhaustion in high-throughput environments.

Installation

Install from PyPI with pip install capture_hc, or copy the honeycomb_client.py file (and __init__.py) directly into your project.

Usage

from capture_hc.honeycomb_client import HoneycombClient

# Initialize the client (use your own writekey and dataset)
honey = HoneycombClient(
    writekey="<YOUR_WRITE_KEY>", 
    dataset="<YOUR_DATASET>",
    batch_size=50,        # Flush after 50 events (default)
    flush_interval=5.0    # Or flush every 5 seconds (default)
)

# Send an event (fields as a dictionary)
honey.send_event({
    "alert_name": "Test Alert",
    "priority": "P1",
    "message": "Something happened!",
    "duration_ms": 123.4
})

# Use the timed decorator to automatically measure and send duration
@honey.timed({"alert_name": "important_task"})
def important_task(x, y, event=None):
    # ... your code ...
    event.add_field("result", x + y)
    return x + y

important_task(1, 2)

# You can also customize the event variable name
@honey.timed({"alert_name": "custom_var"}, event_arg='track')
def another_task(x, y, track=None):
    track.add_field("custom_field", x * y)
    return x * y

another_task(2, 3)
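The timing behavior above can be sketched roughly as follows. This is an illustrative re-implementation, not the package's actual source: the _Event class and the send_event callback are stand-ins for the real client internals.

```python
import functools
import time

def timed(send_event, extra_fields=None, event_arg="event"):
    """Illustrative sketch of a timing decorator: measure duration_ms,
    merge extra fields, and let the wrapped function add its own fields
    via an injected event object."""
    class _Event:
        def __init__(self):
            self.fields = dict(extra_fields or {})
        def add_field(self, name, value):
            self.fields[name] = value

    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            ev = _Event()
            if event_arg is not None:
                kwargs[event_arg] = ev  # inject under the configured name
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                # Send even if the function raises, so failures are still timed.
                ev.add_field("duration_ms", (time.perf_counter() - start) * 1000)
                send_event(ev.fields)
        return wrapper
    return decorator
```

This mirrors why the examples declare event=None (or track=None): the decorator supplies the event as a keyword argument at call time.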

Features

  • Connection Management: Prevents "Too many open files" errors with intelligent batching
  • Batch Processing: Configurable batch size and flush intervals to optimize performance
  • Thread-Safe: Safe for concurrent use in multi-threaded environments
  • Simple send_event(dict) interface; every field in the dictionary is added to the event
  • @honey.timed({...}) decorator to measure and send function execution time automatically
  • Automatic cleanup on process exit

Connection Management

The client automatically manages connections to prevent file descriptor exhaustion:

  • Batch Size: Events are batched and sent together (default: 50 events)
  • Flush Interval: Events are flushed after a time interval (default: 5 seconds)
  • Connection Pooling: Limits concurrent connections (max 10 by default)
  • Thread Safety: All operations are thread-safe

Tuning for High-Throughput

For environments with many concurrent tasks (like Airflow):

# For high-throughput environments
honey = HoneycombClient(
    writekey="<YOUR_WRITE_KEY>",
    dataset="<YOUR_DATASET>",
    batch_size=100,       # Larger batches
    flush_interval=10.0   # Longer intervals
)

# Force flush when needed
honey.flush()

# Clean up when done
honey.close()

Advanced

You can pass debug=True when constructing the client to enable debug logging.

Integration Example

Set your environment variables and run the script to send a test event and a timed event:

export HONEYCOMB_WRITEKEY=your_writekey
export HONEYCOMB_DATASET=your_dataset
python -m integration_example

This will:

  • Send a simple event to Honeycomb
  • Use the @honey.timed decorator to send a timed event with custom fields

Example from integration_example.py:

@honey.timed({'alert_name': 'decorator_test'})
def test_func(event=None):
    event.add_field('custom_field', 123)
    return 'decorator event sent!'

Lazy decorator (module or class)

Use the one-line lazy decorator, which initializes the client at call time and works with legacy functions:

from capture_hc import lazy_timed

@lazy_timed(extra_fields={'alert_name': 'legacy_task'}, event_arg=None)
def legacy_task(x, y):
    return x + y

Or use it via the class for discoverability:

from capture_hc import HoneycombClient

@HoneycombClient.lazy_timed(extra_fields={'alert_name': 'legacy_task'}, event_arg=None)
def legacy_task(x, y):
    return x + y

  • Credentials resolve from env by default (HONEYCOMB_WRITEKEY, HONEYCOMB_DATASET).
  • Set event_arg=None if your function can't accept the event kwarg.
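The lazy behavior can be sketched as follows. This is an assumed re-implementation, not the package source: the client dict is a stand-in for HoneycombClient, built on first call so importing the module has no side effects.

```python
import functools
import os
import time

def lazy_timed(extra_fields=None, event_arg=None):
    """Illustrative sketch: defer client creation until the first call,
    then time each invocation. Only the event_arg=None case (legacy
    signatures, no injected event) is shown."""
    def decorator(func):
        client = None  # created lazily, never at import time

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            nonlocal client
            if client is None:
                # Credentials resolve from the environment at call time,
                # mirroring HONEYCOMB_WRITEKEY / HONEYCOMB_DATASET above.
                client = {
                    "writekey": os.environ.get("HONEYCOMB_WRITEKEY"),
                    "dataset": os.environ.get("HONEYCOMB_DATASET"),
                    "events": [],
                }
            fields = dict(extra_fields or {})
            start = time.perf_counter()
            try:
                # event_arg=None: the wrapped function never sees the
                # event, so legacy signatures work unchanged.
                return func(*args, **kwargs)
            finally:
                fields["duration_ms"] = (time.perf_counter() - start) * 1000
                client["events"].append(fields)
        return wrapper
    return decorator
```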

Airflow note

  • Prefer lazy decorators or initialize the client inside the task function to avoid DAG-parse-time side effects.
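One way to follow this advice, sketched generically (make_client and send_alert_task are hypothetical names; make_client stands in for constructing HoneycombClient):

```python
import os

def make_client():
    """Hypothetical stand-in for HoneycombClient(...): credentials are
    read only when this runs, not when the module is imported."""
    return {
        "writekey": os.environ.get("HONEYCOMB_WRITEKEY"),
        "dataset": os.environ.get("HONEYCOMB_DATASET"),
    }

# Module level: define the task, but create no client here. Parsing the
# DAG file therefore opens no connections and reads no credentials.

def send_alert_task(**context):
    honey = make_client()  # constructed at execution time, inside the task
    # ... send events with honey, then flush/close ...
    return honey
```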

Download files

Download the file for your platform.

Source Distribution

capture_hc-0.2.0.tar.gz (7.6 kB)

Built Distribution

capture_hc-0.2.0-py3-none-any.whl (6.9 kB)

File details

Details for the file capture_hc-0.2.0.tar.gz.

File metadata

  • Download URL: capture_hc-0.2.0.tar.gz
  • Upload date:
  • Size: 7.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.6

File hashes

Hashes for capture_hc-0.2.0.tar.gz:

  • SHA256: a85c9697ef91c076f29f46094be748a8fa042152c0213d9fa9cccd917a5e9267
  • MD5: 760af0267b87fc187b50430d68fddc85
  • BLAKE2b-256: 4fc8be32b81627b25beff5c581d36d36deb5a96fd1efa681061e48d0bd56685d

File details

Details for the file capture_hc-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: capture_hc-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 6.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.6

File hashes

Hashes for capture_hc-0.2.0-py3-none-any.whl:

  • SHA256: 6c79055a9d7360484ff64a7e2aea83a8caa4e51c42de43cfb2c23ac40f99f6ba
  • MD5: eaed999642987bf3853ba84c2217dfa5
  • BLAKE2b-256: f4809ca9e8d20178184e05613f5ca3dde04d8b6675502618fceaf3410e5d7e2b
