Kafka clients

Project description

dgkafka

A Python package for working with Apache Kafka, with support for multiple data formats.

Installation

pip install dgkafka

For Avro support (requires additional dependencies):

pip install dgkafka[avro]

For JSON support (requires additional dependencies):

pip install dgkafka[json]

Features

  • Producers and consumers for different data formats:
    • Raw messages (bytes/strings)
    • JSON
    • Avro (with Schema Registry integration)
  • Robust error handling
  • Comprehensive operation logging
  • Context manager support
  • Flexible configuration

Quick Start

Basic Producer/Consumer

from dgkafka import KafkaProducer, KafkaConsumer

# Producer
with KafkaProducer(bootstrap_servers='localhost:9092') as producer:
    producer.produce('test_topic', 'Hello, Kafka!')

# Consumer
with KafkaConsumer(bootstrap_servers='localhost:9092', group_id='test_group') as consumer:
    consumer.subscribe(['test_topic'])
    for msg in consumer.consume():
        print(msg.value())

JSON Support

from dgkafka import JsonKafkaProducer, JsonKafkaConsumer

# Producer
with JsonKafkaProducer(bootstrap_servers='localhost:9092') as producer:
    producer.produce('json_topic', {'key': 'value'})

# Consumer
with JsonKafkaConsumer(bootstrap_servers='localhost:9092', group_id='json_group') as consumer:
    consumer.subscribe(['json_topic'])
    for msg in consumer.consume():
        print(msg.value())  # Automatically deserialized JSON

Avro Support

from dgkafka import AvroKafkaProducer, AvroKafkaConsumer

# Producer
value_schema = {
    "type": "record",
    "name": "User",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "age", "type": "int"}
    ]
}

with AvroKafkaProducer(
    schema_registry_url='http://localhost:8081',
    bootstrap_servers='localhost:9092',
    default_value_schema=value_schema
) as producer:
    producer.produce('avro_topic', {'name': 'Alice', 'age': 30})

# Consumer
with AvroKafkaConsumer(
    schema_registry_url='http://localhost:8081',
    bootstrap_servers='localhost:9092',
    group_id='avro_group'
) as consumer:
    consumer.subscribe(['avro_topic'])
    for msg in consumer.consume():
        print(msg.value())  # Automatically deserialized Avro object
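
dgkafka builds on confluent-kafka (see the KafkaException import in the error-handling example below), so schemas can be pre-registered with that library's Schema Registry client. A minimal sketch; the subject name avro_topic-value assumes the default TopicNameStrategy:

import json

from confluent_kafka.schema_registry import Schema, SchemaRegistryClient

# Register the value schema under the conventional "<topic>-value" subject
client = SchemaRegistryClient({'url': 'http://localhost:8081'})
schema_id = client.register_schema(
    'avro_topic-value',
    Schema(json.dumps(value_schema), schema_type='AVRO')
)
print(f'Registered schema id: {schema_id}')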

Classes

Base Classes

  • KafkaProducer - base message producer
  • KafkaConsumer - base message consumer

Specialized Classes

  • JsonKafkaProducer - JSON message producer (inherits from KafkaProducer)
  • JsonKafkaConsumer - JSON message consumer (inherits from KafkaConsumer)
  • AvroKafkaProducer - Avro message producer (inherits from KafkaProducer)
  • AvroKafkaConsumer - Avro message consumer (inherits from KafkaConsumer)

Configuration

All classes accept standard Kafka configuration parameters:

config = {
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'my_group',
    'auto.offset.reset': 'earliest'
}
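
The dotted keys above are standard librdkafka properties, so the same mapping can also be passed straight to a plain confluent-kafka client; a minimal sketch ('my_topic' is a placeholder):

from confluent_kafka import Consumer

# The config dict above is already a valid confluent-kafka consumer config
consumer = Consumer(config)
consumer.subscribe(['my_topic'])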

Avro classes require an additional parameter:

  • schema_registry_url - Schema Registry URL

Logging

All classes use dglog.Logger for logging. You can provide a custom logger:

from dglog import Logger

logger = Logger()
# Pass the logger via the logger_ keyword argument
producer = KafkaProducer(logger_=logger, ...)

Best Practices

  1. Always use context managers (with) for proper resource cleanup
  2. Implement error handling and retry logic for production use
  3. Pre-register Avro schemas in Schema Registry
  4. Configure appropriate acks and retries parameters for producers (see the sketch after this list)
  5. Monitor consumer lag and producer throughput
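
A minimal sketch of practices 2 and 4, using the confluent-kafka producer directly; the acks and retries values are illustrative assumptions, not recommendations:

from confluent_kafka import Producer

def delivery_report(err, msg):
    # Called once per message with the delivery outcome
    if err is not None:
        print(f'Delivery failed: {err}')
    else:
        print(f'Delivered to {msg.topic()} [{msg.partition()}]')

producer = Producer({
    'bootstrap.servers': 'localhost:9092',
    'acks': 'all',   # practice 4: wait for all in-sync replicas
    'retries': 5,    # practice 4: illustrative retry count
})
producer.produce('events', b'payload', callback=delivery_report)
producer.flush()  # practice 2: surface delivery errors before exiting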

Advanced Usage

Custom Serialization

# Custom Avro serializer
class CustomAvroProducer(AvroKafkaProducer):
    def _serialize_value(self, value):
        # Custom serialization logic: e.g. drop None fields before
        # delegating to the base Avro serializer
        value = {k: v for k, v in value.items() if v is not None}
        return super()._serialize_value(value)

Message Headers

# Adding headers to messages
headers = {
    'correlation_id': '12345',
    'message_type': 'user_update'
}

message_data = {'user_id': 42}  # illustrative payload

producer.produce(
    topic='events',
    value=message_data,
    headers=headers
)
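
On the consumer side, confluent-kafka exposes headers as a list of (key, bytes) tuples; a hedged sketch, assuming consume() yields confluent-kafka Message objects as the examples above suggest:

for msg in consumer.consume():
    for key, value in (msg.headers() or []):
        # Header values arrive as raw bytes
        print(key, value)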

Error Handling

from confluent_kafka import KafkaException

try:
    with AvroKafkaProducer(...) as producer:
        producer.produce(...)
except KafkaException as e:
    print(f"Kafka error occurred: {e}")
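
For per-message errors when driving the underlying confluent-kafka consumer directly, check msg.error(); a minimal sketch (group id and topic are illustrative):

from confluent_kafka import Consumer, KafkaError

consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'error_demo',
})
consumer.subscribe(['avro_topic'])

msg = consumer.poll(timeout=1.0)
if msg is not None and msg.error():
    if msg.error().code() == KafkaError._PARTITION_EOF:
        pass  # end of partition, not a failure
    else:
        print(f'Consumer error: {msg.error()}')
consumer.close()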

Performance Tips

  1. Batch messages when possible (batch.num.messages config; see the sketch after this list)
  2. Adjust linger.ms for better batching
  3. Use compression.type (lz4, snappy, or gzip)
  4. Tune fetch.max.bytes and max.partition.fetch.bytes for consumers
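
All four tips map to standard librdkafka properties; a hedged sketch with illustrative values:

producer_config = {
    'bootstrap.servers': 'localhost:9092',
    'batch.num.messages': 10000,  # tip 1: allow larger batches
    'linger.ms': 5,               # tip 2: wait briefly to fill batches
    'compression.type': 'lz4',    # tip 3: compress message batches
}

consumer_config = {
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'perf_group',
    'fetch.max.bytes': 52428800,           # tip 4: 50 MB per fetch request
    'max.partition.fetch.bytes': 1048576,  # tip 4: 1 MB per partition
}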

License

MIT

Download files

Download the file for your platform.

Source Distribution

dgkafka-1.0.2a1.tar.gz (12.5 kB)

Built Distribution

dgkafka-1.0.2a1-py3-none-any.whl (13.3 kB)

File details

Details for the file dgkafka-1.0.2a1.tar.gz.

File metadata

  • Download URL: dgkafka-1.0.2a1.tar.gz
  • Upload date:
  • Size: 12.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for dgkafka-1.0.2a1.tar.gz

  • SHA256: 537802bbbcf311182bfffdd30cc7cfa2608059c7c8d388d7185f43021b2499ae
  • MD5: b483a0e83de677152ce8f3355d1cb499
  • BLAKE2b-256: cc39c14dd8c7e9cf0da2f60bb03c3d45595e6d05eabc2837a15c3b6a50f5f7bf

File details

Details for the file dgkafka-1.0.2a1-py3-none-any.whl.

File metadata

  • Download URL: dgkafka-1.0.2a1-py3-none-any.whl
  • Upload date:
  • Size: 13.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for dgkafka-1.0.2a1-py3-none-any.whl

  • SHA256: b40cdaca302481ae62842949cb0e389a0a75f0138584b1b691eec1c0c7bbc853
  • MD5: 287f7b1809c68680b449cb824cb03b1b
  • BLAKE2b-256: 6e3ffa25107bf76e28a8340165a0815ca608bedcdf7b87450ed8617bf9911501
