
Superstream optimization library for Kafka producers


Superclient Python

A Python library for automatically optimizing Kafka producer configurations based on topic-specific recommendations.

Overview

Superstream Clients works as a Python import hook that intercepts Kafka producer creation and applies optimized configurations without requiring any code changes in your application. It dynamically retrieves optimization recommendations from Superstream and applies them based on impact analysis.
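The interception pattern can be sketched as follows. This is a hedged illustration only: `apply_overrides`, the placeholder settings, and the stand-in producer class are hypothetical names invented for the sketch, not superclient's actual internals.

```python
# Hypothetical sketch of producer-creation interception. The real library
# installs a .pth import hook; the names and values below are illustrative.

def apply_overrides(configs):
    """Merge optimized settings into the user's config, keeping user values."""
    optimized = {"compression_type": "snappy", "batch_size": 32768}  # placeholders
    merged = dict(configs)
    for key, value in optimized.items():
        merged.setdefault(key, value)  # never override an explicit user setting
    return merged

class StandInProducer:
    """Stand-in for a real Kafka producer class."""
    def __init__(self, **configs):
        self.config = configs

class PatchedProducer(StandInProducer):
    """What an import hook could substitute for the original class."""
    def __init__(self, **configs):
        super().__init__(**apply_overrides(configs))

producer = PatchedProducer(bootstrap_servers="localhost:9092", batch_size=16384)
print(producer.config["compression_type"])  # snappy (added by the hook)
print(producer.config["batch_size"])        # 16384 (user value preserved)
```

Because overrides are applied with `setdefault`, a setting you pass explicitly always wins over a recommended one in this sketch.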

Supported Libraries

Works with any Python library that implements Kafka producers, including:

  • kafka-python
  • aiokafka
  • confluent-kafka
  • Faust
  • FastAPI event publishers
  • Celery Kafka backends
  • Any custom wrapper around these Kafka clients

Features

  • Zero-code integration: No code changes required in your application
  • Dynamic configuration: Applies optimized settings based on topic-specific recommendations
  • Intelligent optimization: Identifies the most impactful topics to optimize
  • Graceful fallback: Falls back to default settings if optimization fails
  • Minimal overhead: Uses a single lightweight background thread (or async coroutine for aiokafka)
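The single background thread mentioned above might look roughly like this. The `fetch_recommendations` callable and the refresh interval are assumptions made for illustration, not superclient's actual API.

```python
import threading
import time

# Hedged sketch of a single lightweight refresh thread; the callable and
# interval are illustrative assumptions, not superclient's real internals.
def start_refresh_thread(fetch_recommendations, interval_s=60.0):
    def loop():
        while True:
            fetch_recommendations()  # pull fresh recommendations
            time.sleep(interval_s)   # then idle until the next cycle
    t = threading.Thread(target=loop, daemon=True)  # daemon: won't block exit
    t.start()
    return t
```

Because the thread is a daemon, it never prevents the application from exiting, which keeps the overhead of running it negligible.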

Important: Producer Configuration Requirements

When initializing your Kafka producers, please ensure you pass the configuration as a mutable object. The Superstream library needs to modify the producer configuration to apply optimizations. The following initialization patterns are supported:

Supported (Recommended):

# Using kafka-python
from kafka import KafkaProducer
producer = KafkaProducer(
    bootstrap_servers=['localhost:9092'],
    compression_type='snappy',
    batch_size=16384
)

# Using aiokafka
from aiokafka import AIOKafkaProducer
producer = AIOKafkaProducer(
    bootstrap_servers='localhost:9092',
    compression_type='snappy',
    batch_size=16384
)

# Using confluent-kafka
from confluent_kafka import Producer
producer = Producer({
    'bootstrap.servers': 'localhost:9092',
    'compression.type': 'snappy',
    'batch.size': 16384
})

Not Supported:

# Using frozen dictionaries or immutable configurations
from types import MappingProxyType
from confluent_kafka import Producer
config = MappingProxyType({
    'bootstrap.servers': 'localhost:9092'
})
producer = Producer(config)  # immutable mapping cannot be updated in place

Why This Matters

The Superstream library needs to modify your producer's configuration to apply optimizations based on your cluster's characteristics. This includes adjusting settings like compression, batch size, and other performance parameters. When the configuration is immutable, these optimizations cannot be applied.
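The difference is easy to demonstrate with plain Python mappings: mutating a `MappingProxyType` raises `TypeError`, which is why such configs cannot be optimized in place.

```python
from types import MappingProxyType

# A plain dict can be updated in place, as the optimizer requires:
config = {"compression.type": "gzip"}
config["compression.type"] = "snappy"  # fine

# An immutable mapping rejects the same update:
frozen = MappingProxyType({"compression.type": "gzip"})
try:
    frozen["compression.type"] = "snappy"
except TypeError:
    print("frozen config cannot be modified")
```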

Installation

pip install superstream-clients && python -m superclient install_pth

That's it! Superclient will now automatically load and optimize all Kafka producers in your Python environment.

Usage

After installation, superclient works automatically. Just use your Kafka clients as usual:

# kafka-python
from kafka import KafkaProducer
producer = KafkaProducer(bootstrap_servers='localhost:9092')
# Automatically optimized!

# confluent-kafka
from confluent_kafka import Producer
producer = Producer({'bootstrap.servers': 'localhost:9092'})
# Automatically optimized!

# aiokafka
from aiokafka import AIOKafkaProducer
producer = AIOKafkaProducer(bootstrap_servers='localhost:9092')
# Automatically optimized!

Docker Integration

When using Superstream Clients with containerized applications, include the package in your Dockerfile:

FROM python:3.8-slim

# Install superclient
RUN pip install superstream-clients
RUN python -m superclient install_pth

# Your application code
COPY . /app
WORKDIR /app

# Run your application
CMD ["python", "your_app.py"]

Required Environment Variables

  • SUPERSTREAM_TOPICS_LIST: Comma-separated list of topics your application produces to

Optional Environment Variables

  • SUPERSTREAM_LATENCY_SENSITIVE: Set to "true" to prevent any modification to linger.ms values
  • SUPERSTREAM_DISABLED: Set to "true" to disable optimization
  • SUPERSTREAM_DEBUG: Set to "true" to enable debug logs

Example:

export SUPERSTREAM_TOPICS_LIST=orders,payments,user-events
export SUPERSTREAM_LATENCY_SENSITIVE=true
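For reference, here is how the documented variables could be parsed on the application side. Only the variable names come from the list above; the function itself is an illustration, not superclient code.

```python
import os

def read_superstream_env(env=os.environ):
    """Illustrative parsing of the documented variables (not superclient code)."""
    raw_topics = env.get("SUPERSTREAM_TOPICS_LIST", "")
    return {
        "topics": [t.strip() for t in raw_topics.split(",") if t.strip()],
        "latency_sensitive": env.get("SUPERSTREAM_LATENCY_SENSITIVE", "").lower() == "true",
        "disabled": env.get("SUPERSTREAM_DISABLED", "").lower() == "true",
        "debug": env.get("SUPERSTREAM_DEBUG", "").lower() == "true",
    }

settings = read_superstream_env({"SUPERSTREAM_TOPICS_LIST": "orders,payments",
                                 "SUPERSTREAM_LATENCY_SENSITIVE": "true"})
print(settings["topics"])  # ['orders', 'payments']
```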

Prerequisites

  • Python 3.8 or higher
  • Kafka cluster that is connected to the Superstream console
  • Read and write permissions to the superstream.* topics

License

Apache License 2.0
