
Superclient Python

Superstream optimization library for Kafka producers

A Python library for automatically optimizing Kafka producer configurations based on topic-specific recommendations.

Overview

Superstream Clients works as a Python import hook that intercepts Kafka producer creation and applies optimized configurations without requiring any code changes in your application. It dynamically retrieves optimization recommendations from Superstream and applies them based on impact analysis.
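The interception can be pictured as wrapping the producer class so that recommended settings are merged into the caller's configuration at construction time. A minimal sketch of the idea (illustrative only; `FakeProducer`, `wrap_producer`, and the merge policy are assumptions, not superclient's actual internals):

```python
class FakeProducer:
    """Stand-in for a real Kafka producer class."""
    def __init__(self, **config):
        self.config = config

def wrap_producer(cls, recommendations):
    """Return a subclass that merges recommended settings into the config."""
    class Wrapped(cls):
        def __init__(self, **config):
            # Assumed merge policy: explicit caller settings take precedence.
            merged = {**recommendations, **config}
            super().__init__(**merged)
    return Wrapped

Producer = wrap_producer(FakeProducer, {"compression_type": "snappy"})
producer = Producer(bootstrap_servers="localhost:9092")
print(producer.config["compression_type"])  # → snappy
```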

Supported Libraries

Works with any Python library that implements Kafka producers, including:

  • kafka-python
  • aiokafka
  • confluent-kafka
  • Faust
  • FastAPI event publishers
  • Celery Kafka backends
  • Any custom wrapper around these Kafka clients

Features

  • Zero-code integration: No code changes required in your application
  • Dynamic configuration: Applies optimized settings based on topic-specific recommendations
  • Intelligent optimization: Identifies the most impactful topics to optimize
  • Graceful fallback: Falls back to default settings if optimization fails
  • Minimal overhead: Uses a single lightweight background thread (or async coroutine for aiokafka)
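The background thread mentioned above can be pictured as a daemon that periodically refreshes cached recommendations without blocking the application. A hedged sketch of that pattern (the polling interval and fetch step are assumptions for illustration):

```python
import threading

def poll_recommendations(stop_event, interval=1.0):
    """Periodically refresh cached recommendations until asked to stop."""
    while not stop_event.is_set():
        # A real implementation would fetch recommendations from
        # Superstream here and update a shared cache.
        stop_event.wait(interval)

stop_event = threading.Event()
worker = threading.Thread(
    target=poll_recommendations, args=(stop_event, 0.1), daemon=True
)
worker.start()

# ... application runs; on shutdown:
stop_event.set()
worker.join(timeout=1.0)
print(worker.is_alive())  # → False
```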

Important: Producer Configuration Requirements

When initializing your Kafka producers, pass the configuration as a mutable object. The Superstream library needs to modify the producer configuration at creation time to apply its optimizations. The following initialization patterns are supported:

Supported (Recommended):

# Using kafka-python
from kafka import KafkaProducer
producer = KafkaProducer(
    bootstrap_servers=['localhost:9092'],
    compression_type='snappy',
    batch_size=16384
)

# Using aiokafka
from aiokafka import AIOKafkaProducer
producer = AIOKafkaProducer(
    bootstrap_servers='localhost:9092',
    compression_type='snappy',
    batch_size=16384
)

# Using confluent-kafka
from confluent_kafka import Producer
producer = Producer({
    'bootstrap.servers': 'localhost:9092',
    'compression.type': 'snappy',
    'batch.size': 16384
})

Not Supported:

# Using frozen dictionaries or immutable configurations
from types import MappingProxyType
from kafka import KafkaProducer
config = MappingProxyType({
    'bootstrap_servers': 'localhost:9092'
})
producer = KafkaProducer(**config)

Why This Matters

The Superstream library needs to modify your producer's configuration to apply optimizations based on your cluster's characteristics. This includes adjusting settings like compression, batch size, and other performance parameters. When the configuration is immutable, these optimizations cannot be applied.
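To see the difference concretely, compare a plain dict with a read-only view of the same data: assignment into a `MappingProxyType` raises a `TypeError`, so a library has no way to inject settings into it:

```python
from types import MappingProxyType

config = {"bootstrap.servers": "localhost:9092"}
frozen = MappingProxyType(config)

config["compression.type"] = "snappy"  # mutable dict: works
try:
    frozen["linger.ms"] = 5  # read-only view: fails
except TypeError as exc:
    print("cannot modify:", exc)
```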

Installation

pip install superstream-clients && python -m superclient install_pth

That's it! Superclient will now automatically load and optimize all Kafka producers in your Python environment.
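The `install_pth` step works through Python's site mechanism: any line in a `.pth` file under site-packages that starts with `import` is executed when the interpreter starts, which is how superclient can load before your application code runs. A self-contained demonstration of that mechanism (using a temp directory rather than your real site-packages):

```python
import os
import site
import sys
import tempfile

# Lines in a .pth file that start with "import" are executed by
# site.addsitedir() / addpackage(), normally at interpreter startup.
pth_dir = tempfile.mkdtemp()
with open(os.path.join(pth_dir, "demo.pth"), "w") as f:
    f.write("import sys; sys._demo_pth_ran = True\n")

site.addsitedir(pth_dir)  # done automatically for real site-packages
print(getattr(sys, "_demo_pth_ran", False))  # → True
```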

Usage

After installation, superclient works automatically. Just use your Kafka clients as usual:

# kafka-python
from kafka import KafkaProducer
producer = KafkaProducer(bootstrap_servers='localhost:9092')
# Automatically optimized!

# confluent-kafka
from confluent_kafka import Producer
producer = Producer({'bootstrap.servers': 'localhost:9092'})
# Automatically optimized!

# aiokafka
from aiokafka import AIOKafkaProducer
producer = AIOKafkaProducer(bootstrap_servers='localhost:9092')
# Automatically optimized!

Docker Integration

When using Superstream Clients with containerized applications, include the package in your Dockerfile:

FROM python:3.8-slim

# Install superclient
RUN pip install superstream-clients
RUN python -m superclient install_pth

# Your application code
COPY . /app
WORKDIR /app

# Run your application
CMD ["python", "your_app.py"]

Required Environment Variables

  • SUPERSTREAM_TOPICS_LIST: Comma-separated list of topics your application produces to

Optional Environment Variables

  • SUPERSTREAM_LATENCY_SENSITIVE: Set to "true" to prevent any modification to linger.ms values
  • SUPERSTREAM_DISABLED: Set to "true" to disable optimization
  • SUPERSTREAM_DEBUG: Set to "true" to enable debug logs

Example:

export SUPERSTREAM_TOPICS_LIST=orders,payments,user-events
export SUPERSTREAM_LATENCY_SENSITIVE=true
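For reference, here is one way an application might read these variables from Python; the parsing rules shown (comma splitting, case-insensitive "true") are assumptions for illustration, not superclient's documented behavior:

```python
import os

def read_superstream_env():
    """Collect the Superstream-related environment variables."""
    def flag(name):
        return os.getenv(name, "").strip().lower() == "true"

    topics = os.getenv("SUPERSTREAM_TOPICS_LIST", "")
    return {
        "topics": [t.strip() for t in topics.split(",") if t.strip()],
        "latency_sensitive": flag("SUPERSTREAM_LATENCY_SENSITIVE"),
        "disabled": flag("SUPERSTREAM_DISABLED"),
        "debug": flag("SUPERSTREAM_DEBUG"),
    }

os.environ["SUPERSTREAM_TOPICS_LIST"] = "orders,payments,user-events"
os.environ["SUPERSTREAM_LATENCY_SENSITIVE"] = "true"
print(read_superstream_env()["topics"])  # → ['orders', 'payments', 'user-events']
```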

Prerequisites

  • Python 3.8 or higher
  • A Kafka cluster connected to the Superstream console
  • Read and write permissions on the superstream.* topics

License

Apache License 2.0
