Patched version of Confluent's Python client for Apache Kafka

Project description

Confluent's Python Client for Apache Kafka™

confluent-kafka-python provides a high-level Producer, Consumer and AdminClient compatible with all Apache Kafka™ brokers >= v0.8, Confluent Cloud and the Confluent Platform. The client is:

  • Reliable - It's a wrapper around librdkafka (provided automatically via binary wheels) which is widely deployed in a diverse set of production scenarios. It's tested using the same set of system tests as the Java client and more. It's supported by Confluent.

  • Performant - Performance is a key design consideration. Maximum throughput is on par with the Java client for larger message sizes (where the overhead of the Python interpreter has less impact). Latency is on par with the Java client.

  • Future proof - Confluent, founded by the creators of Kafka, is building a streaming platform with Apache Kafka at its core. It's high priority for us that client features keep pace with core Apache Kafka and components of the Confluent Platform.

See the API documentation for more info.

Usage

Below are some examples of typical usage. For more examples, see the examples directory, or the confluentinc/examples GitHub repo for a Confluent Cloud example.

Producer

from confluent_kafka import Producer


p = Producer({'bootstrap.servers': 'mybroker1,mybroker2'})

def delivery_report(err, msg):
    """ Called once for each message produced to indicate delivery result.
        Triggered by poll() or flush(). """
    if err is not None:
        print('Message delivery failed: {}'.format(err))
    else:
        print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))

for data in some_data_source:
    # Trigger any available delivery report callbacks from previous produce() calls
    p.poll(0)

    # Asynchronously produce a message, the delivery report callback
    # will be triggered from poll() above, or flush() below, when the message has
    # been successfully delivered or failed permanently.
    p.produce('mytopic', data.encode('utf-8'), callback=delivery_report)

# Wait for any outstanding messages to be delivered and delivery report
# callbacks to be triggered.
p.flush()

High-level Consumer

from confluent_kafka import Consumer


c = Consumer({
    'bootstrap.servers': 'mybroker',
    'group.id': 'mygroup',
    'auto.offset.reset': 'earliest'
})

c.subscribe(['mytopic'])

while True:
    msg = c.poll(1.0)

    if msg is None:
        continue
    if msg.error():
        print("Consumer error: {}".format(msg.error()))
        continue

    print('Received message: {}'.format(msg.value().decode('utf-8')))

c.close()

AvroProducer

from confluent_kafka import avro
from confluent_kafka.avro import AvroProducer


value_schema_str = """
{
   "namespace": "my.test",
   "name": "value",
   "type": "record",
   "fields" : [
     {
       "name" : "name",
       "type" : "string"
     }
   ]
}
"""

key_schema_str = """
{
   "namespace": "my.test",
   "name": "key",
   "type": "record",
   "fields" : [
     {
       "name" : "name",
       "type" : "string"
     }
   ]
}
"""

value_schema = avro.loads(value_schema_str)
key_schema = avro.loads(key_schema_str)
value = {"name": "Value"}
key = {"name": "Key"}


def delivery_report(err, msg):
    """ Called once for each message produced to indicate delivery result.
        Triggered by poll() or flush(). """
    if err is not None:
        print('Message delivery failed: {}'.format(err))
    else:
        print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))


avroProducer = AvroProducer({
    'bootstrap.servers': 'mybroker,mybroker2',
    'on_delivery': delivery_report,
    'schema.registry.url': 'http://schema_registry_host:port'
    }, default_key_schema=key_schema, default_value_schema=value_schema)

avroProducer.produce(topic='my_topic', value=value, key=key)
avroProducer.flush()

AvroConsumer

from confluent_kafka.avro import AvroConsumer
from confluent_kafka.avro.serializer import SerializerError


c = AvroConsumer({
    'bootstrap.servers': 'mybroker,mybroker2',
    'group.id': 'groupid',
    'schema.registry.url': 'http://127.0.0.1:8081'})

c.subscribe(['my_topic'])

while True:
    try:
        msg = c.poll(10)

    except SerializerError as e:
        print("Message deserialization failed for {}: {}".format(msg, e))
        break

    if msg is None:
        continue

    if msg.error():
        print("AvroConsumer error: {}".format(msg.error()))
        continue

    print(msg.value())

c.close()

AdminClient

Create topics:

from confluent_kafka.admin import AdminClient, NewTopic

a = AdminClient({'bootstrap.servers': 'mybroker'})

new_topics = [NewTopic(topic, num_partitions=3, replication_factor=1) for topic in ["topic1", "topic2"]]
# Note: In a multi-cluster production scenario, it is more typical to use a replication_factor of 3 for durability.

# Call create_topics to asynchronously create topics. A dict
# of <topic,future> is returned.
fs = a.create_topics(new_topics)

# Wait for each operation to finish.
for topic, f in fs.items():
    try:
        f.result()  # The result itself is None
        print("Topic {} created".format(topic))
    except Exception as e:
        print("Failed to create topic {}: {}".format(topic, e))

Thread Safety

The Producer, Consumer and AdminClient are all thread safe.
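
For example, a single Producer instance can safely be shared by multiple threads. Below is a minimal sketch; the broker address, topic and thread count are placeholders:

from threading import Thread

from confluent_kafka import Producer


p = Producer({'bootstrap.servers': 'mybroker'})

def produce_many(worker_id):
    # produce() may be called concurrently from multiple threads
    # on the same Producer instance.
    for i in range(100):
        p.produce('mytopic', 'worker {} msg {}'.format(worker_id, i).encode('utf-8'))
    # Serve any delivery report callbacks that are due.
    p.poll(0)

threads = [Thread(target=produce_many, args=(n,)) for n in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Wait for all outstanding messages to be delivered.
p.flush()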

Install

Install self-contained binary wheels

$ pip install confluent-kafka

NOTE: The pre-built Linux wheels do NOT contain SASL Kerberos/GSSAPI support. If you need SASL Kerberos/GSSAPI support you must install librdkafka and its dependencies separately, then build confluent-kafka using the command in the "Install from source from PyPI" section below.
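
Once librdkafka has been built with SASL support, a Kerberos/GSSAPI client configuration typically looks like the following sketch; the broker address, principal and keytab path are placeholders for your environment:

from confluent_kafka import Producer


p = Producer({
    'bootstrap.servers': 'mybroker:9093',
    'security.protocol': 'SASL_SSL',
    'sasl.mechanism': 'GSSAPI',
    'sasl.kerberos.service.name': 'kafka',
    # Placeholder principal and keytab; adjust to your environment.
    'sasl.kerberos.principal': 'myclient@EXAMPLE.COM',
    'sasl.kerberos.keytab': '/etc/security/keytabs/myclient.keytab'})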

Install AvroProducer and AvroConsumer

$ pip install "confluent-kafka[avro]"

Install from source from PyPI (requires librdkafka + dependencies to be installed separately):

$ pip install --no-binary :all: confluent-kafka

For source install, see Prerequisites below.

Broker Compatibility

The Python client (as well as the underlying C library, librdkafka) supports all broker versions >= 0.8. However, due to the nature of the Kafka protocol in broker versions 0.8 and 0.9, it is not safe for a client to assume which protocol version the broker actually supports, so you will need to hint the Python client which protocol version it may use. This is done through two configuration settings:

  • broker.version.fallback=YOUR_BROKER_VERSION (default 0.9.0.1)
  • api.version.request=true|false (default true)

When using a Kafka 0.10 broker or later you don't need to do anything (api.version.request=true is the default). If you use a Kafka 0.9 or 0.8 broker you must set api.version.request=false and set broker.version.fallback to your broker version, e.g. broker.version.fallback=0.9.0.1.
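
For example, a consumer connecting to a 0.9.0.1 broker could be configured as follows (a sketch; the broker address, group id and exact fallback version are placeholders):

from confluent_kafka import Consumer


c = Consumer({
    'bootstrap.servers': 'mybroker',
    'group.id': 'mygroup',
    # Required for 0.8/0.9 brokers, which cannot be probed for
    # their supported protocol version:
    'api.version.request': False,
    'broker.version.fallback': '0.9.0.1'})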

More info here: https://github.com/edenhill/librdkafka/wiki/Broker-version-compatibility

SSL certificates

If you're connecting to a Kafka cluster through SSL, you will need to configure the client with 'security.protocol': 'SSL' (or 'SASL_SSL' if SASL authentication is used).

The client will use CA certificates to verify the broker's certificate. The embedded OpenSSL library will look for CA certificates in /usr/lib/ssl/certs/ or /usr/lib/ssl/cacert.pem. CA certificates are typically provided by the Linux distribution's ca-certificates package, which needs to be installed through apt, yum, et al.

If your system stores CA certificates in another location you will need to configure the client with 'ssl.ca.location': '/path/to/cacert.pem'.

Alternatively, the CA certificates can be provided by the certifi Python package. To use certifi, add an import certifi line and configure the client's CA location with 'ssl.ca.location': certifi.where().
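
Putting this together, a client that verifies the broker's certificate against certifi's CA bundle might look like this sketch (the broker address is a placeholder):

import certifi

from confluent_kafka import Producer


p = Producer({
    'bootstrap.servers': 'mybroker:9093',
    'security.protocol': 'SSL',
    # Use certifi's CA bundle instead of the system certificate store.
    'ssl.ca.location': certifi.where()})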

Prerequisites

  • Python >= 2.7 or Python 3.x
  • librdkafka >= 1.6.0 (latest release is embedded in wheels)

librdkafka is embedded in the macOS and manylinux wheels. For other platforms, for SASL Kerberos/GSSAPI support, or when a specific version of librdkafka is desired, librdkafka and its dependencies must be installed separately before building confluent-kafka from source, as described in the Install section above.

License

Apache License v2.0

KAFKA is a registered trademark of The Apache Software Foundation and has been licensed for use by confluent-kafka-python. confluent-kafka-python has no affiliation with and is not endorsed by The Apache Software Foundation.

Developer Notes

Instructions on building and testing confluent-kafka-python can be found here.


Download files

Download the file for your platform.

Source Distribution

pup-confluent-kafka-1.7.1.tar.gz (107.9 kB)

Uploaded Source

Built Distributions


pup_confluent_kafka-1.7.1-cp310-cp310-manylinux2010_x86_64.whl (10.9 MB)

Uploaded CPython 3.10, manylinux: glibc 2.12+ x86-64

pup_confluent_kafka-1.7.1-cp39-cp39-manylinux2010_x86_64.whl (10.9 MB)

Uploaded CPython 3.9, manylinux: glibc 2.12+ x86-64

pup_confluent_kafka-1.7.1-cp39-cp39-macosx_11_0_x86_64.whl (125.5 kB)

Uploaded CPython 3.9, macOS 11.0+ x86-64

pup_confluent_kafka-1.7.1-cp38-cp38-manylinux2010_x86_64.whl (10.9 MB)

Uploaded CPython 3.8, manylinux: glibc 2.12+ x86-64

pup_confluent_kafka-1.7.1-cp37-cp37m-manylinux2010_x86_64.whl (10.9 MB)

Uploaded CPython 3.7m, manylinux: glibc 2.12+ x86-64

pup_confluent_kafka-1.7.1-cp37-cp37m-macosx_10_9_x86_64.whl (125.0 kB)

Uploaded CPython 3.7m, macOS 10.9+ x86-64

pup_confluent_kafka-1.7.1-cp36-cp36m-manylinux2010_x86_64.whl (10.9 MB)

Uploaded CPython 3.6m, manylinux: glibc 2.12+ x86-64

File details

Details for the file pup-confluent-kafka-1.7.1.tar.gz.

File metadata

  • Download URL: pup-confluent-kafka-1.7.1.tar.gz
  • Upload date:
  • Size: 107.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.8.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.6

File hashes

Hashes for pup-confluent-kafka-1.7.1.tar.gz
Algorithm Hash digest
SHA256 18b11696abcdaedb08447818e29442d83207cecbbe6eb3aec82351d2bad77f32
MD5 2fcc9515aa6115dabbea7b89acd97c86
BLAKE2b-256 ad4f0663517a9be1fd5c20ef16e24df799d6b103fe34acae86c1c693bff7e7ea


File details

Details for the file pup_confluent_kafka-1.7.1-cp310-cp310-manylinux2010_x86_64.whl.

File metadata

  • Download URL: pup_confluent_kafka-1.7.1-cp310-cp310-manylinux2010_x86_64.whl
  • Upload date:
  • Size: 10.9 MB
  • Tags: CPython 3.10, manylinux: glibc 2.12+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.8.1 requests/2.22.0 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.7.4

File hashes

Hashes for pup_confluent_kafka-1.7.1-cp310-cp310-manylinux2010_x86_64.whl
Algorithm Hash digest
SHA256 a2149aba40cd31682e4c289b42596e589b76fbc3523bb4fbd02c0087ddb68ede
MD5 8b2565fe1c1cd039f3f80d035755d7c9
BLAKE2b-256 2426a08237f4f3af461095eb127c645191a3d30c5e2ef837c321b5704b6c68ec


File details

Details for the file pup_confluent_kafka-1.7.1-cp39-cp39-manylinux2010_x86_64.whl.

File metadata

  • Download URL: pup_confluent_kafka-1.7.1-cp39-cp39-manylinux2010_x86_64.whl
  • Upload date:
  • Size: 10.9 MB
  • Tags: CPython 3.9, manylinux: glibc 2.12+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.8.1 requests/2.22.0 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.7.4

File hashes

Hashes for pup_confluent_kafka-1.7.1-cp39-cp39-manylinux2010_x86_64.whl
Algorithm Hash digest
SHA256 08262f922c8676058f0af02fb7811dbd0fc3e5c22cd092ce1ab5565da9d1acad
MD5 7519998677e53245070df11664a1a3d2
BLAKE2b-256 838ecce4b77a0b124fbe4887a49ef62a778dbb64fa71909b52834a863de77a34


File details

Details for the file pup_confluent_kafka-1.7.1-cp39-cp39-macosx_11_0_x86_64.whl.

File metadata

  • Download URL: pup_confluent_kafka-1.7.1-cp39-cp39-macosx_11_0_x86_64.whl
  • Upload date:
  • Size: 125.5 kB
  • Tags: CPython 3.9, macOS 11.0+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.8.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.6

File hashes

Hashes for pup_confluent_kafka-1.7.1-cp39-cp39-macosx_11_0_x86_64.whl
Algorithm Hash digest
SHA256 4041ae82b5ddfaddb1b80f0a0add3f28f15a09f7f2618d4dfcbad495c49ba6f1
MD5 4e3419a41800124a4c279c167c71cd7f
BLAKE2b-256 11af64546fec94dc429297fdcdf2dff43e909aee239b3d54005a50578f286b21


File details

Details for the file pup_confluent_kafka-1.7.1-cp38-cp38-manylinux2010_x86_64.whl.

File metadata

  • Download URL: pup_confluent_kafka-1.7.1-cp38-cp38-manylinux2010_x86_64.whl
  • Upload date:
  • Size: 10.9 MB
  • Tags: CPython 3.8, manylinux: glibc 2.12+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.8.1 requests/2.22.0 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.7.4

File hashes

Hashes for pup_confluent_kafka-1.7.1-cp38-cp38-manylinux2010_x86_64.whl
Algorithm Hash digest
SHA256 a612912e769b463949c2a481673d36d6130dfb1068169e620f883ed9e59bc135
MD5 a6e6d4df50e5df003a282c6ae8163763
BLAKE2b-256 ab55c852f5d115c04e3d7264ad0ab451299d6a611f9c226850b3403f03758082


File details

Details for the file pup_confluent_kafka-1.7.1-cp37-cp37m-manylinux2010_x86_64.whl.

File metadata

  • Download URL: pup_confluent_kafka-1.7.1-cp37-cp37m-manylinux2010_x86_64.whl
  • Upload date:
  • Size: 10.9 MB
  • Tags: CPython 3.7m, manylinux: glibc 2.12+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.8.1 requests/2.22.0 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.7.4

File hashes

Hashes for pup_confluent_kafka-1.7.1-cp37-cp37m-manylinux2010_x86_64.whl
Algorithm Hash digest
SHA256 31805c292f6ca08da118af066f269aa9f337b94d89b817eac7229034c48cce86
MD5 54391251e48df777e34b521e3f94c4c5
BLAKE2b-256 3623060e33f5e9de53552f98bc2fcefd57e2ecda7d68b8e8ea99ca428690b27e


File details

Details for the file pup_confluent_kafka-1.7.1-cp37-cp37m-macosx_10_9_x86_64.whl.

File metadata

  • Download URL: pup_confluent_kafka-1.7.1-cp37-cp37m-macosx_10_9_x86_64.whl
  • Upload date:
  • Size: 125.0 kB
  • Tags: CPython 3.7m, macOS 10.9+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.8.1 requests/2.22.0 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.7.4

File hashes

Hashes for pup_confluent_kafka-1.7.1-cp37-cp37m-macosx_10_9_x86_64.whl
Algorithm Hash digest
SHA256 7aee8df3862013aff929db94256a9f5174f8a34fe9ba6f4875ec924d1eb576dc
MD5 dfe7cfc03989495e1cfd81a16facbce0
BLAKE2b-256 7ed6e35d227dd3b80ae29985ebb9efb1f7e61d44aee527f088809ffdc3570e7e


File details

Details for the file pup_confluent_kafka-1.7.1-cp36-cp36m-manylinux2010_x86_64.whl.

File metadata

  • Download URL: pup_confluent_kafka-1.7.1-cp36-cp36m-manylinux2010_x86_64.whl
  • Upload date:
  • Size: 10.9 MB
  • Tags: CPython 3.6m, manylinux: glibc 2.12+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.6.0 importlib_metadata/4.8.2 pkginfo/1.8.1 requests/2.22.0 requests-toolbelt/0.9.1 tqdm/4.54.1 CPython/3.7.4

File hashes

Hashes for pup_confluent_kafka-1.7.1-cp36-cp36m-manylinux2010_x86_64.whl
Algorithm Hash digest
SHA256 1846fb87f12a43514072cd6f81769865314f74e5d82279db951e1ee7c203a461
MD5 22caef6e11ee165e73fcad15772039c0
BLAKE2b-256 7997e802d8f84bffd7cdd261eb15550b509e08d511bae5055985954c49c2d287

See more details on using hashes here.
