Confluent's Python Client for Apache Kafka™

confluent-kafka-python provides a high-level Producer, Consumer and AdminClient compatible with all Apache Kafka™ brokers >= v0.8, Confluent Cloud and Confluent Platform.

Recommended for Production: While this client works with any Kafka deployment, it's optimized for and fully supported with Confluent Cloud (fully managed) and Confluent Platform (self-managed), which provide enterprise-grade security, monitoring, and support.

Why Choose Confluent's Python Client?

Unlike the basic Apache Kafka Python client, confluent-kafka-python provides:

  • Production-Ready Performance: Built on librdkafka (C library) for maximum throughput and minimal latency, significantly outperforming pure Python implementations.
  • Enterprise Features: Schema Registry integration, transactions, exactly-once semantics, and advanced serialization support out of the box.
  • AsyncIO Support: Native async/await support for modern Python applications - not available in the Apache Kafka client.
  • Comprehensive Serialization: Built-in Avro, Protobuf, and JSON Schema support with automatic schema evolution handling.
  • Professional Support: Backed by Confluent's engineering team with enterprise SLAs and 24/7 support options.
  • Active Development: Continuously updated with the latest Kafka features and performance optimizations.
  • Battle-Tested: Used by thousands of organizations in production, from startups to Fortune 500 companies.

Performance Note: The Apache Kafka Python client (kafka-python) is a pure Python implementation that, while functional, has significant performance limitations for high-throughput production use cases. confluent-kafka-python leverages the same high-performance C library (librdkafka) used by Confluent's other clients, providing enterprise-grade performance and reliability.

Key Features

  • High Performance & Reliability: Built on librdkafka, the battle-tested C client for Apache Kafka, ensuring maximum throughput, low latency, and stability. The client is supported by Confluent and is trusted in mission-critical production environments.
  • Comprehensive Kafka Support: Full support for the Kafka protocol, transactions, and administration APIs.
  • Experimental AsyncIO Producer: An experimental, fully asynchronous producer (AIOProducer) for seamless integration with modern Python applications using asyncio.
  • Seamless Schema Registry Integration: Synchronous and asynchronous clients for Confluent Schema Registry to handle schema management and serialization (Avro, Protobuf, JSON Schema).
  • Improved Error Handling: Detailed, context-aware error messages and exceptions to speed up debugging and troubleshooting.
  • [Confluent Cloud] Automatic Zone Detection: Producers automatically connect to brokers in the same availability zone, reducing latency and data transfer costs without requiring manual configuration.
  • [Confluent Cloud] Simplified Configuration Profiles: Pre-defined configuration profiles optimized for common use cases like high throughput or low latency, simplifying client setup.
  • Enterprise Support: Backed by Confluent's expert support team with SLAs and 24/7 assistance for production deployments.

Usage

For a step-by-step guide on using the client, see Getting Started with Apache Kafka and Python.

Choosing Your Kafka Deployment

  • Confluent Cloud - Fully managed service with automatic scaling, security, and monitoring. Best for teams wanting to focus on applications rather than infrastructure.
  • Confluent Platform - Self-managed deployment with enterprise features, support, and tooling. Ideal for on-premises or hybrid cloud requirements.
  • Apache Kafka - Open source deployment. Requires manual setup, monitoring, and maintenance.

Additional examples can be found in the examples directory or the confluentinc/examples GitHub repo, which include demonstrations of:

  • Exactly once data processing using the transactional API.
  • Integration with asyncio.
  • (De)serializing Protobuf, JSON, and Avro data with Confluent Schema Registry integration.
  • Confluent Cloud configuration.

Also see the Python client docs and the API reference.

Finally, the tests are useful as a reference for example usage.

AsyncIO Producer (experimental)

Use the AsyncIO Producer inside async applications to avoid blocking the event loop.

import asyncio
from confluent_kafka.experimental.aio import AIOProducer

async def main():
    p = AIOProducer({"bootstrap.servers": "mybroker"})
    try:
        # produce() returns a Future; first await the coroutine to get the Future,
        # then await the Future to get the delivered Message.
        delivery_future = await p.produce("mytopic", value=b"hello")
        delivered_msg = await delivery_future
        # Optionally flush any remaining buffered messages before shutdown
        await p.flush()
    finally:
        await p.close()

asyncio.run(main())

Notes:

  • Batched async produce buffers messages; delivery callbacks, stats, errors, and logger run on the event loop.
  • Per-message headers are not supported in the batched async path. If headers are required, use the synchronous Producer.produce(...) (you can offload to a thread in async apps).

For a more detailed example that includes both an async producer and consumer, see examples/asyncio_example.py.

Architecture: For implementation details and component architecture, see the AIOProducer Architecture Overview.

When to use AsyncIO vs synchronous Producer

  • Use AsyncIO Producer when your code runs under an event loop (FastAPI/Starlette, aiohttp, Sanic, asyncio workers) and must not block.
  • Use synchronous Producer for scripts, batch jobs, and highest-throughput pipelines where you control threads/processes and can call poll()/flush() directly.
  • In async servers, prefer AsyncIO Producer; if you need headers, call sync produce() via run_in_executor for that path.
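As a sketch of the last point, here is one way to offload a synchronous produce() call that needs per-message headers to a worker thread so the event loop is never blocked. The helpers `produce_with_headers` and `send_with_headers` are hypothetical (not part of the client API), and the topic and header names are placeholders:

```python
import asyncio

# Hypothetical helpers (not part of confluent-kafka): run the synchronous
# produce() call, which supports per-message headers, in a worker thread.

def produce_with_headers(producer, topic, value, headers):
    # Runs in a worker thread; poll(0) serves any pending delivery callbacks.
    producer.produce(topic, value=value, headers=headers)
    producer.poll(0)

async def send_with_headers(producer, topic, value, headers):
    loop = asyncio.get_running_loop()
    await loop.run_in_executor(
        None, produce_with_headers, producer, topic, value, headers)
```

Pass a regular confluent_kafka.Producer instance; the event-loop thread only schedules the work.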

AsyncIO with Schema Registry

The AsyncIO producer and consumer integrate seamlessly with async Schema Registry serializers. See the Schema Registry Integration section below for full details.

Basic Producer example

from confluent_kafka import Producer

p = Producer({'bootstrap.servers': 'mybroker1,mybroker2'})

def delivery_report(err, msg):
    """ Called once for each message produced to indicate delivery result.
        Triggered by poll() or flush(). """
    if err is not None:
        print('Message delivery failed: {}'.format(err))
    else:
        print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))

for data in some_data_source:
    # Trigger any available delivery report callbacks from previous produce() calls
    p.poll(0)

    # Asynchronously produce a message. The delivery report callback will
    # be triggered from the call to poll() above, or flush() below, when the
    # message has been successfully delivered or failed permanently.
    p.produce('mytopic', data.encode('utf-8'), callback=delivery_report)

# Wait for any outstanding messages to be delivered and delivery report
# callbacks to be triggered.
p.flush()

For a discussion of the poll-based producer API, refer to the Integrating Apache Kafka With Python Asyncio Web Applications blog post.

Schema Registry Integration

This client provides full integration with Schema Registry for schema management and message serialization, and is compatible with both Confluent Platform and Confluent Cloud. Both synchronous and asynchronous clients are available.

Learn more

Synchronous Client & Serializers

Use the synchronous SchemaRegistryClient with the standard Producer and Consumer.

from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import StringSerializer, SerializationContext, MessageField

# 1. Configure Schema Registry client
schema_registry_conf = {'url': 'http://localhost:8081'}  # Confluent Platform
# For Confluent Cloud, add: 'basic.auth.user.info': '<sr-api-key>:<sr-api-secret>'
# See: https://docs.confluent.io/cloud/current/sr/index.html
schema_registry_client = SchemaRegistryClient(schema_registry_conf)

# 2. Configure AvroSerializer
avro_serializer = AvroSerializer(schema_registry_client,
                                 user_schema_str,
                                 lambda user, ctx: user.to_dict())

# 3. Configure Producer
producer = Producer({'bootstrap.servers': 'localhost:9092'})
string_serializer = StringSerializer('utf_8')

# 4. Serialize explicitly, then produce. The base Producer does not accept
# 'key.serializer'/'value.serializer' settings; call the serializers yourself.
producer.produce('my-topic',
                 key=string_serializer('user1'),
                 value=avro_serializer(some_user_object,
                                       SerializationContext('my-topic', MessageField.VALUE)))
producer.flush()

Asynchronous Client & Serializers (AsyncIO)

Use the AsyncSchemaRegistryClient and Async serializers with AIOProducer and AIOConsumer. The configuration is the same as the synchronous client.

from confluent_kafka.experimental.aio import AIOProducer
from confluent_kafka.schema_registry import AsyncSchemaRegistryClient
from confluent_kafka.schema_registry._async.avro import AsyncAvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

# Setup async Schema Registry client and serializer
# (See configuration options in the synchronous example above)
schema_registry_conf = {'url': 'http://localhost:8081'}
schema_client = AsyncSchemaRegistryClient(schema_registry_conf)
serializer = await AsyncAvroSerializer(schema_client, schema_str=avro_schema)

# Use with AsyncIO producer
producer = AIOProducer({"bootstrap.servers": "localhost:9092"})
serialized_value = await serializer(data, SerializationContext("topic", MessageField.VALUE))
delivery_future = await producer.produce("topic", value=serialized_value)

Available async serializers: AsyncAvroSerializer, AsyncJSONSerializer, AsyncProtobufSerializer (and corresponding deserializers).

See also:

Import paths

from confluent_kafka.schema_registry._async.avro import AsyncAvroSerializer, AsyncAvroDeserializer
from confluent_kafka.schema_registry._async.json_schema import AsyncJSONSerializer, AsyncJSONDeserializer
from confluent_kafka.schema_registry._async.protobuf import AsyncProtobufSerializer, AsyncProtobufDeserializer

Client-Side Field Level Encryption (CSFLE): To use Data Contracts rules (including CSFLE), install the rules extra (see Install section), and refer to the encryption examples in examples/README.md. For CSFLE-specific guidance, see the Confluent Cloud CSFLE documentation.

Note: The async Schema Registry interface mirrors the synchronous client: same configuration options and same calling patterns. Simply add await to method calls and use the Async-prefixed classes.

Troubleshooting

  • 401/403 Unauthorized when using Confluent Cloud: Verify your basic.auth.user.info (SR API key/secret) is correct and that the Schema Registry URL is for your specific cluster. Ensure you are using an SR API key, not a Kafka API key.
  • Schema not found: Check that your subject.name.strategy configuration matches how your schemas are registered in Schema Registry, and that the topic and message field (key/value) pairing is correct.
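To illustrate the second point, the serializers accept a subject-name strategy callable in their configuration. The sketch below defines a hypothetical custom strategy (the function name and subject format are illustrative, not part of the client); verify the exact configuration key, usually 'subject.name.strategy', against your client version:

```python
# Hypothetical custom subject-name strategy (illustrative only): a callable
# that takes the serialization context and the record name, and returns the
# subject under which the schema is registered and looked up.

def topic_record_subject(ctx, record_name):
    # e.g. topic "orders" + record "User" -> subject "orders-User"
    return "{}-{}".format(ctx.topic, record_name)
```

With the Avro serializer this would be supplied as conf={'subject.name.strategy': topic_record_subject}; the subject it produces must match how schemas were registered.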

Basic Consumer example

from confluent_kafka import Consumer

c = Consumer({
    'bootstrap.servers': 'mybroker',
    'group.id': 'mygroup',
    'auto.offset.reset': 'earliest'
})

c.subscribe(['mytopic'])

try:
    while True:
        msg = c.poll(1.0)

        if msg is None:
            continue
        if msg.error():
            print("Consumer error: {}".format(msg.error()))
            continue

        print('Received message: {}'.format(msg.value().decode('utf-8')))
except KeyboardInterrupt:
    pass
finally:
    # Close the consumer to commit final offsets and leave the group cleanly.
    c.close()

Basic AdminClient example

Create topics:

from confluent_kafka.admin import AdminClient, NewTopic

a = AdminClient({'bootstrap.servers': 'mybroker'})

new_topics = [NewTopic(topic, num_partitions=3, replication_factor=1) for topic in ["topic1", "topic2"]]
# Note: In a production cluster, it is more typical to use a replication_factor of 3 for durability.

# Call create_topics to asynchronously create topics. A dict
# of <topic,future> is returned.
fs = a.create_topics(new_topics)

# Wait for each operation to finish.
for topic, f in fs.items():
    try:
        f.result()  # The result itself is None
        print("Topic {} created".format(topic))
    except Exception as e:
        print("Failed to create topic {}: {}".format(topic, e))
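The same {topic: future} pattern applies to other AdminClient operations such as delete_topics() or alter_configs(). As a sketch, a small helper like the hypothetical collect_results below can gather per-topic outcomes from any of these calls:

```python
def collect_results(futures):
    # futures: the {topic: future} dict returned by AdminClient operations
    # such as create_topics() or delete_topics().
    outcomes = {}
    for topic, f in futures.items():
        try:
            f.result()  # None on success
            outcomes[topic] = "ok"
        except Exception as e:
            outcomes[topic] = "failed: {}".format(e)
    return outcomes
```

For example, `collect_results(a.create_topics(new_topics))` returns a plain dict you can log or assert on.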

Thread safety

The Producer, Consumer, and AdminClient are all thread safe.
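Thread safety means a single Producer can be shared across threads, but something still has to call poll() regularly to serve delivery callbacks. One common pattern, shown here as a hypothetical wrapper (PollingProducer is not part of the client), is a dedicated background poll thread:

```python
import threading

class PollingProducer:
    """Sketch: share one producer across threads, with a background
    thread serving delivery callbacks via poll()."""

    def __init__(self, producer):
        self._producer = producer
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._poll_loop, daemon=True)
        self._thread.start()

    def _poll_loop(self):
        # Serve delivery callbacks until close() is called.
        while not self._stop.is_set():
            self._producer.poll(0.1)

    def produce(self, topic, value, **kwargs):
        # Safe to call from any thread; produce() itself is thread safe.
        self._producer.produce(topic, value=value, **kwargs)

    def close(self):
        self._stop.set()
        self._thread.join()
        # Deliver any buffered messages before shutdown.
        self._producer.flush()
```

Construct it with a regular confluent_kafka.Producer; application threads then call produce() without worrying about the poll loop.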

Install

# Basic installation
pip install confluent-kafka

# With Schema Registry support
pip install "confluent-kafka[avro,schemaregistry]"     # Avro
pip install "confluent-kafka[json,schemaregistry]"     # JSON Schema  
pip install "confluent-kafka[protobuf,schemaregistry]" # Protobuf

# With Data Contract rules (includes CSFLE support)
pip install "confluent-kafka[avro,schemaregistry,rules]"

Note: Pre-built Linux wheels do not include SASL Kerberos/GSSAPI support. For Kerberos, see the source installation instructions in INSTALL.md.

Install from source

For source install, see the Install from source section in INSTALL.md.

Broker compatibility

The Python client (as well as the underlying C library, librdkafka) supports all broker versions >= 0.8. With broker versions 0.8 and 0.9, however, the Kafka protocol gives a client no safe way to discover which protocol version the broker actually supports, so you must hint the Python client with two configuration settings:

  • broker.version.fallback=YOUR_BROKER_VERSION (default 0.9.0.1)
  • api.version.request=true|false (default true)

When using a Kafka 0.10 broker or later, no configuration is needed; the protocol version is negotiated automatically.
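As a sketch, a client configuration for a legacy 0.9 broker would combine the two settings above (the broker address is a placeholder):

```python
# Only needed for brokers older than 0.10: disable the version request and
# tell the client which protocol version it may safely use.
legacy_conf = {
    'bootstrap.servers': 'mybroker',
    'api.version.request': False,
    'broker.version.fallback': '0.9.0.1',
}
```

This dict can be passed unchanged to Producer, Consumer, or AdminClient.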
