
Confluent's Python client for Apache Kafka


confluent-kafka-python provides a high-level Producer, Consumer and AdminClient compatible with all Apache Kafka™ brokers >= v0.8, Confluent Cloud and Confluent Platform.

Recommended for Production: While this client works with any Kafka deployment, it's optimized for and fully supported with Confluent Cloud (fully managed) and Confluent Platform (self-managed), which provide enterprise-grade security, monitoring, and support.

Why Choose Confluent's Python Client?

Unlike the basic Apache Kafka Python client, confluent-kafka-python provides:

  • Production-Ready Performance: Built on librdkafka (C library) for maximum throughput and minimal latency, significantly outperforming pure Python implementations.
  • Enterprise Features: Schema Registry integration, transactions, exactly-once semantics, and advanced serialization support out of the box.
  • AsyncIO Support: Native async/await support for modern Python applications - not available in the Apache Kafka client.
  • Comprehensive Serialization: Built-in Avro, Protobuf, and JSON Schema support with automatic schema evolution handling.
  • Professional Support: Backed by Confluent's engineering team with enterprise SLAs and 24/7 support options.
  • Active Development: Continuously updated with the latest Kafka features and performance optimizations.
  • Battle-Tested: Used by thousands of organizations in production, from startups to Fortune 500 companies.

Performance Note: The Apache Kafka Python client (kafka-python) is a pure Python implementation that, while functional, has significant performance limitations for high-throughput production use cases. confluent-kafka-python leverages the same high-performance C library (librdkafka) used by Confluent's other clients, providing enterprise-grade performance and reliability.

Key Features

  • High Performance & Reliability: Built on librdkafka, the battle-tested C client for Apache Kafka, ensuring maximum throughput, low latency, and stability. The client is supported by Confluent and is trusted in mission-critical production environments.
  • Comprehensive Kafka Support: Full support for the Kafka protocol, transactions, and administration APIs.
  • AsyncIO Producer: A fully asynchronous producer (AIOProducer) for seamless integration with modern Python applications using asyncio.
  • Seamless Schema Registry Integration: Synchronous and asynchronous clients for Confluent Schema Registry to handle schema management and serialization (Avro, Protobuf, JSON Schema).
  • Improved Error Handling: Detailed, context-aware error messages and exceptions to speed up debugging and troubleshooting.
  • [Confluent Cloud] Automatic Zone Detection: Producers automatically connect to brokers in the same availability zone, reducing latency and data transfer costs without requiring manual configuration.
  • [Confluent Cloud] Simplified Configuration Profiles: Pre-defined configuration profiles optimized for common use cases like high throughput or low latency, simplifying client setup.
  • Enterprise Support: Backed by Confluent's expert support team with SLAs and 24/7 assistance for production deployments.

Usage

For a step-by-step guide on using the client, see Getting Started with Apache Kafka and Python.

Choosing Your Kafka Deployment

  • Confluent Cloud - Fully managed service with automatic scaling, security, and monitoring. Best for teams wanting to focus on applications rather than infrastructure.
  • Confluent Platform - Self-managed deployment with enterprise features, support, and tooling. Ideal for on-premises or hybrid cloud requirements.
  • Apache Kafka - Open source deployment. Requires manual setup, monitoring, and maintenance.

Additional examples can be found in the examples directory or the confluentinc/examples GitHub repo, which include demonstrations of:

  • Exactly once data processing using the transactional API.
  • Integration with asyncio.
  • (De)serializing Protobuf, JSON, and Avro data with Confluent Schema Registry integration.
  • Confluent Cloud configuration.

Also see the Python client docs and the API reference.

Finally, the tests are useful as a reference for example usage.

AsyncIO Producer (experimental)

Use the AsyncIO Producer inside async applications to avoid blocking the event loop.

import asyncio
from confluent_kafka.aio import AIOProducer

async def main():
    p = AIOProducer({"bootstrap.servers": "mybroker"})
    try:
        # produce() returns a Future; first await the coroutine to get the Future,
        # then await the Future to get the delivered Message.
        delivery_future = await p.produce("mytopic", value=b"hello")
        delivered_msg = await delivery_future
        # Optionally flush any remaining buffered messages before shutdown
        await p.flush()
    finally:
        await p.close()

asyncio.run(main())

Notes:

  • Batched async produce buffers messages; delivery callbacks, stats, errors, and logger run on the event loop.
  • Per-message headers are not supported in the batched async path. If headers are required, use the synchronous Producer.produce(...) (you can offload to a thread in async apps).

For a more detailed example that includes both an async producer and consumer, see examples/asyncio_example.py.

Architecture: For implementation details and component architecture, see the AIOProducer Architecture Overview.

When to use AsyncIO vs synchronous Producer

  • Use AsyncIO Producer when your code runs under an event loop (FastAPI/Starlette, aiohttp, Sanic, asyncio workers) and must not block.
  • Use synchronous Producer for scripts, batch jobs, and highest-throughput pipelines where you control threads/processes and can call poll()/flush() directly.
  • In async servers, prefer AsyncIO Producer; if you need headers, call sync produce() via run_in_executor for that path.

AsyncIO with Schema Registry

The AsyncIO producer and consumer integrate seamlessly with async Schema Registry serializers. See the Schema Registry Integration section below for full details.

Migration Note: If you previously used custom AsyncIO wrappers, you can now migrate to the official AIOProducer which handles thread pool management, callback scheduling, and cleanup automatically. See the blog post for migration guidance.

Basic Producer example

from confluent_kafka import Producer

p = Producer({'bootstrap.servers': 'mybroker1,mybroker2'})

def delivery_report(err, msg):
    """ Called once for each message produced to indicate delivery result.
        Triggered by poll() or flush(). """
    if err is not None:
        print('Message delivery failed: {}'.format(err))
    else:
        print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))

for data in some_data_source:
    # Trigger any available delivery report callbacks from previous produce() calls
    p.poll(0)

    # Asynchronously produce a message. The delivery report callback will
    # be triggered from the call to poll() above, or flush() below, when the
    # message has been successfully delivered or failed permanently.
    p.produce('mytopic', data.encode('utf-8'), callback=delivery_report)

# Wait for any outstanding messages to be delivered and delivery report
# callbacks to be triggered.
p.flush()

For a discussion of the poll-based producer API, refer to the Integrating Apache Kafka With Python Asyncio Web Applications blog post.

Schema Registry Integration

This client provides full integration with Schema Registry for schema management and message serialization, and is compatible with both Confluent Platform and Confluent Cloud. Both synchronous and asynchronous clients are available.

Learn more

Synchronous Client & Serializers

Use the synchronous SchemaRegistryClient with the standard Producer and Consumer.

from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import StringSerializer, SerializationContext, MessageField

# 1. Configure Schema Registry client
schema_registry_conf = {'url': 'http://localhost:8081'}  # Confluent Platform
# For Confluent Cloud, add: 'basic.auth.user.info': '<sr-api-key>:<sr-api-secret>'
# See: https://docs.confluent.io/cloud/current/sr/index.html
schema_registry_client = SchemaRegistryClient(schema_registry_conf)

# 2. Configure serializers
string_serializer = StringSerializer('utf_8')
avro_serializer = AvroSerializer(schema_registry_client,
                                 user_schema_str,
                                 lambda user, ctx: user.to_dict())

# 3. Configure Producer. Serialization is applied explicitly below;
# the base Producer does not accept serializer configuration properties.
producer = Producer({'bootstrap.servers': 'localhost:9092'})

# 4. Serialize and produce messages
topic = 'my-topic'
producer.produce(topic,
                 key=string_serializer('user1'),
                 value=avro_serializer(some_user_object,
                                       SerializationContext(topic, MessageField.VALUE)))
producer.flush()

Asynchronous Client & Serializers (AsyncIO)

Use the AsyncSchemaRegistryClient and Async serializers with AIOProducer and AIOConsumer. The configuration is the same as the synchronous client.

from confluent_kafka.aio import AIOProducer
from confluent_kafka.schema_registry import AsyncSchemaRegistryClient
from confluent_kafka.schema_registry._async.avro import AsyncAvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

# Setup async Schema Registry client and serializer
# (See configuration options in the synchronous example above)
schema_registry_conf = {'url': 'http://localhost:8081'}
schema_client = AsyncSchemaRegistryClient(schema_registry_conf)
serializer = await AsyncAvroSerializer(schema_client, schema_str=avro_schema)

# Use with AsyncIO producer
producer = AIOProducer({"bootstrap.servers": "localhost:9092"})
serialized_value = await serializer(data, SerializationContext("topic", MessageField.VALUE))
delivery_future = await producer.produce("topic", value=serialized_value)

Available async serializers: AsyncAvroSerializer, AsyncJSONSerializer, AsyncProtobufSerializer (and corresponding deserializers).

See also:

Import paths

from confluent_kafka.schema_registry._async.avro import AsyncAvroSerializer, AsyncAvroDeserializer
from confluent_kafka.schema_registry._async.json_schema import AsyncJSONSerializer, AsyncJSONDeserializer
from confluent_kafka.schema_registry._async.protobuf import AsyncProtobufSerializer, AsyncProtobufDeserializer

Client-Side Field Level Encryption (CSFLE): To use Data Contracts rules (including CSFLE), install the rules extra (see Install section), and refer to the encryption examples in examples/README.md. For CSFLE-specific guidance, see the Confluent Cloud CSFLE documentation.

Note: The async Schema Registry interface mirrors the synchronous client: same configuration options and same calling patterns. Simply add await to method calls and use the Async-prefixed classes.

Troubleshooting

  • 401/403 Unauthorized when using Confluent Cloud: Verify your basic.auth.user.info (SR API key/secret) is correct and that the Schema Registry URL is for your specific cluster. Ensure you are using an SR API key, not a Kafka API key.
  • Schema not found: Check that your subject.name.strategy configuration matches how your schemas are registered in Schema Registry, and that the topic and message field (key/value) pairing is correct.
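For the Confluent Cloud case above, a hedged sketch of the Schema Registry client configuration (the endpoint and credentials are placeholders):

```python
# All values below are placeholders; use your cluster's Schema Registry
# endpoint and an SR API key/secret (not a Kafka API key).
schema_registry_conf = {
    'url': 'https://<your-sr-endpoint>.confluent.cloud',
    'basic.auth.user.info': '<sr-api-key>:<sr-api-secret>',
}
```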

Basic Consumer example

from confluent_kafka import Consumer

c = Consumer({
    'bootstrap.servers': 'mybroker',
    'group.id': 'mygroup',
    'auto.offset.reset': 'earliest'
})

c.subscribe(['mytopic'])

while True:
    msg = c.poll(1.0)

    if msg is None:
        continue
    if msg.error():
        print("Consumer error: {}".format(msg.error()))
        continue

    print('Received message: {}'.format(msg.value().decode('utf-8')))

c.close()

Basic AdminClient example

Create topics:

from confluent_kafka.admin import AdminClient, NewTopic

a = AdminClient({'bootstrap.servers': 'mybroker'})

new_topics = [NewTopic(topic, num_partitions=3, replication_factor=1) for topic in ["topic1", "topic2"]]
# Note: In a production cluster with multiple brokers, a replication_factor
# of 3 is typical for durability.

# Call create_topics to asynchronously create topics. A dict
# of <topic,future> is returned.
fs = a.create_topics(new_topics)

# Wait for each operation to finish.
for topic, f in fs.items():
    try:
        f.result()  # The result itself is None
        print("Topic {} created".format(topic))
    except Exception as e:
        print("Failed to create topic {}: {}".format(topic, e))

Thread safety

The Producer, Consumer, and AdminClient are all thread safe.

Install

# Basic installation
pip install confluent-kafka

# With Schema Registry support
pip install "confluent-kafka[avro,schemaregistry]"     # Avro
pip install "confluent-kafka[json,schemaregistry]"     # JSON Schema  
pip install "confluent-kafka[protobuf,schemaregistry]" # Protobuf

# With Data Contract rules (includes CSFLE support)
pip install "confluent-kafka[avro,schemaregistry,rules]"

Note: Pre-built Linux wheels do not include SASL Kerberos/GSSAPI support. For Kerberos, see the source installation instructions in INSTALL.md.

Install from source

For source install, see the Install from source section in INSTALL.md.

Broker compatibility

The Python client (and the underlying C library, librdkafka) supports all broker versions >= 0.8. However, with 0.8 and 0.9 brokers the client cannot safely determine which protocol version the broker actually supports, so you need to hint the Python client with two configuration settings:

  • broker.version.fallback=YOUR_BROKER_VERSION (default 0.9.0.1)
  • api.version.request=true|false (default true)

When using Kafka brokers 0.10 or later, no additional configuration is needed.
