Confluent's Python Client for Apache Kafka™

confluent-kafka-python provides a high-level Producer, Consumer and AdminClient compatible with all Apache Kafka™ brokers >= v0.8, Confluent Cloud and Confluent Platform.

Recommended for Production: While this client works with any Kafka deployment, it's optimized for and fully supported with Confluent Cloud (fully managed) and Confluent Platform (self-managed), which provide enterprise-grade security, monitoring, and support.

Why Choose Confluent's Python Client?

Unlike the basic Apache Kafka Python client, confluent-kafka-python provides:

  • Production-Ready Performance: Built on librdkafka (C library) for maximum throughput and minimal latency, significantly outperforming pure Python implementations.
  • Enterprise Features: Schema Registry integration, transactions, exactly-once semantics, and advanced serialization support out of the box.
  • AsyncIO Support: Native async/await support for modern Python applications - not available in the Apache Kafka client.
  • Comprehensive Serialization: Built-in Avro, Protobuf, and JSON Schema support with automatic schema evolution handling.
  • Professional Support: Backed by Confluent's engineering team with enterprise SLAs and 24/7 support options.
  • Active Development: Continuously updated with the latest Kafka features and performance optimizations.
  • Battle-Tested: Used by thousands of organizations in production, from startups to Fortune 500 companies.

Performance Note: The Apache Kafka Python client (kafka-python) is a pure Python implementation that, while functional, has significant performance limitations for high-throughput production use cases. confluent-kafka-python leverages the same high-performance C library (librdkafka) used by Confluent's other clients, providing enterprise-grade performance and reliability.

Key Features

  • High Performance & Reliability: Built on librdkafka, the battle-tested C client for Apache Kafka, ensuring maximum throughput, low latency, and stability. The client is supported by Confluent and is trusted in mission-critical production environments.
  • Comprehensive Kafka Support: Full support for the Kafka protocol, transactions, and administration APIs.
  • Experimental AsyncIO Producer: An experimental, fully asynchronous producer (AIOProducer) for seamless integration with modern Python applications using asyncio.
  • Seamless Schema Registry Integration: Synchronous and asynchronous clients for Confluent Schema Registry to handle schema management and serialization (Avro, Protobuf, JSON Schema).
  • Improved Error Handling: Detailed, context-aware error messages and exceptions to speed up debugging and troubleshooting.
  • [Confluent Cloud] Automatic Zone Detection: Producers automatically connect to brokers in the same availability zone, reducing latency and data transfer costs without requiring manual configuration.
  • [Confluent Cloud] Simplified Configuration Profiles: Pre-defined configuration profiles optimized for common use cases like high throughput or low latency, simplifying client setup.
  • Enterprise Support: Backed by Confluent's expert support team with SLAs and 24/7 assistance for production deployments.

Usage

For a step-by-step guide on using the client, see Getting Started with Apache Kafka and Python.

Choosing Your Kafka Deployment

  • Confluent Cloud - Fully managed service with automatic scaling, security, and monitoring. Best for teams wanting to focus on applications rather than infrastructure.
  • Confluent Platform - Self-managed deployment with enterprise features, support, and tooling. Ideal for on-premises or hybrid cloud requirements.
  • Apache Kafka - Open source deployment. Requires manual setup, monitoring, and maintenance.

Additional examples can be found in the examples directory or the confluentinc/examples GitHub repo, including demonstrations of:

  • Exactly once data processing using the transactional API.
  • Integration with asyncio.
  • (De)serializing Protobuf, JSON, and Avro data with Confluent Schema Registry integration.
  • Confluent Cloud configuration.

Also see the Python client docs and the API reference.

Finally, the tests are useful as a reference for example usage.

AsyncIO Producer (experimental)

Use the AsyncIO Producer inside async applications to avoid blocking the event loop.

import asyncio
from confluent_kafka.experimental.aio import AIOProducer

async def main():
    p = AIOProducer({"bootstrap.servers": "mybroker"})
    try:
        # produce() returns a Future; first await the coroutine to get the Future,
        # then await the Future to get the delivered Message.
        delivery_future = await p.produce("mytopic", value=b"hello")
        delivered_msg = await delivery_future
        # Optionally flush any remaining buffered messages before shutdown
        await p.flush()
    finally:
        await p.close()

asyncio.run(main())

Notes:

  • Batched async produce buffers messages; delivery callbacks, statistics, errors, and log callbacks run on the event loop.
  • Per-message headers are not supported in the batched async path. If headers are required, use the synchronous Producer.produce(...) (you can offload to a thread in async apps).

For a more detailed example that includes both an async producer and consumer, see examples/asyncio_example.py.

Architecture: For implementation details and component architecture, see the AIOProducer Architecture Overview.

When to use AsyncIO vs synchronous Producer

  • Use AsyncIO Producer when your code runs under an event loop (FastAPI/Starlette, aiohttp, Sanic, asyncio workers) and must not block.
  • Use synchronous Producer for scripts, batch jobs, and highest-throughput pipelines where you control threads/processes and can call poll()/flush() directly.
  • In async servers, prefer AsyncIO Producer; if you need headers, call sync produce() via run_in_executor for that path.
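The offloading pattern mentioned above can be sketched with plain asyncio. Here, blocking_produce is a stand-in for a synchronous Producer.produce(..., headers=...) plus poll() call; any blocking function can be offloaded the same way without stalling the event loop:

```python
import asyncio

def blocking_produce(topic, value, headers):
    # Stand-in for the synchronous, blocking produce path; a real app would
    # call confluent_kafka.Producer.produce() with headers here, then poll().
    return "{}:{}:{} headers".format(topic, value.decode(), len(headers))

async def produce_with_headers(topic, value, headers):
    loop = asyncio.get_running_loop()
    # Run the blocking call in the default thread pool so the loop stays free.
    return await loop.run_in_executor(None, blocking_produce, topic, value, headers)

result = asyncio.run(produce_with_headers("mytopic", b"hello", [("trace-id", b"abc")]))
print(result)  # mytopic:hello:1 headers
```

The same shape works from any coroutine in a FastAPI or aiohttp handler.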

AsyncIO with Schema Registry

The AsyncIO producer and consumer integrate seamlessly with async Schema Registry serializers. See the Schema Registry Integration section below for full details.

Basic Producer example

from confluent_kafka import Producer

p = Producer({'bootstrap.servers': 'mybroker1,mybroker2'})

def delivery_report(err, msg):
    """ Called once for each message produced to indicate delivery result.
        Triggered by poll() or flush(). """
    if err is not None:
        print('Message delivery failed: {}'.format(err))
    else:
        print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))

for data in some_data_source:
    # Trigger any available delivery report callbacks from previous produce() calls
    p.poll(0)

    # Asynchronously produce a message. The delivery report callback will
    # be triggered from the call to poll() above, or flush() below, when the
    # message has been successfully delivered or failed permanently.
    p.produce('mytopic', data.encode('utf-8'), callback=delivery_report)

# Wait for any outstanding messages to be delivered and delivery report
# callbacks to be triggered.
p.flush()

For a discussion of the poll-based producer API, see the Integrating Apache Kafka With Python Asyncio Web Applications blog post.

Schema Registry Integration

This client provides full integration with Schema Registry for schema management and message serialization, and is compatible with both Confluent Platform and Confluent Cloud. Both synchronous and asynchronous clients are available.

Learn more

Synchronous Client & Serializers

Use the synchronous SchemaRegistryClient with the standard Producer and Consumer.

from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import StringSerializer, SerializationContext, MessageField

# 1. Configure Schema Registry client
schema_registry_conf = {'url': 'http://localhost:8081'}  # Confluent Platform
# For Confluent Cloud, add: 'basic.auth.user.info': '<sr-api-key>:<sr-api-secret>'
# See: https://docs.confluent.io/cloud/current/sr/index.html
schema_registry_client = SchemaRegistryClient(schema_registry_conf)

# 2. Configure AvroSerializer
avro_serializer = AvroSerializer(schema_registry_client,
                                 user_schema_str,
                                 lambda user, ctx: user.to_dict())

# 3. Configure Producer
producer_conf = {
    'bootstrap.servers': 'localhost:9092',
    'key.serializer': StringSerializer('utf_8'),
    'value.serializer': avro_serializer
}
producer = Producer(producer_conf)

# 4. Produce messages
producer.produce('my-topic', key='user1', value=some_user_object)
producer.flush()

Asynchronous Client & Serializers (AsyncIO)

Use the AsyncSchemaRegistryClient and Async serializers with AIOProducer and AIOConsumer. The configuration is the same as the synchronous client.

from confluent_kafka.experimental.aio import AIOProducer
from confluent_kafka.schema_registry import AsyncSchemaRegistryClient
from confluent_kafka.schema_registry._async.avro import AsyncAvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

# Setup async Schema Registry client and serializer
# (See configuration options in the synchronous example above)
schema_registry_conf = {'url': 'http://localhost:8081'}
schema_client = AsyncSchemaRegistryClient(schema_registry_conf)
serializer = await AsyncAvroSerializer(schema_client, schema_str=avro_schema)

# Use with AsyncIO producer
producer = AIOProducer({"bootstrap.servers": "localhost:9092"})
serialized_value = await serializer(data, SerializationContext("topic", MessageField.VALUE))
delivery_future = await producer.produce("topic", value=serialized_value)

Available async serializers: AsyncAvroSerializer, AsyncJSONSerializer, AsyncProtobufSerializer (and corresponding deserializers).

See also:

Import paths

from confluent_kafka.schema_registry._async.avro import AsyncAvroSerializer, AsyncAvroDeserializer
from confluent_kafka.schema_registry._async.json_schema import AsyncJSONSerializer, AsyncJSONDeserializer
from confluent_kafka.schema_registry._async.protobuf import AsyncProtobufSerializer, AsyncProtobufDeserializer

Client-Side Field Level Encryption (CSFLE): To use Data Contracts rules (including CSFLE), install the rules extra (see Install section), and refer to the encryption examples in examples/README.md. For CSFLE-specific guidance, see the Confluent Cloud CSFLE documentation.

Note: The async Schema Registry interface mirrors the synchronous client: same configuration options and same calling patterns. Simply add await to method calls and use the Async-prefixed classes.

Troubleshooting

  • 401/403 Unauthorized when using Confluent Cloud: Verify your basic.auth.user.info (SR API key/secret) is correct and that the Schema Registry URL is for your specific cluster. Ensure you are using an SR API key, not a Kafka API key.
  • Schema not found: Check that your subject.name.strategy configuration matches how your schemas are registered in Schema Registry, and that the topic and message field (key/value) pairing is correct.
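For the Confluent Cloud case above, the client configuration might look like the following (the URL is a placeholder; use the Schema Registry endpoint shown for your environment):

```python
# Hypothetical Confluent Cloud Schema Registry configuration.
schema_registry_conf = {
    'url': 'https://<sr-endpoint>.confluent.cloud',
    # Must be a Schema Registry API key/secret pair, not a Kafka API key:
    'basic.auth.user.info': '<sr-api-key>:<sr-api-secret>',
}
```

The same dict is accepted by both SchemaRegistryClient and AsyncSchemaRegistryClient.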

Basic Consumer example

from confluent_kafka import Consumer

c = Consumer({
    'bootstrap.servers': 'mybroker',
    'group.id': 'mygroup',
    'auto.offset.reset': 'earliest'
})

c.subscribe(['mytopic'])

try:
    while True:
        msg = c.poll(1.0)

        if msg is None:
            continue
        if msg.error():
            print("Consumer error: {}".format(msg.error()))
            continue

        print('Received message: {}'.format(msg.value().decode('utf-8')))
except KeyboardInterrupt:
    pass
finally:
    # Close the consumer to commit final offsets and leave the group cleanly.
    c.close()

Basic AdminClient example

Create topics:

from confluent_kafka.admin import AdminClient, NewTopic

a = AdminClient({'bootstrap.servers': 'mybroker'})

new_topics = [NewTopic(topic, num_partitions=3, replication_factor=1) for topic in ["topic1", "topic2"]]
# Note: In a production cluster with three or more brokers, a replication_factor of 3 is typical for durability.

# Call create_topics to asynchronously create topics. A dict
# of <topic,future> is returned.
fs = a.create_topics(new_topics)

# Wait for each operation to finish.
for topic, f in fs.items():
    try:
        f.result()  # The result itself is None
        print("Topic {} created".format(topic))
    except Exception as e:
        print("Failed to create topic {}: {}".format(topic, e))

Thread safety

The Producer, Consumer, and AdminClient are all thread safe.

Install

# Basic installation
pip install confluent-kafka

# With Schema Registry support
pip install "confluent-kafka[avro,schemaregistry]"     # Avro
pip install "confluent-kafka[json,schemaregistry]"     # JSON Schema  
pip install "confluent-kafka[protobuf,schemaregistry]" # Protobuf

# With Data Contract rules (includes CSFLE support)
pip install "confluent-kafka[avro,schemaregistry,rules]"

Note: Pre-built Linux wheels do not include SASL Kerberos/GSSAPI support. For Kerberos, see the source installation instructions in INSTALL.md.


Install from source

For source install, see the Install from source section in INSTALL.md.

Broker compatibility

The Python client (as well as the underlying C library, librdkafka) supports all broker versions >= 0.8. However, due to the nature of the Kafka protocol in broker versions 0.8 and 0.9, a client cannot safely determine which protocol version the broker actually supports, so you need to hint the Python client which protocol version it may use. This is done through two configuration settings:

  • broker.version.fallback=YOUR_BROKER_VERSION (default 0.9.0.1)
  • api.version.request=true|false (default true)

When using a Kafka 0.10 broker or later, you don't need to set either of these.
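Putting the two settings together, a configuration for a pre-0.10 broker might look like this (the broker version string is illustrative; use your actual broker version):

```python
# Only needed for brokers older than 0.10:
conf = {
    'bootstrap.servers': 'mybroker',
    'api.version.request': False,         # don't probe; old brokers can't answer
    'broker.version.fallback': '0.8.2.1'  # illustrative broker version
}
```

For 0.10+ brokers, omit both keys and the defaults negotiate the protocol version automatically.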
