Pure Python client for Apache Kafka

Project description

Python client for the Apache Kafka distributed stream processing system. kafka-python is designed to function much like the official java client, with a sprinkling of pythonic interfaces (e.g., consumer iterators).

kafka-python is best used with newer brokers (0.9+), but is backwards-compatible with older versions (to 0.8.0). Some features will only be enabled on newer brokers. For example, fully coordinated consumer groups – i.e., dynamic partition assignment to multiple consumers in the same group – requires use of 0.9+ kafka brokers. Supporting this feature for earlier broker releases would require writing and maintaining custom leadership election and membership / health check code (perhaps using zookeeper or consul). For older brokers, you can achieve something similar by manually assigning different partitions to each consumer instance with config management tools like chef, ansible, etc. This approach will work fine, though it does not support rebalancing on failures. See <https://kafka-python.readthedocs.io/en/master/compatibility.html> for more details.
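
As a sketch of that manual approach, each consumer process can be started with a hard-coded partition subset via assign(); the topic name, broker address, and two-instance split below are hypothetical:

>>> # Sketch: static partition assignment for older (pre-0.9) brokers.
>>> # Instance 1 of 2 handles partitions 0-1; a second process, deployed
>>> # with partitions 2-3, would cover the rest of the topic.
>>> from kafka import KafkaConsumer, TopicPartition
>>> consumer = KafkaConsumer(bootstrap_servers='localhost:9092')
>>> consumer.assign([TopicPartition('my_topic', 0), TopicPartition('my_topic', 1)])
>>> for msg in consumer:
...     print(msg)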

Please note that the master branch may contain unreleased features. For release documentation, please see readthedocs and/or python’s inline help.

>>> pip install kafka-python

KafkaConsumer

KafkaConsumer is a high-level message consumer, intended to operate as similarly as possible to the official java client. Full support for coordinated consumer groups requires use of kafka brokers that support the Group APIs: kafka v0.9+.

See <https://kafka-python.readthedocs.io/en/master/apidoc/KafkaConsumer.html> for API and configuration details.

The consumer iterator returns ConsumerRecords, which are simple namedtuples that expose basic message attributes: topic, partition, offset, key, and value:

>>> from kafka import KafkaConsumer
>>> consumer = KafkaConsumer('my_favorite_topic')
>>> for msg in consumer:
...     print(msg)

>>> # join a consumer group for dynamic partition assignment and offset commits
>>> from kafka import KafkaConsumer
>>> consumer = KafkaConsumer('my_favorite_topic', group_id='my_favorite_group')
>>> for msg in consumer:
...     print(msg)

>>> # manually assign the partition list for the consumer
>>> from kafka import TopicPartition
>>> consumer = KafkaConsumer(bootstrap_servers='localhost:1234')
>>> consumer.assign([TopicPartition('foobar', 2)])
>>> msg = next(consumer)

>>> # Deserialize msgpack-encoded values
>>> import msgpack
>>> consumer = KafkaConsumer(value_deserializer=msgpack.loads)
>>> consumer.subscribe(['msgpackfoo'])
>>> for msg in consumer:
...     assert isinstance(msg.value, dict)
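
Because each record is a plain namedtuple, those fields can be read directly by name; a minimal sketch, reusing the consumer from the example above:

>>> # Access ConsumerRecord fields individually
>>> for msg in consumer:
...     print('%s:%d:%d: key=%s value=%s' % (msg.topic, msg.partition,
...                                          msg.offset, msg.key, msg.value))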

KafkaProducer

KafkaProducer is a high-level, asynchronous message producer. The class is intended to operate as similarly as possible to the official java client. See <https://kafka-python.readthedocs.io/en/master/apidoc/KafkaProducer.html> for more details.

>>> from kafka import KafkaProducer
>>> producer = KafkaProducer(bootstrap_servers='localhost:1234')
>>> for _ in range(100):
...     producer.send('foobar', b'some_message_bytes')

>>> # Block until a single message is sent (or timeout)
>>> future = producer.send('foobar', b'another_message')
>>> result = future.get(timeout=60)

>>> # Block until all pending messages are at least put on the network
>>> # NOTE: This does not guarantee delivery or success! It is really
>>> # only useful if you configure internal batching using linger_ms
>>> producer.flush()

>>> # Use a key for hashed-partitioning
>>> producer.send('foobar', key=b'foo', value=b'bar')

>>> # Serialize json messages
>>> import json
>>> producer = KafkaProducer(value_serializer=lambda v: json.dumps(v).encode('utf-8'))
>>> producer.send('fizzbuzz', {'foo': 'bar'})

>>> # Serialize string keys
>>> producer = KafkaProducer(key_serializer=str.encode)
>>> producer.send('flipflap', key='ping', value=b'1234')

>>> # Compress messages
>>> producer = KafkaProducer(compression_type='gzip')
>>> for i in range(1000):
...     producer.send('foobar', b'msg %d' % i)
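
Since send() returns a future, delivery results can also be handled asynchronously by chaining callbacks rather than blocking on get(); a sketch, where the handler names are hypothetical:

>>> # Handle delivery reports without blocking
>>> def on_send_success(record_metadata):
...     print(record_metadata.topic, record_metadata.partition, record_metadata.offset)
>>> def on_send_error(excp):
...     print('send failed:', excp)
>>> producer.send('foobar', b'another_message_bytes') \
...     .add_callback(on_send_success) \
...     .add_errback(on_send_error)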

Thread safety

The KafkaProducer can be used across threads without issue, unlike the KafkaConsumer, which cannot.

While it is possible to use the KafkaConsumer in a thread-local manner, multiprocessing is recommended.
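
For example, a single KafkaProducer instance can safely be shared by several threads; a minimal sketch, with the thread count and topic chosen arbitrarily:

>>> import threading
>>> from kafka import KafkaProducer
>>> producer = KafkaProducer(bootstrap_servers='localhost:1234')
>>> def worker(n):
...     for _ in range(100):
...         producer.send('foobar', b'bytes from thread %d' % n)
>>> threads = [threading.Thread(target=worker, args=(n,)) for n in range(4)]
>>> for t in threads:
...     t.start()
>>> for t in threads:
...     t.join()
>>> producer.flush()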

Compression

kafka-python supports gzip compression/decompression natively. To produce or consume lz4-compressed messages, you should install python-lz4 (pip install lz4). To enable snappy compression/decompression, install python-snappy (which also requires the snappy C library). See <https://kafka-python.readthedocs.io/en/master/install.html#optional-snappy-install> for more information.
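
With the optional dependency installed, selecting a codec is a one-line config change; a sketch, assuming python-lz4 is already installed:

>>> # requires `pip install lz4`; use 'snappy' or 'gzip' analogously
>>> producer = KafkaProducer(compression_type='lz4')
>>> producer.send('foobar', b'some_compressed_message_bytes')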

Protocol

A secondary goal of kafka-python is to provide an easy-to-use protocol layer for interacting with kafka brokers via the python repl. This is useful for testing, probing, and general experimentation. The protocol support is leveraged to enable a KafkaClient.check_version() method that probes a kafka broker and attempts to identify which version it is running (0.8.0 to 0.11).
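
A sketch of such a probe from the repl; the import path and broker address here are assumptions for this example, and the version tuple shown is only illustrative:

>>> from kafka.client_async import KafkaClient
>>> client = KafkaClient(bootstrap_servers='localhost:9092')
>>> client.check_version()  # returns the inferred broker version tuple, e.g.:
(0, 10, 1)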

Low-level

Legacy support is maintained for low-level consumer and producer classes, SimpleConsumer and SimpleProducer. See <https://kafka-python.readthedocs.io/en/master/simple.html?highlight=SimpleProducer> for API details.
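
A brief sketch of the legacy interfaces; the broker address, topic, and group names below are hypothetical:

>>> from kafka import SimpleClient, SimpleProducer, SimpleConsumer
>>> client = SimpleClient('localhost:9092')
>>> producer = SimpleProducer(client)
>>> producer.send_messages('my_topic', b'some message')
>>> consumer = SimpleConsumer(client, 'my_group', 'my_topic')
>>> for msg in consumer:
...     print(msg)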

Download files

Download the file for your platform.

Source Distribution

kafka-python-1.3.5.tar.gz (227.5 kB)

Uploaded: Source

Built Distribution

kafka_python-1.3.5-py2.py3-none-any.whl (207.3 kB)

Uploaded: Python 2, Python 3

File details

Details for the file kafka-python-1.3.5.tar.gz.

File metadata

  • Download URL: kafka-python-1.3.5.tar.gz
  • Upload date:
  • Size: 227.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for kafka-python-1.3.5.tar.gz:

  • SHA256: b238d054d744299483c70c4e3e029ee09ff8327a87730ee0cbcfe23e5973a9a6
  • MD5: 0f39ada74cfc4c25402e689f41c33760
  • BLAKE2b-256: 1fa59331c3e83fb930259ae32b8dfc39dee1ac8a3027f55b0e724371e0ad1f82

File details

Details for the file kafka_python-1.3.5-py2.py3-none-any.whl.

File hashes

Hashes for kafka_python-1.3.5-py2.py3-none-any.whl:

  • SHA256: 855105b5729e2d491eb3db9d79a634bd307d1efcfe0fd6398c0279f670a84f83
  • MD5: 3f7df6b6ddbe0138da48eb8c7aec20c2
  • BLAKE2b-256: 4505ad449f64ce4cbcfae86d908d1af98a86116011ddee81d5d76d49e48604d3
