
Kafka integration for IBM Streams topology applications

Project description

Overview

Provides classes and functions to read messages from Kafka brokers (including the IBM Event Streams cloud service) as a stream, and to submit tuples to Kafka brokers as messages.

The broker configuration is given as properties, either in an application configuration or in a Python dictionary. The minimum set of properties must contain the bootstrap.servers configuration, which is required for both consumers and producers, i.e. for the KafkaConsumer and KafkaProducer classes.
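For example, a minimal configuration given as a dictionary variable only needs the broker list (the address below is a placeholder):

# Minimal Kafka configuration as a Python dictionary.
# 'bootstrap.servers' is the only mandatory property and is used by
# both the KafkaConsumer and the KafkaProducer; the address is a
# placeholder for your environment.
kafka_config = {
    'bootstrap.servers': 'kafka-broker.example.com:9092'
}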

It is also possible to use different application configurations for the consumer and the producer when specific consumer or producer settings are required.
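As a sketch, assuming the application configurations kafka-consumer-cfg and kafka-producer-cfg have already been created in the Streams instance (both names are placeholders), the name of a configuration can be passed as the config argument instead of a dictionary:

from streamsx.topology.schema import CommonSchema
from streamsx.kafka import KafkaConsumer, KafkaProducer

# Each application configuration holds the Kafka properties for one
# client type; the configuration names below are placeholders.
consumer = KafkaConsumer(config='kafka-consumer-cfg',
                         schema=CommonSchema.String,
                         topic='TEST')
producer = KafkaProducer(config='kafka-producer-cfg',
                         topic='TEST')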

Sample

A simple hello-world example of a Streams application that publishes to a topic and consumes the same topic:

from streamsx.topology.topology import Topology
from streamsx.topology.schema import CommonSchema
from streamsx.topology.context import submit, ContextTypes
from streamsx.kafka import KafkaConsumer, KafkaProducer
import time

def delay(v):
    # throttle the stream: pass one tuple every five seconds
    time.sleep(5.0)
    return True

topology = Topology('KafkaHelloWorld')

to_kafka = topology.source(['Hello', 'World!'])
to_kafka = to_kafka.as_string()
# delay the stream, one tuple every five seconds
to_kafka = to_kafka.filter(delay)

# Publish the stream to the Kafka topic 'TEST'; the Kafka broker runs at localhost
producer = KafkaProducer(config={'bootstrap.servers': 'localhost:9092'},
                         topic='TEST')
to_kafka.for_each(producer)

# Subscribe to the same topic as a stream
consumer = KafkaConsumer(config={'bootstrap.servers': 'localhost:9092'},
                         schema=CommonSchema.String,
                         topic='TEST')
from_kafka = topology.source(consumer)

# 'Hello' and 'World!' are written to the stdout log file:
from_kafka.print()

submit(ContextTypes.DISTRIBUTED, topology)
# The Streams job keeps running until it is cancelled.
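
The sample connects to an unsecured broker on localhost. Connecting to a secured service such as IBM Event Streams typically adds the standard Kafka client security properties to the same configuration; a sketch, with the broker address and API key as placeholders taken from the service credentials:

# Sketch: typical client properties for a SASL_SSL-secured cluster such
# as IBM Event Streams. The broker address and the <API_KEY> value are
# placeholders; the exact property set depends on how the cluster is
# secured.
eventstreams_config = {
    'bootstrap.servers': 'broker-0.example.eventstreams.cloud.ibm.com:9093',
    'security.protocol': 'SASL_SSL',
    'sasl.mechanism': 'PLAIN',
    'sasl.jaas.config': 'org.apache.kafka.common.security.plain.PlainLoginModule '
                        'required username="token" password="<API_KEY>";',
}

Such a dictionary is passed as the config argument of KafkaConsumer or KafkaProducer exactly as in the sample above.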
