Kafka integration for IBM Streams topology applications

Overview

Provides functions to read messages from Kafka brokers, including the IBM Event Streams cloud service, as a stream, and to submit tuples to Kafka brokers as messages.

The broker configuration must be provided as properties, either in an application configuration or in a Python dictionary. The minimum set of properties must contain the bootstrap.servers configuration, which applies to both consumers and producers, i.e. to the KafkaConsumer and KafkaProducer classes.
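For a local broker, a minimal dictionary only needs the bootstrap.servers entry, as shown in the sample below. Connecting to the IBM Event Streams cloud service additionally requires TLS and SASL settings; the following is only a sketch using the standard Kafka client property names, with placeholder broker addresses and credentials (the exact set of properties depends on your service credentials):

# Sketch of a properties dict for IBM Event Streams (SASL_SSL with PLAIN);
# the broker addresses and the API key are placeholders.
event_streams_config = {
    'bootstrap.servers': 'broker-0:9093,broker-1:9093',  # placeholder brokers
    'security.protocol': 'SASL_SSL',
    'sasl.mechanism': 'PLAIN',
    'sasl.jaas.config': 'org.apache.kafka.common.security.plain.PlainLoginModule '
                        'required username="token" password="<API_KEY>";',
}
# The same dict can be passed as the config argument of both
# KafkaConsumer and KafkaProducer.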

It is also possible to use different application configurations for the consumer and the producer when consumer- or producer-specific settings are required, as sketched below.
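For illustration only, a sketch that assumes the config argument of both classes also accepts the name of an application configuration; the names 'kafka-consumer-props' and 'kafka-producer-props' are hypothetical:

from streamsx.topology.schema import CommonSchema
from streamsx.kafka import KafkaConsumer, KafkaProducer

# Assumption: config may be the name of an application configuration (str)
# holding the consumer or producer properties; the names below are hypothetical.
consumer = KafkaConsumer(config='kafka-consumer-props',
                         schema=CommonSchema.String,
                         topic='TEST')
producer = KafkaProducer(config='kafka-producer-props',
                         topic='TEST')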

Sample

A simple Hello World example of a Streams application that publishes to a topic and consumes the same topic within the same application:

from streamsx.topology.topology import Topology
from streamsx.topology.schema import CommonSchema
from streamsx.topology.context import submit, ContextTypes
from streamsx.kafka import KafkaConsumer, KafkaProducer
import time

def delay(v):
    time.sleep(5.0)
    return True

topology = Topology('KafkaHelloWorld')

to_kafka = topology.source(['Hello', 'World!'])
to_kafka = to_kafka.as_string()
# throttle the stream: the filter sleeps 5 seconds per tuple and passes every tuple
to_kafka = to_kafka.filter(delay)

# Publish the stream to the Kafka topic 'TEST'; the broker runs at localhost:9092
producer = KafkaProducer(config={'bootstrap.servers': 'localhost:9092'},
                         topic='TEST')
to_kafka.for_each(producer)

# Subscribe to same topic as a stream
consumer = KafkaConsumer(config={'bootstrap.servers': 'localhost:9092'},
                         schema=CommonSchema.String,
                         topic='TEST')
from_kafka = topology.source(consumer)

# 'Hello' and 'World!' appear in the stdout log file:
from_kafka.print()

submit(ContextTypes.DISTRIBUTED, topology)
# The Streams job is kept running.
