
Kafka integration for IBM Streams topology applications

Project description

Overview

Provides functions to read messages from Kafka brokers, including the IBM Event Streams cloud service, as a stream, and to submit stream tuples to Kafka brokers as messages.

The broker configuration must be provided as properties, either in an application configuration or in a Python dictionary. The minimum set of properties must contain bootstrap.servers, which applies to both consumers and producers, i.e. to the KafkaConsumer and KafkaProducer classes.

It is also possible to use separate application configurations for the consumer and the producer when special consumer or producer settings are required.
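The separate configurations can be sketched with plain Python dictionaries. This is a minimal illustration assuming standard Kafka client property names; the group id, offset reset policy, and acknowledgement level shown here are illustrative values, not requirements of this package:

```python
# Shared broker configuration, required for both consumers and producers
common = {'bootstrap.servers': 'localhost:9092'}

# Consumer-specific properties (group id and offset policy are examples)
consumer_config = dict(common)
consumer_config.update({
    'group.id': 'my-consumer-group',   # hypothetical group name
    'auto.offset.reset': 'earliest',   # start from the oldest messages
})

# Producer-specific properties (acknowledgement level is an example)
producer_config = dict(common)
producer_config.update({
    'acks': 'all',                     # wait for all in-sync replicas
})
```

Each dictionary would then be passed as the config argument of KafkaConsumer or KafkaProducer, respectively.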

Sample

A simple hello-world example of a Streams application that publishes to a topic and consumes the same topic within the same application:

from streamsx.topology.topology import Topology
from streamsx.topology.schema import CommonSchema
from streamsx.topology.context import submit, ContextTypes
from streamsx.kafka import KafkaConsumer, KafkaProducer
import time

def delay(v):
    time.sleep(5.0)
    return True

topology = Topology('KafkaHelloWorld')

to_kafka = topology.source(['Hello', 'World!'])
to_kafka = to_kafka.as_string()
# delay tuple by tuple
to_kafka = to_kafka.filter(delay)

# Publish a stream to Kafka using TEST topic, the Kafka server is at localhost
producer = KafkaProducer(config={'bootstrap.servers': 'localhost:9092'},
                         topic='TEST')
to_kafka.for_each(producer)

# Subscribe to same topic as a stream
consumer = KafkaConsumer(config={'bootstrap.servers': 'localhost:9092'},
                         schema=CommonSchema.String,
                         topic='TEST')
from_kafka = topology.source(consumer)

# 'Hello' and 'World!' will appear in the job's stdout log:
from_kafka.print()

submit(ContextTypes.DISTRIBUTED, topology)
# The Streams job is kept running.

Documentation

Project details


Download files

Download the file for your platform.

Source Distribution

streamsx.kafka-1.8.1.tar.gz (25.9 kB)

Uploaded Source

Built Distribution


streamsx.kafka-1.8.1-py2.py3-none-any.whl (28.6 kB)

Uploaded Python 2, Python 3

File details

Details for the file streamsx.kafka-1.8.1.tar.gz.

File metadata

  • Download URL: streamsx.kafka-1.8.1.tar.gz
  • Upload date:
  • Size: 25.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Python-urllib/3.6

File hashes

Hashes for streamsx.kafka-1.8.1.tar.gz
  • SHA256: d3714cfe6f9d24e5df940666413a1a0f0bddf0a150d60b10d3b1ee828b5af501
  • MD5: 410698d080cc2c559f55e85017d9c8d5
  • BLAKE2b-256: 00676079c49051b9e7f5c1b1ebe93404c4c1f57bc40035847cd18ccbe65be535


File details

Details for the file streamsx.kafka-1.8.1-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for streamsx.kafka-1.8.1-py2.py3-none-any.whl
  • SHA256: c1a873a9317552dc1a333a32fc917734d5b4aba517a052a009fca058cc708af6
  • MD5: 94b227dabbd6d66b43ac80c3d7db59cd
  • BLAKE2b-256: f28e9519b6e67a5eeed8b36d575aac80b3239ecb6251f6177ac989688fef9912

