IBM Streams HDFS integration

Project description


Provides functions to access files on HDFS, for example to connect to IBM Analytics Engine on IBM Cloud.

This package exposes the toolkit as Python methods for use with Streaming Analytics service on IBM Cloud and IBM Streams including IBM Cloud Pak for Data.
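The `credentials` argument used in the example below is the parsed service credentials JSON of an IBM Analytics Engine instance, typically downloaded from the service dashboard on IBM Cloud. A minimal, self-contained sketch of loading such a file (the file name and JSON fields shown here are illustrative assumptions, not the exact IBM Analytics Engine format):

```python
import json

# Write a stand-in credentials file so this sketch is self-contained;
# in practice this JSON is downloaded from the IBM Cloud service dashboard.
with open('analytics-engine-credentials.json', 'w') as f:
    f.write('{"user": "clsadmin", "password": "secret"}')

# Parse the credentials JSON into a dict for use with streamsx.hdfs
with open('analytics-engine-credentials.json') as f:
    credentials = json.load(f)

print(credentials["user"])
```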


A simple hello world example of a Streams application that writes string messages to a file on HDFS, then scans the directory for the created file and reads its content:

from streamsx.topology.topology import *
from streamsx.topology.schema import CommonSchema, StreamSchema
from streamsx.topology.context import submit
import streamsx.hdfs as hdfs
import json

# credentials_analytics_engine_service is an open file handle to the
# Analytics Engine service credentials JSON
credentials = json.load(credentials_analytics_engine_service)

topo = Topology('HDFSHelloWorld')

to_hdfs = topo.source(['Hello', 'World!'])
to_hdfs = to_hdfs.as_string()

# Write a stream to HDFS
hdfs.write(to_hdfs, credentials=credentials, file='/sample/hw.txt')

scanned = hdfs.scan(topo, credentials=credentials, directory='/sample', init_delay=10)

# read text file line by line
r = hdfs.read(scanned, credentials=credentials)

# print each line (tuple)
r.print()
# Submit to the Streaming Analytics service on IBM Cloud
submit('STREAMING_ANALYTICS_SERVICE', topo)

# Use for IBM Streams including IBM Cloud Pak for Data
# submit('DISTRIBUTED', topo)

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
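Alternatively, the package can be installed directly from PyPI with pip; a typical command (assuming pip targets a Python 3 environment):

```shell
pip install streamsx.hdfs==1.4.0
```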

Files for streamsx.hdfs, version 1.4.0

Filename                                    Size     File type  Python version
streamsx.hdfs-1.4.0-py2.py3-none-any.whl    14.7 kB  Wheel      3.6
streamsx.hdfs-1.4.0.tar.gz                  8.6 kB   Source     None
