
TensorFlow IO

Project description




TensorFlow I/O


TensorFlow I/O is a collection of file systems and file formats that are not available in TensorFlow's built-in support. A full list of the file systems and file formats supported by TensorFlow I/O can be found here.

Using tensorflow-io with Keras is straightforward. Below is the Get Started with TensorFlow example, with the data processing portion replaced by tensorflow-io:

import tensorflow as tf
import tensorflow_io as tfio

# Read the MNIST data into the IODataset.
dataset_url = "https://storage.googleapis.com/cvdf-datasets/mnist/"
d_train = tfio.IODataset.from_mnist(
    dataset_url + "train-images-idx3-ubyte.gz",
    dataset_url + "train-labels-idx1-ubyte.gz",
)

# Shuffle the elements of the dataset.
d_train = d_train.shuffle(buffer_size=1024)

# By default image data is uint8, so convert to float32 using map().
d_train = d_train.map(lambda x, y: (tf.image.convert_image_dtype(x, tf.float32), y))

# Batch the data just like any other tf.data.Dataset.
d_train = d_train.batch(32)

# Build the model.
model = tf.keras.models.Sequential(
    [
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(512, activation=tf.nn.relu),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(10, activation=tf.nn.softmax),
    ]
)

# Compile the model.
model.compile(
    optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"]
)

# Fit the model.
model.fit(d_train, epochs=5, steps_per_epoch=200)
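
The same pipeline carries over to evaluation. Below is a minimal sketch (not part of the original example) that reuses the trained model on the MNIST test split, assuming the standard t10k file names at the same URL prefix:

# Build the test pipeline the same way, using the MNIST test split.
d_test = tfio.IODataset.from_mnist(
    dataset_url + "t10k-images-idx3-ubyte.gz",
    dataset_url + "t10k-labels-idx1-ubyte.gz",
)
d_test = d_test.map(lambda x, y: (tf.image.convert_image_dtype(x, tf.float32), y))
d_test = d_test.batch(32)

# Evaluate the trained model; returns the loss and the metrics passed to compile().
loss, accuracy = model.evaluate(d_test)
print("test accuracy:", accuracy)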

In the above MNIST example, the URLs of the dataset files are passed directly to the tfio.IODataset.from_mnist API call. This is possible because tensorflow-io provides built-in support for the HTTP/HTTPS file system, which eliminates the need to download and save the datasets to a local directory.

NOTE: Since tensorflow-io is able to detect and uncompress the MNIST dataset automatically if needed, we can pass the URLs of the compressed (gzip) files to the API call as is.
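
To illustrate the file system integration on its own, here is a small sketch; it assumes that the HTTP/HTTPS file system registered when tensorflow_io is imported is visible through tf.io.gfile:

import tensorflow as tf
import tensorflow_io as tfio  # imported for its file system registrations

# Read a remote file directly over HTTPS; no local download step is needed.
with tf.io.gfile.GFile(
    "https://storage.googleapis.com/cvdf-datasets/mnist/train-labels-idx1-ubyte.gz", "rb"
) as f:
    raw = f.read()
print(len(raw), "bytes read over HTTPS")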

Please check the official documentation for more detailed and advanced usage of the package.

Installation

Python Package

The tensorflow-io Python package can be installed with pip directly using:

$ pip install tensorflow-io

People who are a little more adventurous can also try our nightly binaries:

$ pip install tensorflow-io-nightly

Docker Images

In addition to the pip packages, Docker images can be used to get started quickly.

For stable builds:

$ docker pull tfsigio/tfio:latest
$ docker run -it --rm --name tfio-latest tfsigio/tfio:latest

For nightly builds:

$ docker pull tfsigio/tfio:nightly
$ docker run -it --rm --name tfio-nightly tfsigio/tfio:nightly

R Package

Once the tensorflow-io Python package has been successfully installed, you can install the development version of the R package from GitHub via the following:

if (!require("remotes")) install.packages("remotes")
remotes::install_github("tensorflow/io", subdir = "R-package")

TensorFlow Version Compatibility

To ensure compatibility with TensorFlow, it is recommended to install a matching version of TensorFlow I/O according to the table below; a quick way to check the installed versions is sketched after the table. You can find the list of releases here.

TensorFlow I/O Version | TensorFlow Compatibility | Release Date
---------------------- | ------------------------ | ------------
0.21.0 | 2.6.x  | Sep 12, 2021
0.20.0 | 2.6.x  | Aug 11, 2021
0.19.1 | 2.5.x  | Jul 25, 2021
0.19.0 | 2.5.x  | Jun 25, 2021
0.18.0 | 2.5.x  | May 13, 2021
0.17.1 | 2.4.x  | Apr 16, 2021
0.17.0 | 2.4.x  | Dec 14, 2020
0.16.0 | 2.3.x  | Oct 23, 2020
0.15.0 | 2.3.x  | Aug 03, 2020
0.14.0 | 2.2.x  | Jul 08, 2020
0.13.0 | 2.2.x  | May 10, 2020
0.12.0 | 2.1.x  | Feb 28, 2020
0.11.0 | 2.1.x  | Jan 10, 2020
0.10.0 | 2.0.x  | Dec 05, 2019
0.9.1  | 2.0.x  | Nov 15, 2019
0.9.0  | 2.0.x  | Oct 18, 2019
0.8.1  | 1.15.x | Nov 15, 2019
0.8.0  | 1.15.x | Oct 17, 2019
0.7.2  | 1.14.x | Nov 15, 2019
0.7.1  | 1.14.x | Oct 18, 2019
0.7.0  | 1.14.x | Jul 14, 2019
0.6.0  | 1.13.x | May 29, 2019
0.5.0  | 1.13.x | Apr 12, 2019
0.4.0  | 1.13.x | Mar 01, 2019
0.3.0  | 1.12.0 | Feb 15, 2019
0.2.0  | 1.12.0 | Jan 29, 2019
0.1.0  | 1.12.0 | Dec 16, 2018
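
A quick way to check an installed pair against the table is to print both versions (a small sketch; both packages expose the usual __version__ attribute):

import tensorflow as tf
import tensorflow_io as tfio

# Compare these against the compatibility table above.
print("tensorflow:", tf.__version__)
print("tensorflow-io:", tfio.__version__)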

Performance Benchmarking

We use GitHub Pages to document the results of API performance benchmarks. The benchmark job is triggered on every commit to the master branch and facilitates tracking performance with respect to commits.

Contributing

TensorFlow I/O is a community-led open source project. As such, it depends on public contributions, bug fixes, and documentation.

Build Status and CI

Build Status

Build status badges are maintained for Linux CPU and Linux GPU builds, each with Python 2 and Python 3.

Because of the manylinux2010 requirement, TensorFlow I/O is built with Ubuntu:16.04 + Developer Toolset 7 (GCC 7.3) on Linux. Configuring Ubuntu 16.04 with Developer Toolset 7 is not exactly straightforward. If the system has Docker installed, then the following command will automatically build manylinux2010-compatible whl packages:

#!/usr/bin/env bash

# List the wheels built locally, then repair each one inside the
# manylinux2010 docker image so the result is manylinux2010 compatible.
ls dist/*
for f in dist/*.whl; do
  docker run -i --rm -v "$PWD":/v -w /v --net=host quay.io/pypa/manylinux2010_x86_64 \
    bash -x -e /v/tools/build/auditwheel repair --plat manylinux2010_x86_64 "$f"
done
# Files created inside the container are owned by root; restore ownership
# to the current user, then list the repaired wheels.
sudo chown -R "$(id -nu):$(id -ng)" .
ls wheelhouse/*

It takes some time to build, but once complete, Python 3.5, 3.6, and 3.7 compatible whl packages will be available in the wheelhouse directory.

On macOS, the same command can be used. However, the script expects python to be available in the shell and will only generate a whl package matching that python version. If you want to build a whl package for a specific Python version, you have to alias that version to python in the shell. See the Auditwheel step in .github/workflows/build.yml for instructions on how to do that.

Note that the above command is also the one we use when releasing packages for Linux and macOS.

TensorFlow I/O uses both GitHub Workflows and Google CI (Kokoro) for continuous integration. GitHub Workflows is used for macOS build and test; Kokoro is used for Linux build and test. Again, because of the manylinux2010 requirement, on Linux whl packages are always built with Ubuntu 16.04 + Developer Toolset 7. Tests are done on a variety of systems with different Python 3 versions to ensure good coverage:

Python | Ubuntu 18.04 | Ubuntu 20.04 | macOS + osx9 | Windows-2019
------ | ------------ | ------------ | ------------ | ------------
2.7 | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | N/A
3.7 | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark:
3.8 | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark:

TensorFlow I/O has integrations with many systems and cloud vendors, such as Prometheus, Apache Kafka, Apache Ignite, Google Cloud PubSub, AWS Kinesis, Microsoft Azure Storage, and Alibaba Cloud OSS.
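
As a rough illustration of how these integrations surface in the API, the sketch below streams records from Kafka with tfio.IODataset.from_kafka. The topic name and broker address are placeholders, and the element structure and keyword arguments may differ between releases, so treat this as a sketch rather than a definitive example:

import tensorflow_io as tfio

# Stream messages from a Kafka topic as a tf.data-compatible dataset.
# "demo-topic" and "localhost:9092" are placeholder values.
kafka_ds = tfio.IODataset.from_kafka("demo-topic", servers="localhost:9092")

# Each element is expected to carry the message payload and key
# (field names follow the Kafka tutorial pattern and may vary by release).
for item in kafka_ds.take(5):
    print(item.key.numpy(), item.message.numpy())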

We try our best to test against those systems in our continuous integration whenever possible. Some tests, such as those for Prometheus, Kafka, and Ignite, are run against live systems, meaning we install Prometheus/Kafka/Ignite on the CI machine before the test is run. Other tests, such as those for Kinesis, PubSub, and Azure Storage, are run through official or non-official emulators. Offline tests are also performed whenever possible, though systems covered only by offline tests may not have the same level of coverage as live systems or emulators.

System | Live System | Emulator | CI Integration | Offline
------ | ----------- | -------- | -------------- | -------
Apache Kafka | :heavy_check_mark: | | :heavy_check_mark: |
Apache Ignite | :heavy_check_mark: | | :heavy_check_mark: |
Prometheus | :heavy_check_mark: | | :heavy_check_mark: |
Google PubSub | | :heavy_check_mark: | :heavy_check_mark: |
Azure Storage | | :heavy_check_mark: | :heavy_check_mark: |
AWS Kinesis | | :heavy_check_mark: | :heavy_check_mark: |
Alibaba Cloud OSS | | | | :heavy_check_mark:
Google BigTable/BigQuery | to be added | | |
Elasticsearch (experimental) | :heavy_check_mark: | | :heavy_check_mark: |
MongoDB (experimental) | :heavy_check_mark: | | :heavy_check_mark: |

References for emulators:

Community

Additional Information

License

Apache License 2.0


Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions

Hashes (SHA256, MD5, and BLAKE2b-256) for each wheel:

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp39-cp39-win_amd64.whl
  SHA256       8003dd9e509fef9d416aded9ce38b0506ce3604bc518d2ece2224abebd494f58
  MD5          734fde909902dab42e820f41b931c016
  BLAKE2b-256  91d55dfb1978ae675aabc9562a41e29e887f3f2b954edb0f8de5f460c53408cf

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl
  SHA256       52969b8850d3eb0c66ee32f2abce1714c846a0e2dd6ff5b17391b2110fe8b91d
  MD5          f81aa942ab23747a6ec14bacc14e9839
  BLAKE2b-256  0fd03e0c33d84321871a8b3fe2ef98f172c97cc4dba4bc312348223d819a9862

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp39-cp39-macosx_10_14_x86_64.whl
  SHA256       73d5d2f2fe328ee949d6b4cd2dfa09da2a1917ee12eb65d84139327a43ed6666
  MD5          3bca678d448cbc680447e82075c3d175
  BLAKE2b-256  8c9725b0fb2e6f6e5e921bf42913cb9f199454fd2c0718e31107bcb9442612f4

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp38-cp38-win_amd64.whl
  SHA256       ce8fab58811ce39cf5abfe28ccf990cbe2711b7392d05970e3bd887fcc2ab81b
  MD5          1106a645df7a2219b7abd7093defb5d4
  BLAKE2b-256  9b7a5e558f65d2e447aea34f0a3f2f4432ee07313c9d2864eee31a63124d6927

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl
  SHA256       9450d29d8876e175a8d5016ce2e3759ead3b4dee9ce5762b6e944b1e06620779
  MD5          cdf55222a49b0a8d95232a59477cffeb
  BLAKE2b-256  843b3199fb22eaec8fe27bd02915dc84581dfabbf469d8a67e57a4985aa7ed37

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp38-cp38-macosx_10_14_x86_64.whl
  SHA256       3798bb77c384c8939f8ea7f1783877538a92ecd7078f7492b0253a4373c0e57d
  MD5          6999e76d972b8edb7ad5236d58d27a0c
  BLAKE2b-256  3abca55d8dcebf9f9a12386430629fd5a5c9af258542300921127c28cd5cc464

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp37-cp37m-win_amd64.whl
  SHA256       531aefe09e8aca4c6ac7d4cbcd3e192c425969eaf24e8fb74a0a58ec9de77f2f
  MD5          87d297fdd38d511767af6567b15befac
  BLAKE2b-256  97074256f44221a9ccd37e66d70f2bbad5939ea48eddf0f134ffaadb71dd3cde

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl
  SHA256       25583742c53706edcbea028ae145ed2be6d3cb4e29c952d7a4b350aa08624d12
  MD5          3084393c484c48dd10b46f85b22ebae1
  BLAKE2b-256  732d0d79b8f922ee07135d277a2f392858bef0c06d4a5281d30eb6d9aa69e297

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp37-cp37m-macosx_10_14_x86_64.whl
  SHA256       766ea383add024abf2c5078c2a08e1132248d85204cabad1fed0ebeeb5da2dba
  MD5          dc4bcd09cb78d8c41df6bb1558f3c087
  BLAKE2b-256  c17b3f92bae906a82491c134a0c43312a29516f8759000a4d970f10fabbed1da

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp36-cp36m-win_amd64.whl
  SHA256       065798051f7dffa72f05de0f8de6cb3403d5b580e955b52731083c3adbe73415
  MD5          ffd8c4397dc0be31d7392141e4a9b9a6
  BLAKE2b-256  d01a010484fc94344066117a996b5e3392ea2c9a011709d4f700cca2cc258e86

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp36-cp36m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl
  SHA256       682d7e3b569e0d242372eb911444e8c21adc7e39e8560ca4f03cf38d58869811
  MD5          96c7b4e57615bc14ba95bd67edd4343d
  BLAKE2b-256  4e8648344888ca18085f174c4bacdae64dccf6914be7ead754e58d6e547134ac

tensorflow_io_gcs_filesystem_nightly-0.21.0.dev20211108060603-cp36-cp36m-macosx_10_14_x86_64.whl
  SHA256       51ab91b6bd2612dbb2b312746f54bde1874f1ee6f42a783c8ec109989ed88b46
  MD5          171973ce2dad7a81736655de58ccbdfb
  BLAKE2b-256  247d89b7524319e2e503999cc332559be9ce22aa9e92655656e4f5d81db3e0bf
