
Converter: pandas to tfrecords & tfrecords to pandas

Project description

This project was inspired by spark-tensorflow-connector and implements similar functionality, making it easy to save a pandas DataFrame to tfrecords and to restore tfrecords to a pandas DataFrame.

It works with both local files and AWS S3 files. Keep in mind that TensorFlow here operates on local copies of remote files, which are synced with S3 via s3fs. This workaround exists because my TensorFlow v2.1.0 didn't work with S3 directly and raised the authentication error Credentials have expired attempting to repull from EC2 Metadata Service; this may be fixed in newer versions.

Quick start

pip install pandas-tfrecords

import pandas as pd
from pandas_tfrecords import pd2tf, tf2pd

df = pd.DataFrame({'A': [1, 2, 3], 'B': ['a', 'b', 'c'], 'C': [[1, 2], [3, 4], [5, 6]]})

# local
pd2tf(df, './tfrecords')
my_df = tf2pd('./tfrecords')

# S3
pd2tf(df, 's3://my-bucket/tfrecords')
my_df = tf2pd('s3://my-bucket/tfrecords')

Boolean support (v0.1.6+):

import pandas as pd
from pandas_tfrecords import pd2tf, tf2pd

df = pd.DataFrame({'A': [True, False, True], 'B': ['a', 'b', 'c'], 'C': [[1, False], [3, True], [5, False]]})

pd2tf(df, './tfrecords')

my_df = tf2pd('./tfrecords', schema={'A': bool, 'B': str, 'C': [int, bool]})

# or, if you need 0/1 instead of True/False
my_df = tf2pd('./tfrecords', schema={'A': int, 'B': str, 'C': [int, int]})

# or just skip the schema; int is used by default when bool isn't specified
my_df = tf2pd('./tfrecords')

Converted types

pandas -> tfrecords

bytes, str -> tf.string
int, np.integer, bool, np.bool_ -> tf.int64
float, np.floating -> tf.float32
list, np.ndarray of bytes, str, int, np.integer, bool, np.bool_, float, np.floating -> sequence of tf.string, tf.int64, tf.float32

tfrecords -> pandas

tf.string -> bytes
tf.int64 -> int, bool
tf.float32 -> float
sequence of tf.string, tf.int64, tf.float32 -> list of bytes, int, bool, float
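The tables above can be sketched as a small pure-Python helper that names the target feature type for a value. This is only an illustration: `tf_feature_type` is a made-up name, not part of the library, and the NumPy scalar types (np.integer, np.floating, np.bool_) are omitted for brevity.

```python
def tf_feature_type(value):
    """Return the TFRecord feature type a Python value maps to (illustrative)."""
    # Sequences map element-wise with the same scalar rules.
    if isinstance(value, (list, tuple)):
        inner = {tf_feature_type(v) for v in value}
        if len(inner) != 1:
            raise TypeError("mixed-type sequences are not supported")
        return "sequence of " + inner.pop()
    if isinstance(value, (bytes, str)):
        return "tf.string"
    if isinstance(value, int):  # covers bool too, since bool subclasses int
        return "tf.int64"
    if isinstance(value, float):
        return "tf.float32"
    raise TypeError(f"unsupported type: {type(value).__name__}")
```

Note how a column like [1, False] still maps cleanly, because int and bool both land on tf.int64.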

NB! Please note that this works only with one-dimensional arrays. That means [1, 2, 3] converts in both directions, but [[1, 2, 3]] won't convert in either direction. It works this way because spark-tensorflow-connector behaves similarly, and I haven't yet learned how to implement nested sequences. To work with nested sequences, you should use reshape.
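One way to apply the reshape workaround is to flatten a nested sequence into a one-dimensional list and keep its shape alongside, so the nesting can be rebuilt after a round trip. A plain-Python sketch (the helper names are made up for illustration, not part of the library):

```python
def flatten_with_shape(nested):
    """Flatten a 2-D list into a 1-D list plus its (rows, cols) shape."""
    rows, cols = len(nested), len(nested[0])
    flat = [x for row in nested for x in row]
    return flat, (rows, cols)

def restore_shape(flat, shape):
    """Rebuild the 2-D list from the flat list and the stored shape."""
    rows, cols = shape
    return [flat[r * cols:(r + 1) * cols] for r in range(rows)]
```

You would flatten the column (and store the shape, e.g. in a second column) before pd2tf, then restore it after tf2pd.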

API

pandas_tfrecords.pandas_to_tfrecords(df, folder, compression_type='GZIP', compression_level=9, columns=None, max_mb=50)

Arguments:

  • df - pandas DataFrame. Keep in mind the note above about nested sequences.

  • folder - folder to save tfrecords to, local or S3. Make sure it doesn't contain other files or folders if you intend to read from it later.

  • compression_type='GZIP' - compression type: 'GZIP', 'ZLIB', or None. If None, no compression is applied.

  • compression_level=9 - compression level 0…9.

  • columns=None - list of columns to save; if None, all columns are saved.

  • max_mb=50 - maximum size of uncompressed data to save in a single file. If the DataFrame's total size exceeds this limit, it will be split across several files. If None, the size is unlimited and a single file is saved.

alias pandas_tfrecords.pd2tf
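The max_mb splitting can be pictured as simple chunking: estimate the uncompressed size and cut the data into roughly equal shards. A sketch only: `num_shards` is a hypothetical helper, and the library's actual sizing logic may differ.

```python
import math

def num_shards(total_bytes, max_mb=50):
    """How many tfrecord files a dataset of total_bytes would split into."""
    if max_mb is None:
        return 1  # unlimited: everything goes into a single file
    limit = max_mb * 1024 * 1024
    return max(1, math.ceil(total_bytes / limit))
```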

pandas_tfrecords.tfrecords_to_pandas(file_paths, schema=None, compression_type='auto', cast=True)

Arguments:

  • file_paths - a single path or a sequence of file paths or folders, local or S3, to read tfrecords from.

  • schema=None - if None, the schema is detected automatically. You can also use the schema to read only specific columns. It should be a dict whose keys are column names and whose values are column data types: str (or bytes), int, float; for sequences, wrap the type in a list: [str] (or [bytes]), [int], [float]. For example:

df = pd.DataFrame({'A': [1, 2, 3], 'B': ['a', 'b', 'c'], 'C': [[1, 2], [3, 4], [5, 6]]})
print(df)
   A  B       C
0  1  a  [1, 2]
1  2  b  [3, 4]
2  3  c  [5, 6]

pd2tf(df, './tfrecords')
tf2pd('./tfrecords', schema={'A': int, 'C': [int]})
   A       C
0  1  [1, 2]
1  2  [3, 4]
2  3  [5, 6]

  • compression_type='auto' - compression type: 'auto', 'GZIP', 'ZLIB', or None. With 'auto', the compression type is detected automatically.

  • cast=True - if True, casts bytes data after converting from tf.string, trying int, float, and str in turn; if none succeeds, the value is kept as bytes.

alias pandas_tfrecords.tf2pd
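The cast=True behaviour can be sketched as a try-in-order cast. A pure-Python illustration; `cast_bytes` is a made-up name, not the library's API.

```python
def cast_bytes(value):
    """Try int, then float, then utf-8 str; otherwise keep the bytes as is."""
    for caster in (int, float):
        try:
            return caster(value)
        except ValueError:
            pass
    try:
        return value.decode("utf-8")
    except UnicodeDecodeError:
        return value
```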

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pandas-tfrecords-0.1.6.tar.gz (6.0 kB)

Uploaded Source

Built Distribution

pandas_tfrecords-0.1.6-py3-none-any.whl (7.2 kB)

Uploaded Python 3

File details

Details for the file pandas-tfrecords-0.1.6.tar.gz.

File metadata

  • Download URL: pandas-tfrecords-0.1.6.tar.gz
  • Upload date:
  • Size: 6.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.5.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.7.2

File hashes

Hashes for pandas-tfrecords-0.1.6.tar.gz
Algorithm Hash digest
SHA256 fff2059fa8da6cc54472731a4fc83535943477883ed1da14dabb8bc92f849b11
MD5 09fa7c7cec4a045d4c3e899cb9f21750
BLAKE2b-256 b007a6792408d01d3a821b7972df0fd20b0bc2102e92c3688f5b423b9250e18b


File details

Details for the file pandas_tfrecords-0.1.6-py3-none-any.whl.

File metadata

  • Download URL: pandas_tfrecords-0.1.6-py3-none-any.whl
  • Upload date:
  • Size: 7.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.5.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.7.2

File hashes

Hashes for pandas_tfrecords-0.1.6-py3-none-any.whl
Algorithm Hash digest
SHA256 a9dba59136e284121954f258b4cd58112649a6c263ea4b15d62fdd015b3d388d
MD5 e6c2ffac4f0e35f99162f79de9c7ec6e
BLAKE2b-256 8d0ff137913b95dfb14abf9f541abd64448d72208801f6770bde8da7bc9f33a7

