
Package to extract binary files into pandas dataframes


RPH extraction

Contains a tool to read a .rph file into a RphData structure.

Usage

A simple example is given below:

from AmiAutomation import RphData


# Read from a rph file to a RphData object
rphData = RphData.rphToDf(path = "path_to_rph_file")


# Table data inside a dataframe
dataframe = rphData.dataFrame
# Timestamps are always returned in the RPH's local time with timezone info,
# e.g. '2023-11-24 15:30:00-04:00'

# To convert timestamps to UTC, use the pandas "to_datetime" method:

import pandas as pd

dataframe.Timestamp = pd.to_datetime(dataframe.Timestamp, utc=True)
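A minimal, self-contained sketch of what this conversion does, using made-up timestamps shaped like the RPH local-time values above rather than real RPH output:

```python
import pandas as pd

# Hypothetical timestamps in RPH local time (here UTC-4), as they would
# appear in the Timestamp column of the dataframe
ts = pd.Series(["2023-11-24 15:30:00-04:00", "2023-11-24 15:30:02-04:00"])

# to_datetime with utc=True normalises every offset to UTC
utc = pd.to_datetime(ts, utc=True)
print(utc.iloc[0])  # 2023-11-24 19:30:00+00:00
```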

Binaries extraction

This package contains the tools to easily extract binary data from PX3's:

  • Heat Log
  • 2 Second Log
  • Wave Log
  • Composite
  • Histogram

into a pandas dataframe for further processing.

Usage

Importing is done the same way as with any other python package:

from AmiAutomation import PX3_Bin, LogData

From there you can call a method with the module prefix:

dataFrame = PX3_Bin.file_to_df(path = "C:\\Binaries")

or

dataFrame = LogData.binFileToDF(path = "C:\\Binaries")

LogData Methods

You can get binary log data as a LogData object, which contains useful metadata about the binary file as well as the samples inside a pandas dataframe.


LogData.binFileToDF

Unpacks a binary file into a LogData object

  • Parameters:

    • path : str

      Complete file path

    • extension : str, optional

Explicitly enforce the file extension. The value must be "bin", "cpst" or "hist".
If no value is given, the extension is inferred from the file name; if it cannot be inferred, the default value is used instead.

Default value: bin

    • null_promoting : dict, optional

A dictionary with a .NET source type as key and one of the following values: "default", "object", "float", "Int64", "string" or "error".

      The possible dictionary keys are the .NET simple types:

      • "SByte" : Signed Byte
      • "Byte" : Unsigned Byte
      • "Int16" : 16 bit integer
      • "UInt16" : 16 bit unsigned integer
      • "Int32" : 32 bit integer
      • "UInt32" : 32 bit unsigned integer
      • "Int64" : 64 bit integer
      • "UInt64" : 64 bit unsigned integer
      • "Char" : Character
      • "Single" : Floating point single precision
      • "Double" : Floating point double precision
      • "Boolean" : bit
      • "Decimal" : 16 byte decimal precision
      • "DateTime" : Date time

The dictionary value determines how null values encountered during deserialization affect the resulting LogData dataframe column:

      • "default" : use pandas automatic inference when dealing with null values on a column
      • "object" : The returned type is the generic python object type
      • "float" : The returned type is the python float type
      • "Int64" : The returned type is the pandas Nullable Integer Int64 type
      • "string" : Values are returned as strings
      • "error" : Raises and exception when null values are encountered

      Default value: None

  • Returns:

    • LogData :
      Structure containing most file data
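The motivation behind null_promoting can be seen with plain pandas, independently of this package: under default inference, a single missing value silently promotes an integer column to float64, whereas the pandas nullable Int64 dtype keeps the values as integers. A minimal illustration:

```python
import pandas as pd

# Default inference: one missing value turns the whole column into float64
s_default = pd.Series([1, 2, None])

# The pandas nullable integer dtype keeps the remaining values as integers
s_nullable = pd.Series([1, 2, None], dtype="Int64")

print(s_default.dtype, s_nullable.dtype)  # float64 Int64
```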

LogData.binStreamToDF

Unpacks a binary file stream into a LogData object

  • Parameters:

    • file : stream

      A python IOStream of the binary file

    • extension : str, optional

      Explicitly enforce file extension. Value must be "bin", "cpst" or "hist"

      Default: bin

    • null_promoting : dict, optional
Same as in LogData.binFileToDF: a dictionary with a .NET source type as key and one of the following values: "default", "object", "float", "Int64", "string" or "error".

  • Returns:

    • LogData :
      Structure containing most file data

Examples

Simple file conversion

from AmiAutomation import LogData


# Execute the conversion from source file
logData = LogData.binFileToDF("bin_file_path.bin")


# To access samples just access the dataframe inside the LogData object
dataFrame = logData.dataFrame 

Conversion from an IO Stream

from AmiAutomation import LogData


# Get the file stream
file_stream = open(file_path, "rb")


# Execute the conversion from stream
logData = LogData.binStreamToDF(file_stream)


# Access the dataframe inside the LogData object
dataFrame = logData.dataFrame 

Conversion of a file without extension

from AmiAutomation import LogData


# Perform the conversion, explicitly specifying the file extension
logData = LogData.binFileToDF("file_path", extension="cpst")


# Access the dataframe inside the LogData object
dataFrame = logData.dataFrame 

Conversion with null promoting

from AmiAutomation import LogData


# Adding null promoting to handle missing values in these types of data as object
logData = LogData.binFileToDF("bin_file_path.bin", null_promoting={"Int32":"object", "Int16":"object", "Int64":"object"})


# Access the dataframe inside the LogData object
dataFrame = logData.dataFrame 

This method can also be used to retrieve the data table from inside a ".cpst" or ".hist" file. Detection is automatic, based on the file extension in the filename; if none is given, a warning is issued and ".bin" is assumed.
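That inference can be sketched as follows. This is a hypothetical re-implementation for illustration only, not the package's actual code:

```python
import os
import warnings

def infer_extension(path, default="bin"):
    # Take the extension from the filename; if it is not one of the
    # recognised values, warn and fall back to the default ("bin")
    ext = os.path.splitext(path)[1].lstrip(".").lower()
    if ext in ("bin", "cpst", "hist"):
        return ext
    warnings.warn(f"No recognised extension on {path!r}; assuming {default!r}")
    return default

print(infer_extension("heat_log.cpst"))   # cpst
print(infer_extension("exported_file"))   # bin (with a warning)
```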


LogData Object

The LogData object contains metadata from the file read as well as the logged data inside a pandas DataFrame.

The data is structured as follows:

  • properties : dict
    A dictionary containing metadata from the file read; its contents change depending on the type of file:

    • Bin File:

      | Key        | Type      | Value                                                  |
      |------------|-----------|--------------------------------------------------------|
      | Definition | str       | XML string with the table definition of contained data |
      | Version    | int       | File compression version                               |
      | Name       | str       | File type name                                         |
      | StartTime  | datetime  | First sample record time                               |
      | Increment  | timedelta | Time between samples                                   |
      | Duration   | float     | Total logged time in seconds                           |

    • Cpst File:

      | Key       | Type | Value             |
      |-----------|------|-------------------|
      | Name      | str  | File type name    |
      | FurnaceId | int  | Furnace Id number |

    • Hist File:

      | Key          | Type     | Value              |
      |--------------|----------|--------------------|
      | Name         | str      | File type name     |
      | HeatId       | int      | Heat Id number     |
      | ModelVersion | int      | File model version |
      | Sequence     | int      | Sequence number    |
      | Timestamp    | datetime | File timestamp     |

  • dataFrame : DataFrame
    A pandas.core.frame.DataFrame containing logged data
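As an illustration of how the Bin file metadata fits together, assuming a properties dict shaped like the table above (the values here are made up, not read from a real file):

```python
from datetime import datetime, timedelta

# Hypothetical Bin-file properties, using the same keys as the table above
properties = {
    "StartTime": datetime(2020, 2, 15, 13, 30),  # first sample record time
    "Increment": timedelta(seconds=2),           # e.g. the 2 Second Log
    "Duration": 900.0,                           # total logged time in seconds
}

# Sample count and end time implied by the metadata
n_samples = int(properties["Duration"] / properties["Increment"].total_seconds())
end_time = properties["StartTime"] + timedelta(seconds=properties["Duration"])
print(n_samples, end_time)  # 450 2020-02-15 13:45:00
```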


PX3_Bin Methods

The file_to_df method returns a single pandas dataframe containing the data extracted from the provided file, or from all files in a path, optionally constrained by dates.

  • file_to_df ( path, file, start_time, end_time, verbose = False )

  • To process a single file, provide the absolute path in the file argument:

dataFrame = PX3_Bin.file_to_df(file = "C:\\Binaries\\20240403T002821Z$-4038953271967.bin")

  • To process several files, provide the directory path where the binaries are (binaries inside sub-directories are also included):

dataFrame = PX3_Bin.file_to_df(path = "C:\\Binaries\\")

  • You can constrain the binaries inside a directory (and sub-directories) by also providing a start date, or both a start date and an end date, as python datetime.datetime objects:

import datetime

time = datetime.datetime(2020, 2, 15, 13, 30)  # February 15th 2020, 1:30 PM

# This returns ALL the data available in the path from the given date to the current time
dataFrame = PX3_Bin.file_to_df(path = "C:\\Binaries\\", start_time=time)

import datetime

time_start = datetime.datetime(2020, 2, 15, 13, 30)  # February 15th 2020, 1:30 PM
time_end = datetime.datetime(2020, 2, 15, 13, 45)  # February 15th 2020, 1:45 PM

# This returns all the data available in the path within the given 15-minute window
dataFrame = PX3_Bin.file_to_df(path = "C:\\Binaries\\", start_time=time_start, end_time=time_end)
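The same windowing can also be applied after the fact with plain pandas, e.g. to narrow down a dataframe you have already loaded. The column name and frame shape below are assumptions for illustration, not guaranteed by the package:

```python
import pandas as pd

# Hypothetical frame shaped like file_to_df output: one row per sample
df = pd.DataFrame({
    "Timestamp": pd.date_range("2020-02-15 13:00", periods=60, freq="1min"),
    "value": range(60),
})

start = pd.Timestamp("2020-02-15 13:30")
end = pd.Timestamp("2020-02-15 13:45")

# Boolean mask selecting the 15-minute window (inclusive on both ends)
window = df[(df.Timestamp >= start) & (df.Timestamp <= end)]
print(len(window))  # 16
```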

Tested with package versions

  • pythonnet 2.5.1
  • pandas 1.1.0
