A conversion tool for TensorFlow ANNs to CZModel
Project description
This project provides simple-to-use conversion tools that generate a CZModel file from a TensorFlow model residing in memory or on disk, so that the model can be used in the ZEN Intellesis module starting with ZEN blue >=3.2 and ZEN Core >3.0.
This version of czmodel produces the following model version: 3.0.0
If you encounter a version mismatch when importing a model into ZEN, please check for the correct version of this package.
System setup
The current version of this toolbox only requires a fresh Python 3.x installation. It was tested with Python 3.7 on Windows.
Model conversion
The toolbox provides a convert module that features all supported conversion strategies. It currently supports converting Keras models that reside in memory or are stored on disk with a corresponding metadata JSON file.
Keras models in memory
The toolbox also provides functionality that can be imported, e.g. into the training script used to fit a Keras model. The function is accessible by running:
from czmodel.convert import convert_from_model_spec
It accepts a tensorflow.keras.Model that will be exported to SavedModel format and at the same time wrapped into a CZModel file to be compatible with the Intellesis infrastructure.
To provide the metadata, the toolbox offers a ModelSpec class that must be filled with the model and a ModelMetadata instance containing the required information described in the specification (see Model Metadata below).
A CZModel can be created from a Keras model with the following three steps.
Creating a model metadata class
To export a CZModel file, several pieces of metadata must be provided through a ModelMetadata instance.
from czmodel.model_metadata import ModelMetadata
model_metadata = ModelMetadata.from_params(name='DNNModelFromKeras',
color_handling='ConvertToMonochrome',
pixel_type='Gray16',
classes=["Background", "Interesting Object", "Foreground"],
border_size=90,
license_file="C:\\some\\path\\to\\a\\LICENSE.txt")
Creating a model specification
The model and its corresponding metadata are now wrapped into a ModelSpec object.
from czmodel.model_metadata import ModelSpec
model_spec = ModelSpec(model=model, model_metadata=model_metadata)
Converting the model
The actual model conversion is finally performed with the ModelSpec object and the output path and name of the CZModel file.
from czmodel.convert import convert_from_model_spec
convert_from_model_spec(model_spec=model_spec, output_path='some/path', output_name='some_file_name')
Exported TensorFlow models
To convert an exported TensorFlow model, the model and the provided metadata need to comply with the specification (see ANN Model Specification below).
The actual conversion is triggered by either calling:
from czmodel.convert import convert_from_json_spec
convert_from_json_spec('Path to JSON file', 'Output path', 'Model Name')
or by using the command line interface of the savedmodel2czmodel script:
savedmodel2czmodel path/to/model_spec.json output/path/ output_name
Adding pre-processing layers
Both convert_from_json_spec and convert_from_model_spec additionally allow specifying the following optional parameters:
- spatial_dims: Set new spatial dimensions for the new input node of the model. This parameter is expected to contain the new height and width in that order. Note: The spatial input dimensions can only be changed in ANN architectures that are invariant to the spatial dimensions of the input, e.g. FCNs.
- preprocessing: One or more pre-processing layers that will be prepended to the deployed model. A pre-processing layer must be derived from the tensorflow.keras.layers.Layer class.
While ANN models are often trained on images in RGB(A) space, the ZEN infrastructure requires models inside a CZModel to expect inputs in BGR(A) color space. This toolbox offers pre-processing layers that convert the color space before the input is passed to the actual model. The following code shows how to add an RGB to BGR conversion layer to a model and set its spatial input dimensions to 512x512.
from czmodel.util.preprocessing import RgbToBgr
# Define dimensions and pre-processing
spatial_dims = 512, 512 # Optional: Target spatial dimensions of the model
preprocessing = RgbToBgr() # Optional: Pre-Processing layers to be prepended to the model. Can be a list of layers.
# Perform conversion
convert_from_model_spec(model_spec=model_spec, output_path='some/path', output_name='some_file_name', spatial_dims=spatial_dims, preprocessing=preprocessing)
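Since any layer derived from tensorflow.keras.layers.Layer can be passed via the preprocessing parameter, custom pre-processing can be prepended in the same way. The following sketch continues the example above; the PerImageStandardization layer is a hypothetical illustration and not part of czmodel:
import tensorflow as tf
from czmodel.util.preprocessing import RgbToBgr
from czmodel.convert import convert_from_model_spec

class PerImageStandardization(tf.keras.layers.Layer):
    """Hypothetical example layer: scales each image to zero mean and unit variance."""
    def call(self, inputs):
        # Apply the standardization to every image in the batch individually.
        return tf.map_fn(tf.image.per_image_standardization, inputs)

# A list of layers can be passed; here the color conversion and the custom layer are prepended.
preprocessing = [RgbToBgr(), PerImageStandardization()]
convert_from_model_spec(model_spec=model_spec,
                        output_path='some/path',
                        output_name='some_file_name',
                        spatial_dims=(512, 512),
                        preprocessing=preprocessing)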
ANN Model Specification
This section specifies the requirements for an artificial neural network (ANN) model and the additionally required metadata to enable execution of the model inside the ZEN Intellesis infrastructure starting with ZEN blue >=3.2 and ZEN Core >3.0.
Core network structure and file format
To be usable in the SegmentationService infrastructure, a neural network model must comply with the rules specified below.
- The model must be provided as a TensorFlow SavedModel.
- All operations in the contained execution graph must be supported by TensorFlow 2.0.0.
- The model currently must provide one input and one output node. Multiple inputs and outputs are not supported.
- The shape of the input node must have 4 dimensions where the first dimension specifies the batch size, the second and third dimensions specify the height and width of the expected input image, and the fourth dimension represents the number of color channels.
- The batch dimension of the input node must be undefined or 1.
- The spatial dimension of the input image implicitly defines the maximum tile size of the model. Our infrastructure will ensure that all input images exactly match the specified dimensions. The spatial dimensions of the input node must be such that the model can be evaluated on the minimum required hardware (currently 8GB GPU memory) without running out of memory.
- The output node must have the same shape as the input node except for the last dimension that represents the class probabilities. The size of the last dimension of the output must be the number of classes. The values of the output tensor must represent the class probabilities for each pixel. I.e. values must lie in the [0...1] range and summing the output over the last dimension must produce an all-1 tensor (within numeric accuracy). Softmax activation can be used to turn logits into such probabilities.
- All types of pre-processing and post-processing (except the currently supported Conditional Random Field post-processing), e.g. normalization, standardization, down-sampling etc., must be included in the provided TensorFlow model so that no further action by the inference engine is needed before or after inference to obtain the expected results.
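To illustrate these rules, the following sketch builds a minimal fully convolutional Keras model that complies with them: a 4-dimensional input with undefined batch size, an output of the same spatial shape whose last dimension equals the number of classes, and a softmax activation yielding per-pixel class probabilities. The architecture, dimensions and paths are purely illustrative and not a recommendation:
import tensorflow as tf

NUM_CLASSES = 3  # e.g. Background, Interesting Object, Foreground

# Input shape (batch, height, width, channels) with undefined batch size.
inputs = tf.keras.Input(shape=(512, 512, 1))
x = tf.keras.layers.Conv2D(16, 3, padding='same', activation='relu')(inputs)
x = tf.keras.layers.Conv2D(16, 3, padding='same', activation='relu')(x)
# Softmax over the last dimension produces per-pixel class probabilities
# that sum to 1, matching the required output format.
outputs = tf.keras.layers.Conv2D(NUM_CLASSES, 1, activation='softmax')(x)
model = tf.keras.Model(inputs=inputs, outputs=outputs)

# Export the model as a TensorFlow SavedModel (path is a placeholder).
model.save('path/to/saved_model', save_format='tf')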
Model Metadata
Executing an ANN model within the Intellesis infrastructure requires additional meta information that needs to be provided along with the serialized model specified above (see Core network structure and file format). Meta information for the ANN model must be provided in a separate JSON file adhering to RFC 8259 that must contain the following attributes:
- BorderSize (Type: int): For Intellesis models this attribute defines the size of the border that needs to be added to an input image so that no border effects are visible in the required area of the generated segmentation mask. For deep architectures this value can become infeasibly large; in that case the border size must be chosen such that the remaining border effects are "acceptable" in the ANN model creator's opinion.
- ColorHandling (Type: string): Specifies how color (RGB and RGBA) pixel data are converted to one or more channels of scalar pixel data. Possible values are:
- ConvertToMonochrome (Converts color to gray scale)
- SplitRgb (Keeps the pixel representation in RGB space)
- PixelType (Type: string): The expected input type of the model. Possible values are:
- Gray8: 8 bit unsigned
- Gray16: 16 bit unsigned
- Gray32Float: 4 byte IEEE float
- Bgr24: 8 bit triples, representing the color channels Blue, Green and Red
- Bgr48: 16 bit triples, representing the color channels Blue, Green and Red
- Bgr96Float: Triple of 4 byte IEEE float, representing the color channels Blue, Green and Red
- Bgra32: 8 bit triples followed by an alpha (transparency) channel
- Gray64ComplexFloat: 2 x 4 byte IEEE float, representing real and imaginary part of a complex number
- Bgr192ComplexFloat: A triple of 2 x 4 byte IEEE float, representing real and imaginary part of a complex number, for the color channels Blue, Green and Red
- Classes (Type: array, Value type: string): A list of class names corresponding to the output dimensions of the predicted segmentation mask. If the last dimension of the prediction has shape n the provided list must be of length n.
- ModelPath (Type: string): The path to the exported neural network model. Can be absolute or relative to the JSON file.
The file may also contain the following optional attributes:
- TestImageFile (Type: string): The path to a test image in a format supported by ZEN. This image is used for basic validation of the converted model inside ZEN. Can be absolute or relative to the JSON file.
- LicenseFile (Type: string): The path to a license file that is added to the generated CZModel. Can be absolute or relative to the JSON file.
JSON files can contain escape sequences; backslash characters (\) in paths must be escaped as \\.
The following code snippet shows an example for a valid metadata file:
{
"BorderSize": 90,
"ColorHandling": "ConvertToMonochrome",
"PixelType": "Gray16",
"Classes": ["Background", "Interesting Object", "Foreground"],
"ModelPath": "C:\\tf\\saved\\model\\folder\\",
"TestImageFile": "C:\\test-image.png",
"LicenseFile": "C:\\LICENSE.txt"
}
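Assuming the snippet above is stored, for example, as model_spec.json, the conversion described earlier could then be run as follows (the path and model name are placeholders):
from czmodel.convert import convert_from_json_spec

convert_from_json_spec('C:\\path\\to\\model_spec.json', 'output/path', 'DNNModelFromJson')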