TensorSpace.js Python tool package
Project description
TensorSpace Converter
TensorSpace-Converter is a tool for generating a TensorSpace compatible model from a pre-trained model built with TensorFlow, Keras, or TensorFlow.js. TensorSpace-Converter extracts information from the hidden layers, matches intermediate data based on the configuration, and exports a preprocessed TensorSpace compatible model. TensorSpace-Converter simplifies the preprocessing and helps developers focus on developing the model visualization.
Motivation
TensorSpace is a JavaScript framework for 3D visualization of pre-trained deep learning models built with TensorFlow, Keras, or TensorFlow.js. Before applying TensorSpace to a pre-trained model, there is an important pipeline step: TensorSpace model preprocessing (check out this article for more information about TensorSpace preprocessing). TensorSpace-Converter is designed to simplify the model preprocessing and generate a TensorSpace compatible model quickly and easily.
Without TensorSpace-Converter, the developer needs to be an expert on both the pre-trained model and the machine learning library it was built with. For example, given a LeNet model pre-trained with Keras, the developer must know the structure of the LeNet network as well as how to implement a new multi-output model in Keras. With TensorSpace-Converter, only a few commands are needed to complete the preprocessing of a Keras pre-trained model.
As a component of the TensorSpace ecosystem, TensorSpace-Converter simplifies TensorSpace preprocessing and removes the workload of learning how to generate a TensorSpace compatible model by hand. As a development tool, TensorSpace-Converter helps separate the work of model training from model visualization.
Fig. 1 - TensorSpace LeNet
Fig. 2 - TensorSpace Preprocess Concept
Getting Started
Install
Install the tensorspacejs pip package:
$ pip install tensorspacejs
If tensorspacejs is installed successfully, you can check the TensorSpace-Converter version with the command:
$ tensorspacejs_converter -v
- Note: TensorSpace-Converter requires Python 3.6 and Node 11.3+. If other Python versions are pre-installed in your local environment, we suggest creating a new virtual environment. For example, the conda commands are:
$ conda create -n envname python=3.6
$ source activate envname
$ pip install tensorspacejs
Usage
The following introduces the usage and workflow for:
- how to use TensorSpace-Converter to convert a pre-trained model;
- how to apply TensorSpace to the converted model for model visualization.
An MNIST-digit tf.keras model is used as the example in this tutorial. The sample files used in the tutorial include the pre-trained tf.keras model, the TensorSpace-Converter script, and the TensorSpace visualization code.
Step 1: Use TensorSpace-Converter to preprocess the pre-trained model
$ tensorspacejs_converter \
--input_model_from="tensorflow" \
--input_model_format="tf_keras" \
--output_layer_names="conv_1,maxpool_1,conv_2,maxpool_2,dense_1,dense_2,softmax" \
./PATH/TO/MODEL/tf_keras_model.h5 \
./PATH/TO/SAVE/DIR
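The layer names passed to --output_layer_names must match the layer names in the pre-trained model. A minimal sketch for listing them, assuming TensorFlow with tf.keras is installed and using the sample model file from the command above:

import tensorflow as tf

# Load the pre-trained tf.keras model and print its layer names;
# pick the ones to visualize and pass them to --output_layer_names.
model = tf.keras.models.load_model("./PATH/TO/MODEL/tf_keras_model.h5")
print([layer.name for layer in model.layers])
# For the sample MNIST model these include conv_1, maxpool_1, conv_2,
# maxpool_2, dense_1, dense_2 and softmax.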
Step 2: Apply TensorSpace for model visualization
// "model" here is a TensorSpace model instance created beforehand;
// "/PATH/TO/MODEL/model.json" is the topology file generated by TensorSpace-Converter in Step 1.
model.load({
    type: "tensorflow",
    url: "/PATH/TO/MODEL/model.json"
});
Converter API
Sample TensorSpace-Converter script:
$ tensorspacejs_converter \
--input_model_from="XXX" \
--input_model_format="YYY" \
--output_layer_names="EEE1,EEE2,EEE3" \
input_path \
output_path
Arguments explanation:
Positional Arguments | Description
---|---
input_path | Full path of the saved model directory, session bundle directory, frozen model file, or TensorFlow Hub module handle or path.
output_path | Path for all output artifacts.
Options | Description
---|---
--input_model_from | The library used to train the pre-trained model: tensorflow for TensorFlow, keras for Keras, tfjs for TensorFlow.js.
--input_model_format | The format of the input model: tf_keras, tf_keras_separated, tf_saved or tf_frozen for TensorFlow models; topology_weights_combined or topology_weights_separated for Keras models (see the examples below).
--output_layer_names | The names of the layers to be visualized in TensorSpace, separated by commas ",". If not set, the converter will export all layers.
Converter Usage Examples
This section introduces the usage of TensorSpace-Converter for different types of pre-trained models from TensorFlow, Keras, and TensorFlow.js.
TensorFlow
A pre-trained model built with TensorFlow can be saved as a saved model, a frozen model, a combined HDF5 model, or a separated HDF5 model. Use different TensorSpace-Converter commands for the different TensorFlow model formats. TensorSpace-Converter collects data from the named tensors and uses their outputs as the inputs of the corresponding layers in the TensorSpace visualization. The developer collects all necessary tensor names and sets the name list as output_layer_names.
For a combined HDF5 model, topology and weights are saved in a combined HDF5 file xxx.h5. Set input_model_format to tf_keras. The sample command script should be like:
$ tensorspacejs_converter \
--input_model_from="tensorflow" \
--input_model_format="tf_keras" \
--output_layer_names="layer1Name,layer2Name,layer3Name" \
./PATH/TO/MODEL/xxx.h5 \
./PATH/TO/SAVE/DIR
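For reference, a combined HDF5 file of this kind is what tf.keras produces when the whole model is saved in one call. A minimal sketch, with a toy model standing in for your own pre-trained network:

import tensorflow as tf

# Toy model for illustration only; substitute your own pre-trained network.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,), name="dense_1")
])
model.save("./PATH/TO/MODEL/xxx.h5")  # topology + weights in a single .h5 file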
For a separated HDF5 model, topology and weights are saved in separate files: a topology file xxx.json and a weights file xxx.h5. Set input_model_format to tf_keras_separated. In this case, the model has two input files; join the two file paths with a comma (.json first, .h5 last) and set the combined path as the positional argument input_path. The sample command script should be like:
$ tensorspacejs_converter \
--input_model_from="tensorflow" \
--input_model_format="tf_keras_separated" \
--output_layer_names="layer1Name,layer2Name,layer3Name" \
./PATH/TO/MODEL/xxx.json,./PATH/TO/MODEL/eee.h5 \
./PATH/TO/SAVE/DIR
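For reference, the two separated files are typically produced by exporting topology and weights independently. A minimal sketch, with a toy model standing in for your own network and the file names from the command above:

import tensorflow as tf

# Toy model for illustration only.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,), name="dense_1")
])

# Topology goes to xxx.json, weights go to eee.h5; the two paths are then
# joined with a comma (.json first, .h5 last) as the converter's input_path.
with open("./PATH/TO/MODEL/xxx.json", "w") as f:
    f.write(model.to_json())
model.save_weights("./PATH/TO/MODEL/eee.h5")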
For a TensorFlow saved model, set input_model_format to tf_saved. The sample command script should be like:
$ tensorspacejs_converter \
--input_model_from="tensorflow" \
--input_model_format="tf_saved" \
--output_layer_names="layer1Name,layer2Name,layer3Name" \
./PATH/TO/SAVED/MODEL/FOLDER \
./PATH/TO/SAVE/DIR
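For reference, a saved model directory of this kind can also be exported from a tf.keras model. A minimal sketch, assuming TensorFlow 2.x (the export API differs in TensorFlow 1.x), with a toy model standing in for your own network:

import tensorflow as tf

# Toy model for illustration only; substitute your own pre-trained network.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,), name="dense_1")
])

# Export a TensorFlow saved model directory, which becomes the converter's input_path.
tf.saved_model.save(model, "./PATH/TO/SAVED/MODEL/FOLDER")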
For a TensorFlow frozen model, set input_model_format to tf_frozen. The sample command script should be like:
$ tensorspacejs_converter \
--input_model_from="tensorflow" \
--input_model_format="tf_frozen" \
--output_layer_names="layer1Name,layer2Name,layer3Name" \
./PATH/TO/MODEL/xxx.pb \
./PATH/TO/SAVE/DIR
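For a frozen model, the names passed to --output_layer_names are graph node (tensor) names. A minimal sketch for listing the node names contained in a frozen xxx.pb, assuming a TensorFlow version that provides tf.compat.v1:

import tensorflow as tf

# Parse the frozen GraphDef and print its node names; pick the tensors to
# visualize and pass their names to --output_layer_names.
graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile("./PATH/TO/MODEL/xxx.pb", "rb") as f:
    graph_def.ParseFromString(f.read())
for node in graph_def.node:
    print(node.name)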
Check out this TensorFlow tutorial for more practical usage of TensorSpace-Converter with TensorFlow models.
Keras
A pre-trained model built with Keras may come in two formats: topology and weights saved in a single HDF5 file, or topology and weights saved in separate files. Use a different TensorSpace-Converter command for each of these two Keras formats.
For a Keras model with topology and weights saved in a single HDF5 file, i.e. xxx.h5, set input_model_format to topology_weights_combined. The sample command script should be like:
$ tensorspacejs_converter \
--input_model_from="keras" \
--input_model_format="topology_weights_combined" \
--output_layer_names="layer1Name,layer2Name,layer3Name" \
./PATH/TO/MODEL/xxx.h5 \
./PATH/TO/SAVE/DIR
For a Keras model with topology and weights saved in separate files, i.e. a topology file xxx.json and a weights file xxx.h5, set input_model_format to topology_weights_separated. In this case, the model has two input files; join the two file paths with a comma (.json first, .h5 last) and set the combined path as the positional argument input_path. The sample command script should be like:
$ tensorspacejs_converter \
--input_model_from="keras" \
--input_model_format="topology_weights_separated" \
--output_layer_names="layer1Name,layer2Name,layer3Name" \
./PATH/TO/MODEL/xxx.json,./PATH/TO/MODEL/eee.h5 \
./PATH/TO/SAVE/DIR
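Before converting, it can help to confirm the layer names contained in the separated Keras files. A minimal sketch, assuming the standalone keras package is installed and using the file names from the command above:

from keras.models import model_from_json

# Rebuild the model from the separated files and list its layer names
# so they can be passed to --output_layer_names.
with open("./PATH/TO/MODEL/xxx.json") as f:
    model = model_from_json(f.read())
model.load_weights("./PATH/TO/MODEL/eee.h5")
print([layer.name for layer in model.layers])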
Check out this Keras tutorial for more practical usage of TensorSpace-Converter with Keras models.
TensorFlow.js
A pre-trained model built by TensorFlow.js, may have a topology file xxx.json
and a weights file xxx.weight.bin
. To converter the model with TensorSpace-Converter, the two files should be put in the same folder and set topology file's path to input_path
. The sample command script should be like:
$ tensorspacejs_converter \
--input_model_from="tfjs" \
--output_layer_names="layer1Name,layer2Name,layer3Name" \
./PATH/TO/MODEL/xxx.json \
./PATH/TO/SAVE/DIR
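For reference, TensorFlow.js model files of this kind (a topology .json file plus binary weight files) can be produced from a Keras model with the tensorflowjs Python package. A minimal sketch, assuming tensorflowjs is installed, with a toy model standing in for your own network:

import tensorflow as tf
import tensorflowjs as tfjs

# Toy model for illustration only; substitute your own pre-trained network.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,), name="dense_1")
])

# Writes the topology .json file and the weight .bin file(s) into the folder;
# pass the .json file's path to TensorSpace-Converter as input_path.
tfjs.converters.save_keras_model(model, "./PATH/TO/MODEL")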
Check out this TensorFlow.js tutorial for more practical usage of TensorSpace-Converter with TensorFlow.js models.
Contributors
Thanks goes to these wonderful people (emoji key):
- Chenhua Zhu 💻 🎨 📖 💡
- syt123450 💻 🎨 📖 💡
Contact
If you have any issues or questions, feel free to contact us via:
- Email: tensorspaceteam@gmail.com
- GitHub Issues: create issue
- Slack: #questions
- Gitter: #Lobby
License
Hashes for tensorspacejs-0.0.20-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | d95d0ae7d86a3c0ecc3a4649cbb477eccf46e68b71558f65aaba16e182cae317
MD5 | 91e2606dba1e5721e7cd347725ee62b4
BLAKE2b-256 | 2281c375adf771b9e9e6361870a2cc7d967afc313780516acce2290c161106d7