TensorFlow Model Server
tfserver is an example of serving TensorFlow models with the Skitai App Engine.
It can serve models through gRPC and a JSON RESTful API using the Atila WSGI container.
This project is inspired by issue #176.
I’m so sorry about this soulless manual.
- from version 0.3, TensorFlow 2+ compatible (deprecated, no longer supported)
- from version 0.4, TensorFlow 2+ only
Building and Deploying a Model
It mostly uses tensorflow.keras and dnn.
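As a minimal sketch of the model-building step, here is a tf.keras classifier being assembled and exported; the layer sizes, feature/class counts, and save path are illustrative assumptions, not values prescribed by tfserver.

```python
# Minimal sketch: build and export a tf.keras model for serving.
# n_features, n_classes, layer sizes, and the save path are all
# illustrative assumptions -- adapt them to your own data and layout.
import tensorflow as tf

def build_model(n_features=4, n_classes=3):
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(n_features,)),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model

if __name__ == "__main__":
    model = build_model()
    # model.fit(x_train, y_train, epochs=10)  # train on your own data
    # model.save("exported/my_model")         # export for the server to load
```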
Creating Your Own TensorFlow Server
You can learn how to serve a gRPC service and create your own APIs.
APIs for managing models and basic inference.
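A JSON RESTful inference API like the one described above can be called with nothing but the standard library. The sketch below is hypothetical: the `/predict` path, the `"x"` payload key, and the port are assumptions for illustration, not tfserver's actual routes, so check the URLs your app mounts.

```python
# Hypothetical client sketch for a JSON RESTful inference endpoint.
# The "/predict" route and {"x": [...]} payload shape are assumptions.
import json
import urllib.request

def build_predict_request(base_url, features):
    """Build a POST request carrying one feature vector as JSON."""
    payload = json.dumps({"x": [features]}).encode("utf-8")
    return urllib.request.Request(
        base_url + "/predict",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_predict_request("http://localhost:5000", [5.1, 3.5, 1.4, 0.2])
# with urllib.request.urlopen(req) as resp:   # requires a running server
#     result = json.loads(resp.read())
```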
Release History
- 0.4 (2021. 4)
- upgrade for tensorflow 2
- 0.3 (2020. 6. 28)
- add model management APIs
- reactivate project and compatible with TF2+
- 0.2 (2020. 6. 26): integrated with dnn 0.3
- 0.1b8 (2018. 4. 13): fix gRPC trailers; a Skitai upgrade is required
- 0.1b6 (2018. 3. 19): found to work only with grpcio 1.4.0
- 0.1b3 (2018. 2. 4): add @app.umounted decorator for clearing resources
- 0.1b2: remove self.tfsess.run (tf.global_variables_initializer())
- 0.1b1 (2018. 1. 28): Beta release
- 0.1a (2018. 1. 4): Alpha release
|Filename|Size|File type|Python version|
|tfserver-0.4.0-py3-none-any.whl|28.8 kB|Wheel|py3|