# transferit

Train a model using transfer learning and serve it with TF Serving.
This repository contains a Python package that can help you train an image classification model using transfer learning and serve the model with TensorFlow Serving and Docker.
The repo contains sample images you can use to train a model that tells a certain kind of collectible playing card, namely Magic: The Gathering cards, apart from other objects. These, however, serve only as an example. You can easily use the `transferit` package to train models on your own data.
## Installation

Install from PyPI:

```shell
pip3 install transferit
```
To get the example data and run the examples, you will need to check out the repository:

```shell
git clone git@github.com:sorenlind/transferit.git
```
## Quick start
- Clone the repo.
- Install the package, either from source or from PyPI.
- Download the Caltech 256 dataset and store the `.tar` file inside the `data/raw` folder.
- Run the `01 Create model.ipynb` notebook in the `notebooks` folder.
- Build the Docker image: `docker build -t transferit .`
- Start the container: `docker run -t --rm -p 8501:8501 transferit`
- Run the `02 API Example usage.ipynb` notebook in the `notebooks` folder.
## Introduction
The package provides a command line application, `transferit`, which can help you prepare data for training and evaluation as well as train the model. Finally, it can also wrap or package the trained model in a way that makes it compatible with TensorFlow Serving. The `transferit` command has four subcommands, briefly explained below:
- `create-class`: Copies and resizes images in a specified folder (and its subfolders) to another folder. This is handy if, for example, you are training a binary image classifier and you have a library of various kinds of images to use for the negative class and a smaller set of custom images to use for the positive class. Running this command twice (once for the positive class and once for the negative class) can create a complete dataset for you.
- `split`: Creates a train / dev split from a dataset already prepared using the `create-class` subcommand.
- `train`: Trains the actual model using the training and dev data created with the `split` subcommand.
- `wrap`: Wraps a trained model to make it compatible with TensorFlow Serving and ready to be copied into a Docker image.
In addition to running the command line application, you can also call the relevant functions from your own Python code, for example from a Jupyter notebook. The `notebooks` folder contains a notebook, `01 Create model.ipynb`, which runs through the entire process of preparing data, training a model and wrapping it for serving. Note that before you can run the notebook, you will have to download the Caltech 256 dataset and store the `.tar` file inside the `data/raw` folder.
## Preparing images

```shell
transferit create-class ./raw/256_ObjectCategories/ ./prepared/full/negative --n-max 3000
transferit create-class ./raw/cards/ ./prepared/full/positive
```
## Creating train / dev split

```shell
transferit split ./prepared/full/ ./prepared/
```
## Training model

```shell
transferit train ./prepared/train/ ./prepared/dev/ ./models/naked/
```
## Wrapping up model for TF Serving

```shell
transferit wrap models/naked/models_best_loss.hdf5 ./models/wrapped/00000001/ -c Negative Positive
```
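TensorFlow Serving expects each version of a model to live in its own numbered subdirectory, which is why the target path ends in `00000001`. After wrapping, the output folder should look roughly like this (a sketch assuming the standard SavedModel layout):

```
models/wrapped/
└── 00000001/
    ├── saved_model.pb
    └── variables/
        ├── variables.data-00000-of-00001
        └── variables.index
```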
## Creating Dockerfile from the template

The repository contains a template for a Dockerfile called `Dockerfile.template`. You can create a copy of it simply called `Dockerfile` and edit it to match your setup. If you have been running the Jupyter notebook to train and wrap a model, you do not need to make any changes to the Dockerfile.
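For reference, here is a minimal sketch of what such a Dockerfile might look like. It is an illustration based on the standard `tensorflow/serving` base image, assuming the wrapped model sits in `models/wrapped` and is served under the name `transferit`; the repository's actual `Dockerfile.template` may differ in its details:

```dockerfile
# Sketch only: a minimal TF Serving image. Assumes the wrapped, versioned
# model is in ./models/wrapped and should be served as "transferit".
FROM tensorflow/serving

# Copy the wrapped model (including its numbered version folder) into the image.
COPY models/wrapped /models/transferit

# Tell TF Serving which model to load at startup.
ENV MODEL_NAME=transferit
```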
## Serving model locally using Docker

Build the image:

```shell
docker build -t transferit .
```
Once you have built the image, you can serve the model in a container as follows:

```shell
docker run -t --rm -p 8501:8501 transferit
```
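Once the container is up, you can check that the model has loaded before sending requests, for example via TensorFlow Serving's standard model status endpoint (a quick sketch using `requests`; this endpoint is part of TF Serving's REST API, not specific to this package):

```python
import requests

# Query TF Serving's model status endpoint to confirm the model is up.
status = requests.get("http://localhost:8501/v1/models/transferit")
print(status.status_code, status.json())
```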
You can then send classification requests as shown in the example below. The `notebooks` folder contains a notebook called `02 API Example usage.ipynb` with similar code that classifies two images from the dev dataset.
```python
import base64
import json

import requests

URL = "http://localhost:8501/v1/models/transferit:classify"
HEADERS = {"content-type": "application/json"}

# Path to the image to classify (placeholder, substitute your own file).
img_filename = "card.jpg"

# Read the image and base64-encode it for the JSON payload.
with open(img_filename, mode="rb") as file:
    img = file.read()
jpeg_bytes = base64.b64encode(img).decode("utf-8")

# Build a request for the model's default serving signature.
body = {
    "signature_name": "serving_default",
    "examples": [
        {
            "x": {"b64": jpeg_bytes},
        }
    ],
}

json_response = requests.post(URL, data=json.dumps(body), headers=HEADERS)
print(json_response.status_code)
```
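A status code of 200 means the request succeeded. To read the actual predictions, parse the JSON body. The sketch below assumes TensorFlow Serving's standard classify response format, in which each example maps to a list of label/score pairs under the `result` key:

```python
# Parse the label/score pairs for the first (and only) example.
# Assumes the standard classify response: {"result": [[[label, score], ...]]}.
predictions = json_response.json()["result"][0]
for label, score in predictions:
    print(f"{label}: {score:.4f}")
```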