
A Python library for neural network transfer learning.

Project description

About

We provide a set of transfer-learning-based model training and evaluation functions that use ImageNet pretrained weights.
The EfficientNet family of models is supported. There is always a trade-off between model size and accuracy. Our guideline is as follows:
for TensorFlow.js (tfjs) apps, use EfficientNetB0 or EfficientNetB1; for TensorFlow Lite (tflite) apps, use EfficientNetB2 to B4; for desktop apps, use EfficientNetB5 and above.
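
As a rough illustration of this guideline only (the helper below is hypothetical and not part of the tlearner API), the mapping could be written as:

    # Hypothetical helper -- encodes the guideline above; not part of tlearner's API.
    def suggest_backbone(target: str) -> str:
        """Map a deployment target to a suggested EfficientNet variant."""
        suggestions = {
            "tfjs": "EfficientNetB0",     # or EfficientNetB1
            "tflite": "EfficientNetB2",   # B2 to B4 are reasonable choices
            "desktop": "EfficientNetB5",  # B5 and above
        }
        return suggestions[target]

    print(suggest_backbone("tflite"))  # EfficientNetB2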

The following table is from the Keras website:

| Model | Size (MB) | Top-1 Accuracy | Top-5 Accuracy | Parameters | Depth | Time (ms) per inference step (CPU) | Time (ms) per inference step (GPU) |
|---|---|---|---|---|---|---|---|
| Xception | 88 | 79.0% | 94.5% | 22.9M | 81 | 109.4 | 8.1 |
| VGG16 | 528 | 71.3% | 90.1% | 138.4M | 16 | 69.5 | 4.2 |
| VGG19 | 549 | 71.3% | 90.0% | 143.7M | 19 | 84.8 | 4.4 |
| ResNet50 | 98 | 74.9% | 92.1% | 25.6M | 107 | 58.2 | 4.6 |
| ResNet50V2 | 98 | 76.0% | 93.0% | 25.6M | 103 | 45.6 | 4.4 |
| ResNet101 | 171 | 76.4% | 92.8% | 44.7M | 209 | 89.6 | 5.2 |
| ResNet101V2 | 171 | 77.2% | 93.8% | 44.7M | 205 | 72.7 | 5.4 |
| ResNet152 | 232 | 76.6% | 93.1% | 60.4M | 311 | 127.4 | 6.5 |
| ResNet152V2 | 232 | 78.0% | 94.2% | 60.4M | 307 | 107.5 | 6.6 |
| InceptionV3 | 92 | 77.9% | 93.7% | 23.9M | 189 | 42.2 | 6.9 |
| InceptionResNetV2 | 215 | 80.3% | 95.3% | 55.9M | 449 | 130.2 | 10.0 |
| MobileNet | 16 | 70.4% | 89.5% | 4.3M | 55 | 22.6 | 3.4 |
| MobileNetV2 | 14 | 71.3% | 90.1% | 3.5M | 105 | 25.9 | 3.8 |
| DenseNet121 | 33 | 75.0% | 92.3% | 8.1M | 242 | 77.1 | 5.4 |
| DenseNet169 | 57 | 76.2% | 93.2% | 14.3M | 338 | 96.4 | 6.3 |
| DenseNet201 | 80 | 77.3% | 93.6% | 20.2M | 402 | 127.2 | 6.7 |
| NASNetMobile | 23 | 74.4% | 91.9% | 5.3M | 389 | 27.0 | 6.7 |
| NASNetLarge | 343 | 82.5% | 96.0% | 88.9M | 533 | 344.5 | 20.0 |
| EfficientNetB0 | 29 | 77.1% | 93.3% | 5.3M | 132 | 46.0 | 4.9 |
| EfficientNetB1 | 31 | 79.1% | 94.4% | 7.9M | 186 | 60.2 | 5.6 |
| EfficientNetB2 | 36 | 80.1% | 94.9% | 9.2M | 186 | 80.8 | 6.5 |
| EfficientNetB3 | 48 | 81.6% | 95.7% | 12.3M | 210 | 140.0 | 8.8 |
| EfficientNetB4 | 75 | 82.9% | 96.4% | 19.5M | 258 | 308.3 | 15.1 |
| EfficientNetB5 | 118 | 83.6% | 96.7% | 30.6M | 312 | 579.2 | 25.3 |
| EfficientNetB6 | 166 | 84.0% | 96.8% | 43.3M | 360 | 958.1 | 40.4 |
| EfficientNetB7 | 256 | 84.3% | 97.0% | 66.7M | 438 | 1578.9 | 61.6 |
| EfficientNetV2B0 | 29 | 78.7% | 94.3% | 7.2M | - | - | - |
| EfficientNetV2B1 | 34 | 79.8% | 95.0% | 8.2M | - | - | - |
| EfficientNetV2B2 | 42 | 80.5% | 95.1% | 10.2M | - | - | - |
| EfficientNetV2B3 | 59 | 82.0% | 95.8% | 14.5M | - | - | - |
| EfficientNetV2S | 88 | 83.9% | 96.7% | 21.6M | - | - | - |
| EfficientNetV2M | 220 | 85.3% | 97.4% | 54.4M | - | - | - |
| EfficientNetV2L | 479 | 85.7% | 97.5% | 119.0M | - | - | - |

Install

pip install "tensorflow-gpu>=2"
pip install tlearner
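
To double-check the installation (optional; this is a generic TensorFlow 2.x sanity check, not a tlearner command), you can verify the TensorFlow version and GPU visibility:

    # Optional sanity check for the TensorFlow installation (not tlearner-specific).
    import tensorflow as tf

    print(tf.__version__)                          # should be 2.x
    print(tf.config.list_physical_devices("GPU"))  # non-empty if a GPU is visible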

How to use

  1. Initialize

    from tlearner.efficientnet import transfer_learner
    learner = transfer_learner("my_model", W=224)  # the trained model will be saved as my_model.h5

  2. Load data

    # Images in each subfolder of my_data_folder are treated as one class.
    # If PKL_PATH is specified, a pkl containing X_train, y_train, X_val, y_val, etc. will be generated.
    learner.load_dataset('my_data_folder', PKL_PATH="my_dataset.pkl")
    print(learner.class_names)  # print the class names

  3. Train a new model

    hist = learner.train_custom_model("EfficientNetB1", batch=8, epochs=[10, 0], optimizer="adam")  # any of EfficientNetB0-B7 can be used
    plot_history(hist)  # plot the training curves (a plain matplotlib alternative is sketched after this list)

  4. Evaluate

    learner.evaluate(N=30)  # predict the first N samples in X_val

  5. Predict

    learner.predict_file('my_image.jpg')

  6. Convert [Optional]

    learner.convert_to_tflite(v1=True)  # convert to a local TFLite model

  7. Load an existing model

    learner.load_best_model() # if you have an existing local model
    learner.get_best_model().summary() # print model architecture
    learner.plot_best_model() # save model architecture to a local image file
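
If plot_history is not available in your environment, and assuming the hist returned by train_custom_model is a standard keras.callbacks.History object, the training curves can also be drawn with plain matplotlib. A minimal sketch:

    import matplotlib.pyplot as plt

    # Minimal fallback for plot_history, assuming `hist` is a standard
    # keras.callbacks.History whose .history maps metric names to per-epoch values.
    def plot_training_curves(hist):
        for name, values in hist.history.items():
            plt.plot(values, label=name)
        plt.xlabel("epoch")
        plt.ylabel("value")
        plt.legend()
        plt.show()

    plot_training_curves(hist)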

Jupyter notebooks

Under /notebooks, we provide two examples. One is flower image classification; the other is fundus image classification.

Deployment

After training, you will get a Keras .h5 model file. You can further convert it to TFLite format, or to TensorFlow.js format (EfficientNet is not supported yet).
You can then deploy the model in mobile or browser-based apps.
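
As a sketch of how a converted TFLite model could be exercised locally with the stock tf.lite.Interpreter (the file name my_model.tflite is an assumption; use whatever convert_to_tflite actually produced):

    import numpy as np
    import tensorflow as tf

    # Run the converted model with the standard TFLite interpreter.
    # "my_model.tflite" is an assumed file name, not something tlearner guarantees.
    interpreter = tf.lite.Interpreter(model_path="my_model.tflite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Feed a dummy input with the expected shape and dtype, then read the class scores.
    dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
    interpreter.set_tensor(input_details[0]["index"], dummy)
    interpreter.invoke()
    print(interpreter.get_tensor(output_details[0]["index"]))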

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tlearner-0.1.0.tar.gz (13.8 kB)

Uploaded Source

Built Distribution

tlearner-0.1.0-py3-none-any.whl (11.3 kB)

Uploaded Python 3

File details

Details for the file tlearner-0.1.0.tar.gz.

File metadata

  • Download URL: tlearner-0.1.0.tar.gz
  • Upload date:
  • Size: 13.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.12

File hashes

Hashes for tlearner-0.1.0.tar.gz
Algorithm Hash digest
SHA256 58ac329edaf69478f407282cdcb23f9cc3a7eb88522bcaef44515ab7d4af8c3e
MD5 aa86b5e6f07e3a0e97cc29c55eea7967
BLAKE2b-256 81c5465defbc95a7d4471efb85b41c0fbec287cf5990b0c8915b76df5b93f906

See more details on using hashes here.

File details

Details for the file tlearner-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: tlearner-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 11.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.12

File hashes

Hashes for tlearner-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 8557dd2292035d97b7030af4ec9adbe00889c861600cf93f3777417161ffd7d7
MD5 79f43af7e80eb7fa730c4ee6672bc501
BLAKE2b-256 efa92363a725f4555117d3a43f9d0eaeeb107626da21f3e80dc19f49da5be7f0

See more details on using hashes here.
