Transformers kit - Multi-task QA/Tagging/Multi-label Multi-class Classification/Generation with BERT/ALBERT/T5

Project description





TFKit lets everyone apply the transformer architecture to many tasks and models with only small configuration changes.
It also supports multi-task, multi-model learning, and new datasets and tasks can be added through simple modifications.

Features

  • One-click replacement of different pre-trained models
  • Support for multi-model and multi-task learning (see the sketch after this list)
  • Multi-label and multi-class classification
  • Unified input format across tasks
  • Separation of data reading from model architecture
  • Support for various loss functions and metrics
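
A minimal multi-task sketch, assuming two datasets (the taskA/taskB paths and tags below are placeholders; the flags are the ones documented under Overview):

tfkit-train --batch 20 \
--train ./taskA-train.csv ./taskB-train.csv \
--test ./taskA-test.csv ./taskB-test.csv \
--model clas tagRow \
--tag taskA taskB \
--config voidful/albert_chinese_small \
--savedir ./multitask_model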

Related tools

  • Model list: supports BERT/GPT/GPT-2/XLM/XLNet/RoBERTa/CTRL/ALBERT/...
  • NLPrep: download and preprocess data in one line
  • nlp2go: create a demo API as quickly as possible

Documentation

Learn more from the docs.

Quick Start

Installing via pip

pip install tfkit
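
To confirm the install, print the CLI usage (the same usage text is reproduced under Overview below):

tfkit-train -h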

Running TFKit to train an NER model

Install nlprep and nlp2go:

pip install nlprep nlp2go -U

Download the dataset using nlprep:

nlprep --dataset tag_clner --outdir ./clner_row --util s2t

Train a model with ALBERT:

tfkit-train --batch 20 \
--epoch 5 \
--lr 5e-5 \
--train ./clner_row/clner-train.csv \
--test ./clner_row/clner-test.csv \
--maxlen 512 \
--model tagRow \
--savedir ./albert_ner \
--config voidful/albert_chinese_small
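
Training writes one checkpoint per epoch under --savedir, so after five epochs the directory should contain numbered checkpoints (a sketch; the naming is inferred from the eval step below, which loads 3.pt):

ls ./albert_ner
# 1.pt  2.pt  3.pt  4.pt  5.pt  (one checkpoint per epoch, naming assumed)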

Evaluate the model:

tfkit-eval --model ./albert_ner/3.pt --valid ./clner_row/validation.csv --metric clas

Result:

Task : default report 
TASK:  default 0
                precision    recall  f1-score   support

    B_Abstract       0.00      0.00      0.00         1
    B_Location       1.00      1.00      1.00         1
      B_Metric       1.00      1.00      1.00         1
B_Organization       0.00      0.00      0.00         1
      B_Person       1.00      1.00      1.00         1
    B_Physical       0.00      0.00      0.00         1
       B_Thing       1.00      1.00      1.00         1
        B_Time       1.00      1.00      1.00         1
    I_Abstract       1.00      1.00      1.00         1
    I_Location       1.00      1.00      1.00         1
      I_Metric       1.00      1.00      1.00         1
I_Organization       0.00      0.00      0.00         1
      I_Person       1.00      1.00      1.00         1
    I_Physical       0.00      0.00      0.00         1
       I_Thing       1.00      1.00      1.00         1
        I_Time       1.00      1.00      1.00         1
             O       1.00      1.00      1.00         1

     micro avg       1.00      0.71      0.83        17
     macro avg       0.71      0.71      0.71        17
  weighted avg       0.71      0.71      0.71        17
   samples avg       1.00      0.71      0.83        17

Host a prediction service:

nlp2go --model ./albert_ner/3.pt --api_path ner
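
Once the service is up you can query it over HTTP. A hypothetical request, assuming the server listens on localhost:5000 and accepts a JSON body with an "input" field (neither default is documented here; check the nlp2go docs for the actual interface):

curl -X POST http://localhost:5000/ner \
-H "Content-Type: application/json" \
-d '{"input": "text to tag"}'  # hypothetical endpoint shape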

You can also try TFKit in Google Colab.

Overview

Train

$ tfkit-train
Run training

arguments:
  --train TRAIN [TRAIN ...]     train dataset path
  --test TEST [TEST ...]        test dataset path
  --config CONFIG               pretrained model name or path, e.g. distilbert-base-multilingual-cased, bert-base-multilingual-cased, or voidful/albert_chinese_small
  --model {once,twice,onebyone,clas,tagRow,tagCol,qa,onebyone-neg,onebyone-pos,onebyone-both} [{once,twice,onebyone,clas,tagRow,tagCol,qa,onebyone-neg,onebyone-pos,onebyone-both} ...]
                                model task
  --savedir SAVEDIR     model saving dir, default /checkpoints

optional arguments:
  -h, --help            show this help message and exit
  --batch BATCH         batch size, default 20
  --lr LR [LR ...]      learning rate, default 5e-5
  --epoch EPOCH         epoch, default 10
  --maxlen MAXLEN       max tokenized sequence length, default 368
  --lossdrop            loss dropping for text generation
  --tag TAG [TAG ...]   tag to identify each task in multi-task training
  --seed SEED           random seed, default 609
  --worker WORKER       number of workers for pre-processing, default 8
  --grad_accum          gradient accumulation, default 1
  --tensorboard         turn on TensorBoard graphing
  --resume RESUME       resume training
  --cache               cache training data
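
For example, gradient accumulation trades steps for a larger effective batch: --batch 10 with --grad_accum 4 accumulates gradients over 4 steps, for an effective batch size of 40. A sketch using the flags above (dataset paths are placeholders):

tfkit-train --batch 10 \
--grad_accum 4 \
--train ./train.csv \
--test ./test.csv \
--model clas \
--config voidful/albert_chinese_small \
--savedir ./clas_model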

Eval

$ tfkit-eval
Run evaluation on different benchmarks
arguments:
  --model MODEL             model path
  --metric {emf1,nlg,clas}  evaluation metric
  --valid VALID             evaluation data path

optional arguments:
  -h, --help            show this help message and exit
  --print               print each pair of evaluation data
  --enable_arg_panel    enable panel to input arguments
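
For example, to re-run the Quick Start evaluation while printing each prediction/target pair:

tfkit-eval --model ./albert_ner/3.pt \
--valid ./clner_row/validation.csv \
--metric clas \
--print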

Contributing

Thanks for your interest. There are many ways to contribute to this project. Get started here.

License

Icons reference

Icons modified from Freepik, www.flaticon.com
Icons modified from Nikita Golubev, www.flaticon.com


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tfkit-0.6.22.tar.gz (228.2 kB, Source)

Built Distributions

tfkit-0.6.22-py3.7.egg (178.3 kB, Source)

tfkit-0.6.22-py3-none-any.whl (80.0 kB, Python 3)

File details

Details for the file tfkit-0.6.22.tar.gz.

File metadata

  • Download URL: tfkit-0.6.22.tar.gz
  • Upload date:
  • Size: 228.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.0.3 requests-toolbelt/0.9.1 tqdm/4.48.2 CPython/3.7.8

File hashes

Hashes for tfkit-0.6.22.tar.gz

Algorithm    Hash digest
SHA256       0266fe26ba90dbc3d35d7f57a9a3e89193289d6ad5a1ec31d41525e4398d6403
MD5          e78e1b46537ff98bb239464305e241b3
BLAKE2b-256  cbc4322d538cb2d913a2e1b5bea5725fecfd9925e8cd0f82e1804948b3630abb

See more details on using hashes here.
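
To verify a downloaded archive against the SHA256 digest above (sha256sum on Linux; use shasum -a 256 on macOS):

sha256sum tfkit-0.6.22.tar.gz
# expected: 0266fe26ba90dbc3d35d7f57a9a3e89193289d6ad5a1ec31d41525e4398d6403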

File details

Details for the file tfkit-0.6.22-py3.7.egg.

File metadata

  • Download URL: tfkit-0.6.22-py3.7.egg
  • Upload date:
  • Size: 178.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.0.3 requests-toolbelt/0.9.1 tqdm/4.48.2 CPython/3.7.8

File hashes

Hashes for tfkit-0.6.22-py3.7.egg

Algorithm    Hash digest
SHA256       9b16feef02a12b0d9c4bb4242e079cecfc189713b0de4ac54259be30d52bec37
MD5          36a54c7406ba0f3fba13afe3282a8085
BLAKE2b-256  195248027985815290d86184da6b0b9487e4249061a18e126930b3265cc20f6d

See more details on using hashes here.

File details

Details for the file tfkit-0.6.22-py3-none-any.whl.

File metadata

  • Download URL: tfkit-0.6.22-py3-none-any.whl
  • Upload date:
  • Size: 80.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.0.3 requests-toolbelt/0.9.1 tqdm/4.48.2 CPython/3.7.8

File hashes

Hashes for tfkit-0.6.22-py3-none-any.whl

Algorithm    Hash digest
SHA256       f7808c4382953394a046e302a03c7cf454155c10f0fc94c959b56e43ba0fc102
MD5          56060d41969417962c4440a042751767
BLAKE2b-256  6d5084ec71d4c188f842f810d735ea6fc597879fa10adb80ec73d9a9ee574ee2

See more details on using hashes here.
