
ONNX models for the transformers pipeline

Project description

transformer_onnx

transformer_onnx is a simple package that lets you run ONNX-exported models inside the transformers pipeline.

Install

pip install transformers_onnx

Convert a model into ONNX format

# for question-answering
python -m transformers.onnx --feature "question-answering" -m nlpconnect/roberta-base-squad2-nq ./qa/

# for text-classification or zero-shot classification
python -m transformers.onnx --feature "sequence-classification" -m cross-encoder/nli-roberta-base ./classifier/

# for feature-extraction (last_hidden_state or pooler_output)
python -m transformers.onnx --feature "default" -m nlpconnect/dpr-ctx_encoder_bert_uncased_L-2_H-128_A-2 ./feature/

# for token-classification
python -m transformers.onnx --feature "token-classification" -m dslim/bert-base-NER ./ner/

Use transformer_onnx to run a transformers pipeline

Question Answering

from transformers import pipeline, AutoTokenizer, AutoConfig
from transformer_onnx import OnnxModel

model = OnnxModel("qa/model.onnx", task="question-answering")
model.config = AutoConfig.from_pretrained("nlpconnect/roberta-base-squad2-nq")
tokenizer = AutoTokenizer.from_pretrained("nlpconnect/roberta-base-squad2-nq")
qa = pipeline("question-answering", model=model, tokenizer=tokenizer)

# Input data
context = ["Released on 6/03/2021",
           "Release delayed until the 11th of August",
           "Documentation can be found here: huggingface.com"]
# Define the queries
queries = ["What is the release date?", "Till when is it delayed?", "What is the URL?"]
qa(context=context, question=queries)
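Under the hood, the question-answering pipeline selects the answer span from the model's per-token start and end logits. The sketch below is a simplified, self-contained illustration of that selection step; the logits are made-up numbers, not real model output:

```python
# Simplified sketch of QA post-processing: pick the (start, end) token pair
# with the highest combined score, subject to start <= end and a length cap.
def best_span(start_logits, end_logits, max_len=15):
    best = (0, 0, float("-inf"))
    for i, s in enumerate(start_logits):
        for j, e in enumerate(end_logits):
            if i <= j < i + max_len and s + e > best[2]:
                best = (i, j, s + e)
    return best[0], best[1]

start = [0.1, 4.2, 0.3, 0.2]  # hypothetical per-token start scores
end = [0.0, 0.5, 3.9, 0.1]    # hypothetical per-token end scores
span = best_span(start, end)  # span covering tokens 1..2
```

The real pipeline also maps token indices back to character offsets in the context; this sketch stops at the token level.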

Text Classification / Zero-shot Classification

from transformers import pipeline, AutoTokenizer, AutoConfig
from transformer_onnx import OnnxModel

model = OnnxModel("classifier/model.onnx", task="sequence-classification")
model.config = AutoConfig.from_pretrained("cross-encoder/nli-roberta-base")
tokenizer = AutoTokenizer.from_pretrained("cross-encoder/nli-roberta-base")
zero_shot = pipeline("zero-shot-classification", model=model, tokenizer=tokenizer)
zero_shot(sequences=["Hello Hiiii", "I am playing football"], candidate_labels=["Greeting", "Sports"])
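Zero-shot classification reuses the NLI model: each candidate label is scored for entailment against the sequence, and the entailment scores are normalized with a softmax over the labels. A minimal sketch of that post-processing step, using made-up logits rather than real model output:

```python
import math

# Simplified sketch of zero-shot post-processing: softmax the per-label
# entailment logits so the label scores sum to 1.
def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

entailment_logits = {"Greeting": 3.1, "Sports": -0.4}  # hypothetical values
probs = softmax(list(entailment_logits.values()))
scores = dict(zip(entailment_logits, probs))
```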

Feature Extraction

from transformers import pipeline, AutoTokenizer, AutoConfig
from transformer_onnx import OnnxModel

# for last_hidden_state
model = OnnxModel("feature/model.onnx", task="last_hidden_state")
tokenizer = AutoTokenizer.from_pretrained("nlpconnect/dpr-ctx_encoder_bert_uncased_L-2_H-128_A-2")
feature_extractor = pipeline("feature-extraction", model=model, tokenizer=tokenizer)
feature_extractor(["Hello Hiiii", "I am playing football"])

# for pooler_output
model = OnnxModel("feature/model.onnx", task="pooler_output")
tokenizer = AutoTokenizer.from_pretrained("nlpconnect/dpr-ctx_encoder_bert_uncased_L-2_H-128_A-2")
feature_extractor = pipeline("feature-extraction", model=model, tokenizer=tokenizer)
feature_extractor(["Hello Hiiii", "I am playing football"])
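A common use for the extracted vectors is semantic comparison, typically via cosine similarity. A minimal sketch with toy vectors standing in for pooler_output embeddings:

```python
import math

# Cosine similarity between two feature vectors: dot product divided by
# the product of their norms. Values near 1 mean similar directions.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

v1 = [0.2, 0.7, 0.1]  # toy embedding, not real model output
v2 = [0.1, 0.8, 0.0]  # toy embedding, not real model output
sim = cosine(v1, v2)
```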

NER

from transformers import pipeline, AutoTokenizer, AutoConfig
from transformer_onnx import OnnxModel

model = OnnxModel("ner/model.onnx", task="token-classification")
model.config = AutoConfig.from_pretrained("dslim/bert-base-NER")
tokenizer = AutoTokenizer.from_pretrained("dslim/bert-base-NER")
ner = pipeline("token-classification", model=model, tokenizer=tokenizer)
ner("My name is transformers and I live in github/huggingface")
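The token-classification pipeline emits one tag per token; downstream code usually groups consecutive B-/I- tags into entities. A simplified sketch of that grouping step (tokens and tags below are illustrative, not actual bert-base-NER output):

```python
# Simplified BIO grouping: a B-X tag starts an entity, following I-X tags
# of the same type extend it, and anything else closes it.
def group_entities(tokens, tags):
    entities, current = [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                entities.append(current)
            current = (tag[2:], [tok])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(tok)
        else:
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return [(label, " ".join(toks)) for label, toks in entities]

tokens = ["John", "lives", "in", "New", "York"]
tags = ["B-PER", "O", "O", "B-LOC", "I-LOC"]  # hypothetical tags
grouped = group_entities(tokens, tags)
```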



Download files

Download the file for your platform.

Source Distribution

transformers_onnx-0.0.1.tar.gz (8.6 kB)

Uploaded Source

Built Distributions


transformers_onnx-0.0.1-py3.7.egg (7.1 kB)

Uploaded Egg

transformers_onnx-0.0.1-py3-none-any.whl (8.5 kB)

Uploaded Python 3

File details

Details for the file transformers_onnx-0.0.1.tar.gz.

File metadata

  • Download URL: transformers_onnx-0.0.1.tar.gz
  • Upload date:
  • Size: 8.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.7.13

File hashes

Hashes for transformers_onnx-0.0.1.tar.gz:

  • SHA256: 66b102efeff8304c1b1d293131b15c7a110374ab68b4b7182c12dd8cd051d7ce
  • MD5: 4f2c7680b16c71073af59cc26677dde2
  • BLAKE2b-256: 1da9a858a8bb9695725ff0d462c78566d4d4618696b1a11eb968d83087f2b157


File details

Details for the file transformers_onnx-0.0.1-py3.7.egg.

File metadata

  • Download URL: transformers_onnx-0.0.1-py3.7.egg
  • Upload date:
  • Size: 7.1 kB
  • Tags: Egg
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.7.13

File hashes

Hashes for transformers_onnx-0.0.1-py3.7.egg:

  • SHA256: 3bdc42ebfd62836a999bfc9f7e33913ae52bd0aee60ef8d5f213ebacc87ae61b
  • MD5: 075fa512253ff886a25c12dbf5c3f7f9
  • BLAKE2b-256: 840fc7ad1e45791e9df1a72cbc793a7faec3504eec8242c8ee374ec31559037a


File details

Details for the file transformers_onnx-0.0.1-py3-none-any.whl.

File metadata

File hashes

Hashes for transformers_onnx-0.0.1-py3-none-any.whl:

  • SHA256: c52b5ba92e1f44c4b0118e5dd7bee6585c5cba04164ba912389e64301053c438
  • MD5: a0b618927fe63ac89ea10c7dbb363bc6
  • BLAKE2b-256: c46bb7d1492ce2a62a4b0e606ff5f199a3ed39db356a1afac9e3f29cfa90ff4f

