Intel OpenVINO extension for Hugging Face Transformers
Optimum OpenVINO
Optimum OpenVINO is an extension of the Optimum library that brings the Intel OpenVINO backend to Hugging Face Transformers :hugs:.
This project provides multiple APIs to enable different tools:
Install
Install only the runtime:

```shell
pip install optimum-openvino
```

or with all dependencies:

```shell
pip install optimum-openvino[all]
```
OpenVINO Runtime
This module provides an inference API for Hugging Face models. You can load models from PyTorch* or TensorFlow* pretrained weights, or use the native OpenVINO IR format (a pair of files, `ov_model.xml` and `ov_model.bin`).
To use the OpenVINO backend, import one of the `AutoModel` classes with the `OV` prefix and pass a model name or local path to the `from_pretrained` method.
```python
from optimum.intel.openvino import OVAutoModel

# PyTorch trained model with OpenVINO backend
model = OVAutoModel.from_pretrained(<name_or_path>, from_pt=True)

# TensorFlow trained model with OpenVINO backend
model = OVAutoModel.from_pretrained(<name_or_path>, from_tf=True)

# Initialize a model from OpenVINO IR
model = OVAutoModel.from_pretrained(<name_or_path>)
```
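Since a model directory in IR format is expected to hold the `ov_model.xml` / `ov_model.bin` pair, a quick check of a local path before calling `from_pretrained` can save a confusing load error. The helper below is a hypothetical, stdlib-only sketch and not part of the library:

```python
from pathlib import Path

# Hypothetical helper: check whether a local directory contains an
# OpenVINO IR pair (ov_model.xml + ov_model.bin) before loading it.
def has_openvino_ir(model_dir: str) -> bool:
    path = Path(model_dir)
    return (path / "ov_model.xml").is_file() and (path / "ov_model.bin").is_file()
```

If the check fails, you would fall back to loading with `from_pt=True` or `from_tf=True` instead.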
NNCF
NNCF (Neural Network Compression Framework) optimizes models during training by applying features such as quantization and pruning. To enable NNCF in your training pipeline, follow these steps:
- Import `NNCFAutoConfig`:

```python
from optimum.intel.nncf import NNCFAutoConfig
```

  **NOTE:** `NNCFAutoConfig` must be imported before `transformers` for the integration to take effect.
- Initialize a config from a `.json` file:

```python
nncf_config = NNCFAutoConfig.from_json(training_args.nncf_config)
```
- Pass the config to the `Trainer` object. For example:

```python
trainer = QuestionAnsweringTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset if training_args.do_train else None,
    eval_dataset=eval_dataset if training_args.do_eval else None,
    eval_examples=eval_examples if training_args.do_eval else None,
    tokenizer=tokenizer,
    data_collator=data_collator,
    post_process_function=post_processing_function,
    compute_metrics=compute_metrics,
    nncf_config=nncf_config,
)
```
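The configs referenced in these steps are plain JSON documents. For illustration, a minimal quantization-style config might be produced as follows; the field names follow common NNCF examples and are an assumption here, not the authoritative schema:

```python
import json

# A minimal NNCF-style config, written as a Python dict and serialized to
# JSON. Field names follow common NNCF examples (an assumption, not the
# authoritative schema).
nncf_config_dict = {
    "input_info": {"sample_size": [1, 128]},       # shape of one sample model input
    "compression": {"algorithm": "quantization"},  # enable quantization during training
}

with open("nncf_quantization_config.json", "w") as f:
    json.dump(nncf_config_dict, f, indent=2)
```

A file produced this way could then be passed to the training script via `--nncf_config`.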
Training examples can be found in the Transformers library. NNCF configs are published in the config folder. Add `--nncf_config` with a path to the corresponding config when training your model. More command-line examples are available here.

```shell
python examples/pytorch/token-classification/run_ner.py \
  --model_name_or_path bert-base-cased \
  --dataset_name conll2003 \
  --output_dir bert_base_cased_conll_int8 \
  --do_train --do_eval \
  --save_strategy epoch \
  --evaluation_strategy epoch \
  --nncf_config nncf_bert_config_conll.json
```
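From the script's point of view, `--nncf_config` is an ordinary optional command-line argument. A stripped-down `argparse` sketch of how such a flag might be wired up (illustrative only, not the actual script's argument handling, which uses `HfArgumentParser`-style dataclasses):

```python
import argparse

# Illustrative parser: how a training script might expose an optional
# --nncf_config flag pointing at an NNCF JSON config.
parser = argparse.ArgumentParser(description="toy training-script argument sketch")
parser.add_argument("--model_name_or_path", required=True)
parser.add_argument(
    "--nncf_config",
    default=None,
    help="path to an NNCF JSON config; NNCF is disabled when omitted",
)

# Parse a sample command line like the one shown above.
args = parser.parse_args(
    ["--model_name_or_path", "bert-base-cased",
     "--nncf_config", "nncf_bert_config_conll.json"]
)
```

When the flag is omitted, `args.nncf_config` stays `None` and training proceeds without compression.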
To use the NNCF component, install the package with the `[nncf]` or `[all]` extras:

```shell
pip install optimum-openvino[nncf]
```
POT
TBD