Intel OpenVINO extension for Hugging Face Transformers
Optimum OpenVINO
Optimum OpenVINO is an extension of the Optimum library that brings the Intel OpenVINO backend to Hugging Face Transformers 🤗.
This project provides multiple APIs to enable different tools:
- OpenVINO Runtime: an inference API for Hugging Face models
- NNCF: compression-aware training (quantization, pruning)
- POT: post-training optimization (TBD)
Install
Install only the runtime:
pip install optimum-openvino
or with all dependencies:
pip install optimum-openvino[all]
OpenVINO Runtime
This module provides an inference API for Hugging Face models. You can use models with PyTorch* or TensorFlow* pretrained weights, or load the native OpenVINO IR format (a pair of files, ov_model.xml and ov_model.bin).
To use the OpenVINO backend, import one of the AutoModel classes with the OV prefix and specify a model name or a local path in the from_pretrained method.
from optimum.intel.openvino import OVAutoModel
# PyTorch trained model with OpenVINO backend
model = OVAutoModel.from_pretrained(<name_or_path>, from_pt=True)
# TensorFlow trained model with OpenVINO backend
model = OVAutoModel.from_pretrained(<name_or_path>, from_tf=True)
# Initialize a model from OpenVINO IR
model = OVAutoModel.from_pretrained(<name_or_path>)
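As a minimal end-to-end sketch: the checkpoint name, the NumPy tensors, and the assumption that the OpenVINO-backed model is callable with the usual Transformers keyword inputs are illustrative here, not part of the documented API.

from transformers import AutoTokenizer
from optimum.intel.openvino import OVAutoModel

# Illustrative checkpoint name; any Hugging Face model with PyTorch weights should work
model_name = "bert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = OVAutoModel.from_pretrained(model_name, from_pt=True)

# Assumption: the model accepts NumPy inputs like a regular Transformers model
inputs = tokenizer("Hello, OpenVINO!", return_tensors="np")
outputs = model(**inputs)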
NNCF
NNCF is used for model training with compression features such as quantization and pruning applied. To enable NNCF in your training pipeline, follow these steps:
- Import NNCFAutoConfig:
from optimum.intel.nncf import NNCFAutoConfig
NOTE: NNCFAutoConfig must be imported before transformers for the integration to work.
- Initialize a config from a .json file:
nncf_config = NNCFAutoConfig.from_json(training_args.nncf_config)
- Pass the config to the Trainer object. For example:
trainer = QuestionAnsweringTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset if training_args.do_train else None,
    eval_dataset=eval_dataset if training_args.do_eval else None,
    eval_examples=eval_examples if training_args.do_eval else None,
    tokenizer=tokenizer,
    data_collator=data_collator,
    post_process_function=post_processing_function,
    compute_metrics=compute_metrics,
    nncf_config=nncf_config,
)
Training examples can be found in the Transformers library.
NNCF configs are published in the config folder. Add --nncf_config with the path to the corresponding config when training your model. More command line examples can be found here. For example:
python examples/pytorch/token-classification/run_ner.py --model_name_or_path bert-base-cased --dataset_name conll2003 --output_dir bert_base_cased_conll_int8 --do_train --do_eval --save_strategy epoch --evaluation_strategy epoch --nncf_config nncf_bert_config_conll.json
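For orientation, below is a rough sketch of the general shape of such a config, written as a Python dict and saved to JSON. The field names and values are assumptions for illustration only; use the published configs such as nncf_bert_config_conll.json as the reference.

import json

# Rough sketch of an NNCF quantization config for a BERT-style model.
# The fields below (input_info, compression, sample sizes) are illustrative assumptions.
nncf_config_dict = {
    "input_info": [
        {"sample_size": [1, 128], "type": "long", "keyword": "input_ids"},
        {"sample_size": [1, 128], "type": "long", "keyword": "attention_mask"},
        {"sample_size": [1, 128], "type": "long", "keyword": "token_type_ids"},
    ],
    "compression": {"algorithm": "quantization"},
}

with open("my_nncf_config.json", "w") as f:
    json.dump(nncf_config_dict, f, indent=4)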
To use the NNCF component, install the package with the [nncf] or [all] extras:
pip install optimum-openvino[nncf]
POT
TBD