Intel OpenVINO extension for Hugging Face Transformers
OpenVINO™ Integration with Optimum*
This module is an extension of the Optimum* library that brings an OpenVINO™ backend to Hugging Face Transformers* :hugs:.
This project provides APIs for two tools: the OpenVINO Runtime and NNCF.
Install
To install only the runtime:
pip install openvino-optimum
or with all dependencies (nncf and openvino-dev):
pip install openvino-optimum[all]
OpenVINO Runtime
This module provides an inference API for Hugging Face models. Models can be loaded from PyTorch* or TensorFlow* pretrained weights, or from the native OpenVINO IR format (a pair of files, ov_model.xml and ov_model.bin).
To use the OpenVINO backend, import one of the AutoModel classes with the OV prefix and specify a model name or local path in the from_pretrained method.
from optimum.intel.openvino import OVAutoModel
# PyTorch trained model with OpenVINO backend
model = OVAutoModel.from_pretrained(<name_or_path>, from_pt=True)
# TensorFlow trained model with OpenVINO backend
model = OVAutoModel.from_pretrained(<name_or_path>, from_tf=True)
# Initialize a model from OpenVINO IR
model = OVAutoModel.from_pretrained(<name_or_path>)
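Once loaded, the model can be used much like a regular Transformers model. Below is a minimal sketch of running inference; the model name, the numpy tensor format, and the call pattern are illustrative assumptions rather than documented API details:

from transformers import AutoTokenizer
from optimum.intel.openvino import OVAutoModel

# Illustrative checkpoint name; any PyTorch-trained Transformers model should work similarly
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = OVAutoModel.from_pretrained("bert-base-uncased", from_pt=True)

# Tokenize a sentence and run a forward pass through the OpenVINO backend
# (return_tensors="np" is an assumption; adjust to the tensor format the backend expects)
inputs = tokenizer("Hello, OpenVINO!", return_tensors="np")
outputs = model(**inputs)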
NNCF
NNCF is used during model training to apply compression features such as quantization and pruning. To enable NNCF in your training pipeline, follow these steps:
- Import NNCFAutoConfig:
from optimum.intel.nncf import NNCFAutoConfig
NOTE: NNCFAutoConfig must be imported before transformers for the integration to work.
- Initialize a config from a .json file:
nncf_config = NNCFAutoConfig.from_json(training_args.nncf_config)
- Pass the config to the Trainer object. For example:
model = AutoModelForQuestionAnswering.from_pretrained(<name_op_path>)
...
trainer = QuestionAnsweringTrainer(
model=model,
args=training_args,
train_dataset=train_dataset if training_args.do_train else None,
eval_dataset=eval_dataset if training_args.do_eval else None,
eval_examples=eval_examples if training_args.do_eval else None,
tokenizer=tokenizer,
data_collator=data_collator,
post_process_function=post_processing_function,
compute_metrics=compute_metrics,
nncf_config=nncf_config,
)
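From here, training proceeds through the standard Trainer API. A minimal sketch of the remaining calls (these are the usual transformers Trainer methods, shown only for completeness):

# Run compression-aware training; NNCF is applied via the nncf_config passed above
train_result = trainer.train()
trainer.save_model()

# Optionally evaluate the compressed model
if training_args.do_eval:
    metrics = trainer.evaluate()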
The NNCF module is independent of the Runtime module, so the model does not need to be wrapped in one of the OVAutoModel classes.
Training examples can be found in the Transformers library. NNCF configs are published in the config folder. Add --nncf_config with a path to the corresponding config when training your model. More command line examples can be found here.
python examples/pytorch/token-classification/run_ner.py --model_name_or_path bert-base-cased --dataset_name conll2003 --output_dir bert_base_cased_conll_int8 --do_train --do_eval --save_strategy epoch --evaluation_strategy epoch --nncf_config nncf_bert_config_conll.json
To use the NNCF component, install the package with the [nncf] or [all] extras:
pip install openvino-optimum[nncf]
See the Changelog page for details about module development.
*Other names and brands may be claimed as the property of others.