# Export the OpenVINO model files to disk
When a forecaster is accelerated by OpenVINO, we can save the OpenVINO model files to disk by calling `export_openvino_file`. In this guide, we demonstrate in detail how to export the OpenVINO model files to disk, taking `TCNForecaster` and the nyc_taxi dataset as an example.
Before we begin, we need to install Chronos if it isn't already available. We choose PyTorch as the deep learning backend.
```python
!pip install --pre --upgrade bigdl-chronos[pytorch]
# install OpenVINO
!pip install openvino-dev
# fix conflict with google colab
!pip uninstall -y torchtext
```
Although Chronos supports inferencing on a cluster, the method to export model files can only be used when the forecaster is a non-distributed version. Only PyTorch-backend deep learning forecasters support OpenVINO acceleration.
Before the exporting process, a forecaster should be created and trained. The training process is introduced in detail in the previous guide Train forecaster on single node, so here we directly create and train a `TCNForecaster` on the nyc_taxi dataset.
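Training data for a forecaster is typically windowed into (lookback, horizon) samples before being fed to the model. The following is a minimal sketch of that windowing in plain Python; the helper name `make_windows` and the toy series are illustrative assumptions, not part of the Chronos API.

```python
def make_windows(series, lookback, horizon):
    """Split a 1-D series into (input window, target window) pairs."""
    pairs = []
    for i in range(len(series) - lookback - horizon + 1):
        x = series[i:i + lookback]                       # model input window
        y = series[i + lookback:i + lookback + horizon]  # values to predict
        pairs.append((x, y))
    return pairs

# toy series standing in for the nyc_taxi values
samples = make_windows(list(range(10)), lookback=4, horizon=2)
print(len(samples))   # 5 windows
print(samples[0])     # ([0, 1, 2, 3], [4, 5])
```

In the real guide, helpers such as `get_data()` below return data already shaped this way.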
## Export the OpenVINO model files
When a trained forecaster is ready and the forecaster is a non-distributed version, we provide the `export_openvino_file` method to export the OpenVINO model files to disk. The `export_openvino_file` method has 2 parameters: `dirname` is the location to save the OpenVINO files, and `quantized_dirname` is the location to save the quantized OpenVINO model files when you have a quantized forecaster.
```python
from pathlib import Path

# get data for training, testing and validating
train_data, test_data, val_data = get_data()
# get a trained forecaster
forecaster = get_trained_forecaster(train_data)
# quantize the forecaster
forecaster.quantize(calib_data=train_data, val_data=val_data, framework="openvino")

# create a directory to save openvino files
dirname = Path("ov_files")
dirname.mkdir(exist_ok=True)
ckpt_name = dirname / "fp32_openvino"
ckpt_name_q = dirname / "int_openvino"

# export the openvino files
forecaster.export_openvino_file(dirname=ckpt_name, quantized_dirname=ckpt_name_q)
```
When `export_openvino_file` is called, the forecaster will automatically build an OpenVINO session with default settings, so you can directly call this method without calling `predict_with_openvino` first. But when you want to export quantized OpenVINO model files, you should quantize the forecaster by calling `quantize` first. If you just need to export fp32 OpenVINO files, you could specify `dirname` only and set `quantized_dirname` to `None`.
The exported files will be saved at the locations specified by `dirname` and `quantized_dirname`.
There are 3 files in each subdirectory:

- `nano_model_meta.yml`: meta information of the saved model checkpoint
- `ov_saved_model.bin`: contains the binary data of the model's weights and biases
- `ov_saved_model.xml`: model checkpoint for general use; describes the model structure
You only need to take the `.xml` and `.bin` files for further usage: the `.xml` file describes the model structure and references the weights stored in the accompanying `.bin` file.
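As a quick sanity check after exporting, you can verify with `pathlib` that the `.xml`/`.bin` pair is present and shares a stem. This is a self-contained sketch that simulates the export directory with empty placeholder files, based on the file list above; the temporary path is an assumption for illustration.

```python
from pathlib import Path
import tempfile

# simulate an export directory so the sketch runs without a real export
tmp = Path(tempfile.mkdtemp()) / "fp32_openvino"
tmp.mkdir(parents=True)
for name in ("nano_model_meta.yml", "ov_saved_model.bin", "ov_saved_model.xml"):
    (tmp / name).touch()  # empty placeholder standing in for the real file

xml_path = tmp / "ov_saved_model.xml"
bin_path = xml_path.with_suffix(".bin")  # weights file sits next to the .xml
assert xml_path.exists() and bin_path.exists()
print(xml_path.name, bin_path.name)  # ov_saved_model.xml ov_saved_model.bin
```

Keeping the two files together (same directory, same stem) is what lets OpenVINO locate the weights when it reads the `.xml` model description.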