Export the TorchScript model files to disk#

Introduction#

When a forecaster is accelerated using jit, we can save the TorchScript model files to disk by calling export_torchscript_file. In this guide, we demonstrate in detail how to export the TorchScript model files to disk.

We will take TCNForecaster and the nyc_taxi dataset as an example in this guide.

Setup#

Before we begin, we need to install Chronos if it isn’t already available. We choose to use PyTorch as the deep learning backend.

[ ]:
!pip install --pre --upgrade bigdl-chronos[pytorch]
# fix conflict with google colab
!pip uninstall -y torchtext

πŸ“Note

  • Although Chronos supports inference on a cluster, the method to export model files can only be used when the forecaster is a non-distributed version.

  • Only PyTorch-backend deep learning forecasters support jit acceleration.

Forecaster preparation#

Before the exporting process, a forecaster should be created and trained. The training process is introduced in detail in the previous guide Train forecaster on single node, so here we directly create and train a TCNForecaster on the nyc_taxi dataset, using two small helper functions sketched below.
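The helper functions get_data and get_trained_forecaster used in the cells below are not spelled out in this guide. The following cell is a minimal sketch of what they could look like, following the pattern of the training guide; the exact preprocessing steps and hyperparameters here are assumptions:

[ ]:
from bigdl.chronos.data import get_public_dataset
from bigdl.chronos.forecaster import TCNForecaster
from sklearn.preprocessing import StandardScaler

def get_data(lookback=48, horizon=1):
    # load the built-in nyc_taxi dataset as train/val/test TSDataset objects
    tsdata_train, tsdata_val, tsdata_test = get_public_dataset(name='nyc_taxi')
    scaler = StandardScaler()
    for tsdata in [tsdata_train, tsdata_val, tsdata_test]:
        # fill missing values and standardize (fit the scaler on train only)
        tsdata.impute().scale(scaler, fit=(tsdata is tsdata_train))
    # roll into (x, y) numpy arrays of shape (num_samples, seq_len, feature_num)
    train_data = tsdata_train.roll(lookback=lookback, horizon=horizon).to_numpy()
    test_data = tsdata_test.roll(lookback=lookback, horizon=horizon).to_numpy()
    val_data = tsdata_val.roll(lookback=lookback, horizon=horizon).to_numpy()
    return train_data, test_data, val_data

def get_trained_forecaster(train_data):
    # create a TCNForecaster matching the rolled data shapes
    forecaster = TCNForecaster(past_seq_len=48,
                               future_seq_len=1,
                               input_feature_num=1,
                               output_feature_num=1)
    forecaster.fit(train_data, epochs=3)
    return forecaster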

Export the TorchScript model files#

When a trained forecaster is ready and the forecaster is a non-distributed version, we provide the export_torchscript_file method to export the TorchScript model files to disk. The method has 2 parameters: dirname is the location to save the TorchScript files, and quantized_dirname is the location to save the quantized TorchScript model files. Since quantization of the jit model is not supported yet, we set quantized_dirname to None.

[ ]:
from pathlib import Path

# get the (x, y) numpy arrays for training, testing and validating
train_data, test_data, val_data = get_data()
# get a trained forecaster
forecaster = get_trained_forecaster(train_data)

# create a directory to save torchscript files
dirname = Path("torchscript_files")
dirname.mkdir(exist_ok=True)
ckpt_name = dirname / "fp32_torch_script"

# export the torchscript files
forecaster.export_torchscript_file(dirname=ckpt_name, quantized_dirname=None)

πŸ“Note

  • When export_torchscript_file is called, the forecaster will automatically build a jit session with default settings, so you can call this method directly without calling predict_with_jit first (see the sketch below).
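
If you want to control when the jit session is built, you can run one jit-accelerated prediction before exporting. This is a minimal sketch, assuming the test_data tuple returned by get_data() above:

[ ]:
# sketch: trigger the jit session explicitly before exporting
# test_data is a (x, y) tuple of numpy arrays
forecaster.predict_with_jit(test_data[0])

# then export as before
forecaster.export_torchscript_file(dirname=ckpt_name, quantized_dirname=None)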

The exported files will be saved under the torchscript_files directory.

There are 2 files in each subdirectory:

  • nano_model_meta.yml: meta information of the saved model checkpoint

  • ckpt.pth: JIT model checkpoint for general use, describing the model structure

You only need to take the ckpt.pth file for further usage.
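
As a quick sanity check, the exported checkpoint can be loaded back with plain PyTorch and run directly. This sketch assumes the directory layout created above and the test_data tuple from get_data():

[ ]:
import torch

# load the exported TorchScript checkpoint
jit_model = torch.jit.load("torchscript_files/fp32_torch_script/ckpt.pth")
jit_model.eval()

# test_data is (x, y); x has shape (num_samples, past_seq_len, input_feature_num)
x = torch.from_numpy(test_data[0]).float()
with torch.no_grad():
    y_pred = jit_model(x)
print(y_pred.shape)  # expected: (num_samples, future_seq_len, output_feature_num)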