BigDL-LLM Installation: GPU#

Windows#

Prerequisites#

BigDL-LLM on Windows supports Intel iGPU and dGPU.

Important

BigDL-LLM on Windows only supports PyTorch 2.1.

To apply Intel GPU acceleration, there are several prerequisite steps for tools installation and environment preparation:

  • Step 1: Install Visual Studio 2022 Community Edition and select the “Desktop development with C++” workload

  • Step 2: Install or update to the latest GPU driver

  • Step 3: Install Intel® oneAPI Base Toolkit 2024.0

Intel® oneAPI Base Toolkit 2024.0 installation methods:

Download and install Intel® oneAPI Base Toolkit version 2024.0 through Offline Installer.

During installation, you can simply continue with “Recommended Installation”. If you would like to continue with “Custom Installation”, please note that the oneAPI Deep Neural Network Library, oneAPI Math Kernel Library, and oneAPI DPC++/C++ Compiler components are required; the other components are optional.

Install BigDL-LLM From PyPI#

We recommend using miniconda to create a Python 3.9 environment:

Important

bigdl-llm is tested with Python 3.9, 3.10 and 3.11. Python 3.9 is recommended.

The easiest way to install bigdl-llm is with the following commands:

conda create -n llm python=3.9 libuv
conda activate llm

pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu
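After installation, you can optionally verify that the XPU stack imports correctly with a short Python snippet. This is a diagnostic sketch only (the function name check_xpu_environment is hypothetical); run it inside the activated llm environment:

```python
# Sanity-check sketch: reports whether PyTorch, IPEX and the XPU device
# are visible in the current environment. Purely diagnostic.
def check_xpu_environment() -> str:
    try:
        import torch
        import intel_extension_for_pytorch  # noqa: F401  (registers the 'xpu' device)
    except ImportError as exc:
        return f"import failed: {exc}"
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return "xpu available"
    return "torch/ipex imported, but no xpu device found"

print(check_xpu_environment())
```

If the output reports an import failure, revisit the prerequisite steps above before continuing.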

Install BigDL-LLM From Wheel#

If you encounter network issues when installing IPEX, you can instead install the BigDL-LLM dependencies for Intel XPU from wheel archives. First download and install torch/torchvision/ipex from the wheels listed below, then install bigdl-llm.

Download the wheels on a Windows system:

wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torch-2.1.0a0%2Bcxx11.abi-cp39-cp39-win_amd64.whl
wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torchvision-0.16.0a0%2Bcxx11.abi-cp39-cp39-win_amd64.whl
wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/intel_extension_for_pytorch-2.1.10%2Bxpu-cp39-cp39-win_amd64.whl

You may install the dependencies directly from the wheel archives and then install bigdl-llm using the following commands:

pip install torch-2.1.0a0+cxx11.abi-cp39-cp39-win_amd64.whl
pip install torchvision-0.16.0a0+cxx11.abi-cp39-cp39-win_amd64.whl
pip install intel_extension_for_pytorch-2.1.10+xpu-cp39-cp39-win_amd64.whl

pip install --pre --upgrade bigdl-llm[xpu]

Note

All the wheel packages mentioned here are for Python 3.9. If you would like to use Python 3.10 or 3.11, you should modify the wheel names for torch, torchvision, and intel_extension_for_pytorch by replacing cp39 with cp310 or cp311, respectively.
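The renaming described in the note above can be expressed as a tiny helper (a sketch; adapt_wheel_name is a hypothetical function, not part of any package):

```python
# Sketch: adapt the Python 3.9 wheel file names to another Python version
# by replacing the "cp39" interpreter/ABI tags, as the note above describes.
def adapt_wheel_name(name: str, py_tag: str) -> str:
    return name.replace("cp39", py_tag)

print(adapt_wheel_name("torch-2.1.0a0+cxx11.abi-cp39-cp39-win_amd64.whl", "cp311"))
# → torch-2.1.0a0+cxx11.abi-cp311-cp311-win_amd64.whl
```

Remember to download the matching cp310/cp311 wheels rather than renaming the cp39 files on disk; the tag must match the wheel's actual contents.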

Runtime Configuration#

To use GPU acceleration on Windows, several environment variables are required before running a GPU example.

Make sure you are using CMD (Anaconda Prompt if using conda) as PowerShell is not supported. For oneAPI installed using the Offline installer, configure oneAPI environment variables with:

call "C:\Program Files (x86)\Intel\oneAPI\setvars.bat"

Please also set the following environment variables if you would like to run LLMs on Intel iGPU:

set SYCL_CACHE_PERSISTENT=1
set BIGDL_LLM_XMX_DISABLED=1

Note

The first time each model runs on Intel iGPU/Intel Arc™ A300-Series or Pro A60, it may take several minutes to compile.

Troubleshooting#

1. Error loading intel_extension_for_pytorch#

If you encounter an error when importing intel_extension_for_pytorch, please ensure that you have completed the following steps:

  • Ensure that you have installed Visual Studio with “Desktop development with C++” workload.

  • Make sure that the correct version of oneAPI, specifically 2024.0, is installed.

  • Ensure that libuv is installed in your conda environment. This can be done during the creation of the environment with the command:

    conda create -n llm python=3.9 libuv
    

    If you missed libuv, you can add it to your existing environment with:

    conda install libuv
    
  • For oneAPI installed using the Offline installer, make sure you have configured oneAPI environment variables in your Anaconda Prompt through

    call "C:\Program Files (x86)\Intel\oneAPI\setvars.bat"
    

    Please note that you need to set these environment variables again in every new Anaconda Prompt window.

Linux#

Prerequisites#

BigDL-LLM GPU support on Linux has been verified on:

  • Intel Arc™ A-Series Graphics

  • Intel Data Center GPU Flex Series

  • Intel Data Center GPU Max Series

Important

BigDL-LLM on Linux supports PyTorch 2.0 and PyTorch 2.1.

Important

Ubuntu 20.04 and later operating systems are currently supported.

To enable BigDL-LLM for Intel GPUs with PyTorch 2.1, here are several prerequisite steps for tools installation and environment preparation:

  • Step 1: Install Intel GPU Driver version >= stable_775_20_20231219. We highly recommend installing the latest version of intel-i915-dkms using apt.

    See also

    Please refer to our driver installation for general purpose GPU capabilities.

    See release page for latest version.

  • Step 2: Download and install Intel® oneAPI Base Toolkit version 2024.0. oneDNN, oneMKL, and the DPC++ compiler are required; the other components are optional.

Intel® oneAPI Base Toolkit 2024.0 installation methods:

Step 1: Install oneAPI in a user-defined folder, e.g., ~/intel/oneapi.

export PYTHONUSERBASE=~/intel/oneapi
pip install dpcpp-cpp-rt==2024.0.2 mkl-dpcpp==2024.0.0 onednn==2024.0.0 --user

Note

The oneAPI packages are visible in pip list only if PYTHONUSERBASE is properly set.

Step 2: Configure your working conda environment (e.g. with name llm) to append oneAPI path (e.g. ~/intel/oneapi/lib) to the environment variable LD_LIBRARY_PATH.

conda env config vars set LD_LIBRARY_PATH=$LD_LIBRARY_PATH:~/intel/oneapi/lib -n llm

Note

You can view the configured environment variables for your environment (e.g. with name llm) by running conda env config vars list -n llm. You can continue with your working conda environment and install bigdl-llm as guided in the next section.

Note

We recommend not installing other pip packages in the user-defined oneAPI folder (e.g. ~/intel/oneapi). You can uninstall the oneAPI packages by simply deleting the package folder and unsetting the configuration of your working conda environment (e.g., with name llm):

rm -r ~/intel/oneapi
conda env config vars unset LD_LIBRARY_PATH -n llm

Install BigDL-LLM From PyPI#

We recommend using miniconda to create a Python 3.9 environment:

Important

bigdl-llm is tested with Python 3.9, 3.10 and 3.11. Python 3.9 is recommended.

Important

Make sure you install matching versions of BigDL-LLM/PyTorch/IPEX and the oneAPI Base Toolkit. BigDL-LLM with PyTorch 2.1 should be used with oneAPI Base Toolkit version 2024.0. BigDL-LLM with PyTorch 2.0 should be used with oneAPI Base Toolkit version 2023.2.

conda create -n llm python=3.9
conda activate llm

pip install --pre --upgrade bigdl-llm[xpu] -f https://developer.intel.com/ipex-whl-stable-xpu

Note

The xpu option will install BigDL-LLM with PyTorch 2.1 by default, which is equivalent to

pip install --pre --upgrade bigdl-llm[xpu_2.1] -f https://developer.intel.com/ipex-whl-stable-xpu

Install BigDL-LLM From Wheel#

If you encounter network issues when installing IPEX, you can instead install the BigDL-LLM dependencies for Intel XPU from wheel archives. First download and install torch/torchvision/ipex from the wheels listed below, then install bigdl-llm.

# get the wheels on Linux system for IPEX 2.1.10+xpu
wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torch-2.1.0a0%2Bcxx11.abi-cp39-cp39-linux_x86_64.whl
wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torchvision-0.16.0a0%2Bcxx11.abi-cp39-cp39-linux_x86_64.whl
wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/intel_extension_for_pytorch-2.1.10%2Bxpu-cp39-cp39-linux_x86_64.whl

Then you may install directly from the wheel archives using the following commands:

# install the packages from the wheels
pip install torch-2.1.0a0+cxx11.abi-cp39-cp39-linux_x86_64.whl
pip install torchvision-0.16.0a0+cxx11.abi-cp39-cp39-linux_x86_64.whl
pip install intel_extension_for_pytorch-2.1.10+xpu-cp39-cp39-linux_x86_64.whl

# install bigdl-llm for Intel GPU
pip install --pre --upgrade bigdl-llm[xpu]

Note

All the wheel packages mentioned here are for Python 3.9. If you would like to use Python 3.10 or 3.11, you should modify the wheel names for torch, torchvision, and intel_extension_for_pytorch by replacing cp39 with cp310 or cp311, respectively.

Runtime Configuration#

To use GPU acceleration on Linux, several environment variables are required or recommended before running a GPU example.

For Intel Arc™ A-Series Graphics and Intel Data Center GPU Flex Series, we recommend:

# Configure oneAPI environment variables. Required step for APT or offline installed oneAPI.
# Skip this step for PIP-installed oneAPI since the environment has already been configured in LD_LIBRARY_PATH.
source /opt/intel/oneapi/setvars.sh

# Recommended Environment Variables for optimal performance
export USE_XETLA=OFF
export SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1
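The steps above can be collected into a small helper script that you source before each run. This is a sketch; the filename env-bigdl.sh is hypothetical, and the setvars.sh path assumes the default APT/offline install location:

```shell
# env-bigdl.sh -- sketch of a per-session setup script (hypothetical filename).
# Usage: source env-bigdl.sh

# Configure oneAPI environment variables (skip for pip-installed oneAPI,
# where LD_LIBRARY_PATH is already configured in the conda environment).
source /opt/intel/oneapi/setvars.sh

# Recommended variables for Intel Arc A-Series and Data Center GPU Flex Series.
export USE_XETLA=OFF
export SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS=1
```

Because these variables only persist for the current shell session, the script must be sourced (not executed) in every new terminal.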

Known issues#

1. Potential suboptimal performance with Linux kernel 6.2.0#

For Ubuntu 22.04 with driver version < stable_775_20_20231219, performance on Linux kernel 6.2.0 is worse than on Linux kernel 5.19.0. You can install the latest driver with sudo apt update && sudo apt install -y intel-i915-dkms intel-fw-gpu to resolve this issue (an OS reboot is required).

Tip: You can check your intel-i915-dkms version with sudo apt list --installed | grep intel-i915-dkms; it should be the latest version and >= 1.23.9.11.231003.15+i19-1.

2. Driver installation unmet dependencies error: intel-i915-dkms#

The last apt install command of the driver installation may produce the following error:

The following packages have unmet dependencies:
 intel-i915-dkms : Conflicts: intel-platform-cse-dkms
                   Conflicts: intel-platform-vsec-dkms

You can use sudo apt install -y intel-i915-dkms intel-fw-gpu instead, as intel-platform-cse-dkms and intel-platform-vsec-dkms are already provided by intel-i915-dkms.

Troubleshooting#

1. Cannot open shared object file: No such file or directory#

An error where a libmkl file is not found, for example:

OSError: libmkl_intel_lp64.so.2: cannot open shared object file: No such file or directory
Error: libmkl_sycl_blas.so.4: cannot open shared object file: No such file or directory

Such errors occur when oneAPI has not been initialized properly before running BigDL-LLM code or before importing the IPEX package.

  • For oneAPI installed using APT or Offline Installer, make sure you execute setvars.sh of oneAPI Base Toolkit before running BigDL-LLM.

  • For PIP-installed oneAPI, activate your working environment and run echo $LD_LIBRARY_PATH to check if the installation path is properly configured for the environment. If the output does not contain oneAPI path (e.g. ~/intel/oneapi/lib), check Prerequisites to re-install oneAPI with PIP installer.

  • Make sure you install matching versions of BigDL-LLM/pytorch/IPEX and oneAPI Base Toolkit. BigDL-LLM with PyTorch 2.1 should be used with oneAPI Base Toolkit version 2024.0. BigDL-LLM with PyTorch 2.0 should be used with oneAPI Base Toolkit version 2023.2.
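For the pip-installed oneAPI case, the LD_LIBRARY_PATH check can be scripted as a small helper (a sketch; contains_oneapi_path is a hypothetical name, and ~/intel/oneapi/lib is the example location used above):

```python
import os

# Sketch: check whether a oneAPI lib directory is present in an
# LD_LIBRARY_PATH value, expanding "~" the way the shell would.
def contains_oneapi_path(ld_library_path: str, oneapi_lib: str) -> bool:
    target = os.path.expanduser(oneapi_lib)
    entries = [os.path.expanduser(e) for e in ld_library_path.split(":") if e]
    return target in entries

print(contains_oneapi_path(os.environ.get("LD_LIBRARY_PATH", ""),
                           "~/intel/oneapi/lib"))
```

If this prints False in your activated working environment, re-run the conda env config vars set command from the Prerequisites section and re-activate the environment.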