BigDL-LLM
bigdl-llm is a library for running LLMs (large language models) on Intel XPU (from Laptop to GPU to Cloud) using INT4 with very low latency [1] (for any PyTorch model).
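As an illustration, the following is a minimal sketch of INT4 inference using the Hugging Face-compatible bigdl.llm.transformers loading API; the model path is only a placeholder, and the Get Started guide and API documentation below cover the full usage:

    import torch
    from transformers import AutoTokenizer
    from bigdl.llm.transformers import AutoModelForCausalLM

    model_path = "path/to/your/model"  # placeholder: any Hugging Face-format checkpoint

    # load_in_4bit=True applies INT4 quantization while the model is being loaded
    model = AutoModelForCausalLM.from_pretrained(model_path, load_in_4bit=True)
    tokenizer = AutoTokenizer.from_pretrained(model_path)

    prompt = "What is BigDL-LLM?"
    with torch.inference_mode():
        input_ids = tokenizer.encode(prompt, return_tensors="pt")
        output = model.generate(input_ids, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))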
Get Started
The documents in this section help you get started quickly with BigDL-LLM.
Key Features Guide
Each guide in this section provides in-depth information and concepts about BigDL-LLM key features.
Examples & Tutorials
This section contains example scripts to help you quickly get started using BigDL-LLM to run popular open-source models from the community.
API Document
The API documentation provides detailed descriptions of the BigDL-LLM APIs.
[1] Performance varies by use, configuration and other factors. bigdl-llm may not optimize to the same degree for non-Intel products. Learn more at www.Intel.com/PerformanceIndex.