Python User Guide#

Supported Platforms: Linux and macOS. For Windows, refer to the Windows User Guide.

1. Install#

  • We recommend using conda to prepare the Python environment as follows:

    conda create -n bigdl python=3.7  # "bigdl" is the conda environment name; you can use any name you like.
    conda activate bigdl
  • You need to install a JDK in the environment and properly set the environment variable JAVA_HOME. JDK 8 is highly recommended.

    You may take the following commands as a reference for installing OpenJDK:

    # For Ubuntu
    sudo apt-get install openjdk-8-jre
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64/
    # For CentOS
    su -c "yum install java-1.8.0-openjdk"
    export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-
    export PATH=$PATH:$JAVA_HOME/bin
    java -version  # Verify the version of JDK.

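As a quick sanity check, a short Python snippet can confirm that JAVA_HOME is set and points to a real directory (a sketch; the helper name check_java_home is our own, not part of BigDL):

```python
import os

def check_java_home():
    """Return a short status message describing the JAVA_HOME setting."""
    java_home = os.environ.get("JAVA_HOME")
    if not java_home:
        return "JAVA_HOME is not set"
    if not os.path.isdir(java_home):
        return "JAVA_HOME points to a missing directory: " + java_home
    return "JAVA_HOME is set to " + java_home

print(check_java_home())
```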
1.1 Official Release#

You can install the latest release version of BigDL (built on top of Spark 2.4.6 by default) as follows:

pip install bigdl

Note: Installing BigDL will automatically install all the BigDL packages including bigdl-nano, bigdl-dllib, bigdl-orca, bigdl-chronos, bigdl-friesian, bigdl-serving and their dependencies if they haven’t been detected in your conda environment.

1.2 Nightly Build#

You can install the latest nightly build of BigDL as follows:

pip install --pre --upgrade bigdl

Alternatively, you can find the list of the nightly build versions here, and install a specific version as follows:

pip install bigdl==version

Note: If you are using a custom Python Package Index URL, you may need to check whether the latest packages have been synced with PyPI. Alternatively, you can add the -i option when running pip install to use PyPI as the index-url.

You can uninstall all the BigDL packages as follows:

pip uninstall bigdl-dllib bigdl-core bigdl-tf bigdl-math bigdl-orca bigdl-chronos bigdl-friesian bigdl-nano bigdl-serving bigdl

1.3 BigDL on Spark 3#

You can install BigDL built on top of Spark 3.1.3 as follows:

pip install bigdl-spark3  # Install the latest release version
pip install --pre --upgrade bigdl-spark3  # Install the latest nightly build version

You can find the list of the nightly build versions built on top of Spark 3.1.3 here.

You can uninstall all the packages of BigDL on Spark3 as follows:

pip uninstall bigdl-dllib-spark3 bigdl-core bigdl-tf bigdl-math bigdl-orca-spark3 bigdl-chronos-spark3 bigdl-friesian-spark3 bigdl-nano bigdl-serving bigdl-spark3

2. Run#

Note: Installing BigDL from pip will automatically install pyspark. To avoid possible conflicts, we highly recommend unsetting the environment variable SPARK_HOME if it exists in your environment.

2.1 Interactive Shell#

You may test if the installation is successful using the interactive Python shell as follows:

  • Type python in the command line to start a REPL.

  • Try to run the example code below to verify the installation:

    from bigdl.orca import init_orca_context
    sc = init_orca_context()  # Initialize bigdl on the underlying cluster.

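If the initialization above succeeds, you can go one step further in the same shell and run a tiny Spark job through the returned SparkContext before shutting BigDL down (a sketch; it assumes the installation completed and uses local mode, the default):

```python
from bigdl.orca import init_orca_context, stop_orca_context

sc = init_orca_context()  # Local mode by default.
# A small smoke test on the underlying SparkContext.
print(sc.parallelize(range(4)).map(lambda x: x * x).collect())
stop_orca_context()  # Release the underlying cluster resources.
```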
2.2 Jupyter Notebook#

You can start the Jupyter notebook as you normally do using the following command and run BigDL programs directly in a Jupyter notebook:

jupyter notebook --notebook-dir=./ --ip=* --no-browser

2.3 Python Script#

You can also write BigDL programs directly in a Python file and run the file from the command line as a normal Python program.


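For example, a minimal script might look like the following (a sketch; the file name is hypothetical and local mode, the default, is assumed):

```python
#  -- a minimal BigDL program; run it with: python
from bigdl.orca import init_orca_context, stop_orca_context

if __name__ == "__main__":
    sc = init_orca_context()  # Initialize BigDL (local mode by default).
    rdd = sc.parallelize(["hello", "bigdl"])
    print(rdd.map(str.upper).collect())
    stop_orca_context()  # Shut down cleanly when the program finishes.
```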
3. Python Dependencies#

We recommend using conda to manage your Python dependencies. Libraries installed in the current conda environment will be automatically distributed to the cluster when calling init_orca_context. You can also add extra dependencies as .py, .zip and .egg files by specifying extra_python_lib argument in init_orca_context.

For more details, please refer to Orca Context.
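For instance, extra dependency files can be attached when creating the context (a sketch; the archive name and the cluster mode are assumptions for illustration):

```python
from bigdl.orca import init_orca_context

# Ship a local package archive ( is a hypothetical file) to the
# cluster alongside the conda environment; comma-separate multiple files if needed.
sc = init_orca_context(cluster_mode="yarn-client", extra_python_lib="")
```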

4. Compatibility#

BigDL has been tested on Python 3.6 and 3.7 with the following library versions:

pyspark==2.4.6 or 3.1.3
tensorflow==1.15.0 or >2.0

5. Known Issues#

  • If you meet the following error when running pip install bigdl:

    ERROR: Could not find a version that satisfies the requirement pypandoc (from versions: none)
    ERROR: No matching distribution found for pypandoc
    Could not import pypandoc - required to package PySpark
    Traceback (most recent call last):
      File "/root/anaconda3/lib/python3.8/site-packages/setuptools/", line 126, in fetch_build_egg
      File "/root/anaconda3/lib/python3.8/", line 364, in check_call
        raise CalledProcessError(retcode, cmd)
    subprocess.CalledProcessError: Command '['/root/anaconda3/bin/python', '-m', 'pip', '--disable-pip-version-check', 'wheel', '--no-deps', '-w', '/tmp/tmprefr87ue', '--quiet', 'pypandoc']' returned non-zero exit status 1.

    This is actually caused by installing pyspark in your Python environment. You can fix it by running pip install pypandoc first and then pip install bigdl.