GaNDLF

Setup/Installation Instructions

Prerequisites

Alternatively, you can run GaNDLF via Docker. This needs different prerequisites. See the Docker Installation section below for more information.

Optional Requirements

Installation

The instructions assume a system using NVIDIA GPUs with CUDA 11.6 (for AMD GPUs, please make the appropriate change during PyTorch installation, following their installation page).

git clone https://github.com/mlcommons/GaNDLF.git
cd GaNDLF
conda create -n venv_gandlf python=3.8 -y
conda activate venv_gandlf
### PyTorch installation - https://pytorch.org/get-started/locally
## CUDA 11.6
# pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu116
## ROCm
# pip install torch==1.13.1+rocm5.2 torchvision==0.14.1+rocm5.2 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/rocm5.2
## CPU-only
# pip install torch==1.13.1+cpu torchvision==0.14.1+cpu torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cpu
pip install openvino-dev==2022.1.0 # [OPTIONAL] to generate optimized models for inference
pip install mlcube_docker # [OPTIONAL] to deploy GaNDLF models as MLCube-compliant Docker containers
pip install -e .

## alternatively, you can use:
# conda install -c pytorch pytorch torchvision -y
# conda install -c conda-forge gandlf -y

## verify installation
python ./gandlf_verifyInstall

Alternatively, GaNDLF can be installed via pip by running the following command:

pip install gandlf # this will give you the latest stable release

If you are interested in running the latest version of GaNDLF, you can install the nightly build by running the following command:

pip install --pre gandlf
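
To confirm which version ended up in your environment, you can query pip (this works for either of the above install methods):

pip show gandlf # prints the installed version and its location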

Docker Installation

We provide containerized versions of GaNDLF, which allow you to run GaNDLF without worrying about installation steps or dependencies.

Steps to run the Docker version of GaNDLF

  1. Install the Docker Engine for your platform.
  2. GaNDLF is available from the GitHub Package Registry. Several platform versions are available, including support for CUDA, ROCm, and CPU-only. Choose the one that best matches your system and drivers. For example, if you want the bleeding-edge GaNDLF version and you have CUDA Toolkit v11.6, run the following command:

     docker pull ghcr.io/mlcommons/gandlf:latest-cuda116
    

This will download the GaNDLF image onto your machine. See the usage page for details on how to run GaNDLF in this “dockerized” form.
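
As a rough sketch of running the pulled image (the mounted data path is a placeholder, and the exact GaNDLF command to append is described on the usage page):

# mount your data directory into the container; adjust paths and arguments to your setup
docker run -it --rm -v /path/to/data:/data ghcr.io/mlcommons/gandlf:latest-cuda116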

Enable GPU usage from Docker (optional, Linux only)

In order for “dockerized” GaNDLF to use your GPU, several steps are needed:

  1. Ensure that you have the correct NVIDIA drivers installed for your GPU.
  2. Then, on Linux, follow the instructions to set up the NVIDIA Container Toolkit. A quick way to verify the setup is sketched after this list.
  3. For AMD GPUs, this can be replicated with ROCm by following the instructions to set up the ROCm Container Toolkit.
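
To verify that the container runtime can see the GPU (NVIDIA example; the CUDA image tag below is only illustrative), run nvidia-smi inside a throwaway container. The same --gpus flag must then be added when running the GaNDLF container:

# should print the same GPU table as running nvidia-smi directly on the host
docker run --rm --gpus all nvidia/cuda:11.6.2-base-ubuntu20.04 nvidia-smi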

On Windows

On Windows, GPU and CUDA support requires either Windows 11 or, on Windows 10, registration in the Windows Insider program. If you meet those requirements and have current NVIDIA drivers, GPU support for Docker should work automatically. Otherwise, please try updating your Docker Desktop version.

Note: We cannot provide support for the Windows Insider program or for Docker Desktop itself.

Building your own GaNDLF Docker Image

You may also build a Docker image of GaNDLF from the source repository. Just specify the Dockerfile for your preferred GPU-compute platform (or CPU):

git clone https://github.com/mlcommons/GaNDLF.git
cd GaNDLF
docker build -t gandlf:$mytagname -f Dockerfile-$target_platform . # change $mytagname and $target_platform as needed
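
The Dockerfile names can change between releases, so it helps to list what the repository actually ships; the freshly built image is then used exactly like a pulled one (the tag name below follows the example above):

ls Dockerfile-* # list the available target platforms
docker run -it --rm gandlf:$mytagname # run the locally built image, as described on the usage page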