
Targeting GPUs

Targeting GPUs is useful when your work requires a specific GPU model. There are two methods to target GPUs in your batch job script, as shown below.

Note

If the resource you have requested is currently in use, your job will be placed in a queue until that resource becomes available.
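A quick way to see whether a job is still waiting on resources is to query the scheduler. The command below is a standard Slurm query rather than anything specific to this cluster; the pending reason it shows (e.g. Resources or Priority) explains why a job has not started yet.

squeue -u $USER                     # List your jobs; the NODELIST(REASON) column shows why a pending job is waiting (e.g. Resources, Priority)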

Method 1

If the job must run on a certain GPU model, use the #SBATCH --constraint= parameter in the job submission script.

Example:

#SBATCH --output=%u.%j.out          # Where should the log files go?
                                    # You must provide an absolute path, e.g. /common/home/module/username/
                                    # If no path is provided, the output file will be placed in your current working directory
#SBATCH --requeue                   # Remove this line if you do not want the workload scheduler to requeue your job after preemption
#SBATCH --constraint=a40            # This tells the workload scheduler to provision a40 nodes for your job
################################################################
## EDIT AFTER THIS LINE IF YOU ARE OKAY WITH DEFAULT SETTINGS ##
################################################################
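For context, the snippet below is a minimal sketch of how the --constraint directive fits into a complete submission script. The job name, GPU count, and train.py command are illustrative placeholders; adjust them to your own workload and your cluster's defaults.

#!/bin/bash
#SBATCH --job-name=a40-job          # Short name for the job (placeholder)
#SBATCH --output=%u.%j.out          # Log file named <username>.<jobid>.out
#SBATCH --gres=gpu:1                # Request one GPU (illustrative count)
#SBATCH --constraint=a40            # Only run on nodes tagged a40
#SBATCH --requeue                   # Allow the scheduler to requeue the job after preemption

srun python train.py                # Placeholder command; replace with your own workload

Submit the script with sbatch and the scheduler will hold the job until an a40 node is available.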

Method 2

If a certain GPU model is preferred but not required, use the #SBATCH --prefer= parameter in the job submission script. If the requested resource is unavailable, other resources will be assigned to the job instead.

Example:

#SBATCH --output=%u.%j.out          # Where should the log files go?
                                    # You must provide an absolute path, e.g. /common/home/module/username/
                                    # If no path is provided, the output file will be placed in your current working directory
#SBATCH --requeue                   # Remove this line if you do not want the workload scheduler to requeue your job after preemption
#SBATCH --prefer=a40                # This tells the workload scheduler to provision a40 nodes for your job on a best-effort basis
################################################################
## EDIT AFTER THIS LINE IF YOU ARE OKAY WITH DEFAULT SETTINGS ##
################################################################
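Because --prefer is only a soft request, the job may land on a different GPU model. One way to confirm what was actually allocated is to print the node and GPU list from inside the job script (this assumes nvidia-smi is available on the compute nodes):

echo "Running on: $SLURM_JOB_NODELIST"    # Node(s) the scheduler actually assigned
nvidia-smi -L                             # GPU model(s) visible to this job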

Tags

If the GPU you use is preemptable, your work must be checkpointed to avoid data loss. A sketch of one checkpointing pattern follows the table below.

GPU     Memory   Tags             Preemptable?
L40     48GB     l40 or 48gb      Yes
A40     48GB     a40 or 48gb      No
V100    16GB     v100 or 16gb     No
P100    16GB     p100 or 16gb     No
3090    24GB     3090 or 24gb     Yes
A5000   24GB     a5000 or 24gb    Yes
A100    40GB     a100 or 40gb     Yes
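Tags can also be combined in a single constraint. Slurm accepts OR-ed features, so a job that can use any 48GB card could request either the shared memory tag or an explicit OR of model tags, for example:

#SBATCH --constraint=48gb           # Matches any node tagged 48gb (L40 or A40 in the table above)
#SBATCH --constraint="l40|a40"      # Equivalent explicit OR of model tags (use one constraint line, not both)

For the preemptable GPUs in the table, one common checkpointing pattern is to trap the termination signal and save state before the job is requeued. The sketch below assumes the preemption grace period delivers SIGTERM to the batch script and that your application can save its own state when signalled; verify both assumptions against your setup.

# Inside the job script: run the workload in the background so the trap can fire
cleanup() {
    echo "Caught SIGTERM, checkpointing before requeue..."
    kill -TERM "$PID"               # Forward the signal so the application can save state and exit cleanly
    wait "$PID"
}
trap cleanup TERM

python train.py &                   # Placeholder workload
PID=$!
wait $PID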