CDO stands for Climate Data Operators. It is a collection of command-line operators to manipulate and analyse climate and numerical weather prediction data. Supported data formats are netCDF 3/4, GRIB 1/2 and other formats like SERVICE, EXTRA and IEG. It can also be used to analyse gridded data not related to climate science. It provides more than 350 operators.
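As a quick illustration of how operators are used, the commands below show a few common invocations; the input and output filenames are hypothetical placeholders:

```shell
# Print a short summary of a dataset (grid, variables, time steps).
$ cdo sinfo input.nc

# Compute the time mean of all fields in input.nc.
$ cdo timmean input.nc timmean.nc

# Operators can be chained with a leading "-": here, select the years
# 1990-1999 first, then compute annual means of that selection.
$ cdo yearmean -selyear,1990/1999 input.nc yearmean.nc
```

Chaining operators avoids writing intermediate files to disk, which matters for large climate datasets.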


How to use

You can check the versions installed in Gadi with a module query:

$ module avail cdo

We normally recommend using the latest version available, and we always recommend specifying the version number in the module command:

$ module load cdo/1.9.8

For more details on using modules see our software applications guide.

An example PBS job submission script is provided below.

It requests 1 CPU core, 2 GiB of memory, and 8 GiB of local disk on a Gadi compute node in the normal queue for 30 minutes, charged against project a00. It also asks the system to enter the working directory once the job starts. Save this script in the working directory from which the analysis will be run.

To change the number of CPU cores, memory, or jobfs required, simply modify the appropriate PBS resource requests at the top of this file according to the information in our queue structure guide.

Note that if your application does not run in parallel, you should set the number of CPU cores to 1 and adjust the memory and jobfs requests accordingly, to avoid wasting compute resources.

#!/bin/bash
#PBS -P a00
#PBS -q normal
#PBS -l ncpus=1
#PBS -l mem=2GB
#PBS -l jobfs=8GB
#PBS -l walltime=00:30:00
#PBS -l wd
# Add `#PBS -l storage=scratch/ab12+gdata/yz98` here if the job needs
# access to `/scratch/ab12/` and `/g/data/yz98/`. Storage directives
# must appear with the other #PBS lines, before any commands.

# Load module, always specify version number.
module load cdo/1.9.8

# Run CDO application
cdo [Options] [Operators]
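As a concrete example of the final line of the script, the placeholder could be replaced with a real operator chain; the input path, output name, and year range below are hypothetical:

```shell
# Hypothetical example: monthly means of one decade of data, written
# to the job's local disk (jobfs) requested in the #PBS directives.
cdo monmean -selyear,1990/1999 /g/data/yz98/input.nc $PBS_JOBFS/monmean.nc
```

Writing intermediate output to `$PBS_JOBFS` uses the local disk requested via `#PBS -l jobfs`, which is faster than shared filesystems for temporary files.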

For more information about the cdo command's options and operators, see the CDO User Guide; `cdo --operators` prints the full list of available operators.

To submit the job, pass the script to the PBS qsub command, where `job.sh` is whatever name you saved the script under:

$ qsub job.sh
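After submission, the job can be monitored with standard PBS commands; the job ID shown below is a hypothetical example of the kind qsub prints:

```shell
# List your queued and running jobs.
$ qstat -u $USER

# Show the full status of a single job, using the ID qsub printed.
$ qstat -f 12345678.gadi-pbs
```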

Authors: Mohsin Ali