HPCW 3.0
This page lists the available test cases, along with some how-to notes.
The current proposal is to provide three sizes of experiments per model: small, medium, and big.
The models contained in HPCW conform to this definition in varying degrees, depending on the utilized parallelization strategies. Refer to the respective sections in this document for more details.
HPCW contains many different applications, namely:
- CLOUDSC (cloudsc)
- ecTrans (ectrans)
- ecRad (ecrad)
- ICON (icon)
- NEMO (nemo)
- NICAM-DC (nicamdc)

The spellings in the parentheses reflect the name of the respective project in the CMake files and the build wrapper.
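The mapping between the lowercase project names and the build configuration can be sketched as follows. Note that the `-Dhpcw_ENABLE_<name>=ON` switches are an assumption for illustration only; consult the suite's top-level CMakeLists.txt for the actual per-application option names.

```shell
# Sketch: assemble a CMake configure line that enables each application,
# using the lowercase project names listed above.
# The hpcw_ENABLE_* option names are hypothetical placeholders.
APPS="cloudsc ectrans ecrad icon nemo nicamdc"
FLAGS=""
for app in $APPS; do
  FLAGS="$FLAGS -Dhpcw_ENABLE_${app}=ON"
done
echo "cmake -S . -B build$FLAGS"   # -S/-B: standard out-of-source configure
```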
ICON (ICOsahedral Nonhydrostatic) is a unified numerical weather prediction and climate modeling system, used for example in German weather forecasts. All ICON experiments in HPCW run on both CPUs and GPUs, unless explicitly stated otherwise.
Small tests:
- icon-test-nwp-R02B04N06multi (a numerical weather prediction test case with nested domains, ideal as a "health check" for the build; ~160 km)
- icon-atm-tracer-Hadley (an idealized tracer transport test case; ~60 km)
- icon-aes-physics (a technical setup with land and atmosphere components, designed to fully utilize approximately one node; ~40 km)

Medium tests:
- icon-NextGEMS-R2B8-2020 (a high-resolution NextGEMS setup; ~10 km)

Large tests:
- icon-LAM (an operational weather forecast setup covering Germany; ~2 km, with two nests at 1 km and 0.5 km)

Read also: ICON Performance hints
The Integrated Forecasting System (IFS) is a global numerical weather prediction system developed and maintained by the European Centre for Medium-Range Weather Forecasts (ECMWF). The IFS developers have extracted several mini-apps (or "dwarfs") from this global model (e.g., ecTrans, ecRad), and are also developing external prototypes before integration (e.g., FVM).
IFS dwarfs included in HPCW:
All ecRad test cases are constrained to a single node because they are parallelized with OpenMP only. The available test cases are:
- ecrad-small (ecRad practical)
- ecrad-medium
- ecrad-big
The nature of these test cases is described in more detail here.
Read also: ecRad Performance hints
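Since ecRad is OpenMP-only, the relevant tuning knobs are the standard OpenMP environment variables. A minimal single-node sketch (the thread count of 32 is an assumption; match it to the physical cores of your node):

```shell
# Single-node OpenMP setup for an ecRad run (illustrative values).
export OMP_NUM_THREADS=32   # assumed: one thread per physical core
export OMP_PLACES=cores     # pin each thread to its own core
export OMP_PROC_BIND=close  # keep threads on neighboring cores
echo "running with ${OMP_NUM_THREADS} threads, places=${OMP_PLACES}"
```

Launching the binary itself is then a plain (non-MPI) invocation under this environment.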
The problem solved by CLOUDSC is embarrassingly parallel, and the MPI variants merely replicate a similar workload across multiple ranks. It can therefore be interesting to assess the impact of varying the number of MPI processes and OpenMP threads per node, but scaling beyond a single node is not meaningful. The available test cases are:
- cloudsc-fortran-{small, medium, big}
- cloudsc-gpu-{acc, omp, cuda, hip}
- cloudsc-mpi-nproma{32, 64, 128, 256}
Read also: CloudSC Performance hints
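A single-node sweep over MPI-rank/OpenMP-thread combinations can be sketched as below. The core count of 32 and the `./cloudsc` launcher name are assumptions; the echo is a dry run rather than an actual launch:

```shell
# Sweep ranks x threads at a fixed node size (assumed 32 cores).
# "./cloudsc" is a placeholder for the actual benchmark binary.
CORES=32
for RANKS in 1 2 4 8; do
  THREADS=$((CORES / RANKS))   # keep ranks*threads equal to the core count
  echo "would run: OMP_NUM_THREADS=${THREADS} mpirun -np ${RANKS} ./cloudsc"
done
```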
The ecTrans test cases all share the same options, differing only in the spectral resolution. They are available as CPU and GPU versions. The available test cases are:
- ectrans-cpu-{small, medium, big}
- ectrans-gpu-{small, medium, big}
Read also: ecTrans Performance hints
The "Nucleus for European Modelling of the Ocean" (NEMO) is a state-of-the-art modelling framework. It is used for research activities and forecasting services in ocean and climate sciences.
Read also: NEMO Performance hints
NICAM-DC is the dynamical core of the Nonhydrostatic ICosahedral Atmospheric Model. The test cases are variations of a Jablonowski baroclinic wave test. The following configurations are available:
- nicamdc-small (gl05rl00z94pe10)
- nicamdc-medium (gl08rl03z94pe640)
- nicamdc-big (gl09rl04z94pe2560)

Read also: NICAM-DC Performance hints
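The configuration strings above can be decoded as follows. The interpretation is an assumption based on common NICAM conventions (glXX = horizontal grid level, rlYY = region-division level, zZZ = vertical layers, peNN = MPI processes); note that in all three cases the process count equals the region count, i.e., one region per process.

```python
import re

def decode(name: str) -> dict:
    """Decode a NICAM-DC configuration name like 'gl05rl00z94pe10'.

    Field meanings are assumed from common NICAM naming conventions.
    """
    m = re.fullmatch(r"gl(\d+)rl(\d+)z(\d+)pe(\d+)", name)
    if m is None:
        raise ValueError(f"unrecognized configuration name: {name}")
    glevel, rlevel, zlayers, pes = map(int, m.groups())
    return {
        "glevel": glevel,       # horizontal grid refinement level
        "rlevel": rlevel,       # region-division level
        "zlayers": zlayers,     # number of vertical layers
        "processes": pes,       # MPI processes
        # the icosahedron is split into 10 * 4**rlevel regions
        "regions": 10 * 4 ** rlevel,
    }

print(decode("gl05rl00z94pe10"))
# {'glevel': 5, 'rlevel': 0, 'zlayers': 94, 'processes': 10, 'regions': 10}
```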