Test environment for CUDA-aware MPI race detection.
- testsuite: Unit tests with data races
- jacobi solver mini-app: CUDA-aware MPI C implementation by NVIDIA
- tealeaf mini-app: CUDA-aware MPI C++ heat conduction code, see TeaLeaf
The CMakeLists facilitates running our tests by generating appropriate targets. To set up, execute:
$ cd cusan-tests
$ mkdir build && cd build
$ cmake ..
CMakeLists requires the following environment variables:

CUSAN_PATH
: Path to the CuSan installation, to detect the compiler wrappers/libs

MUST_PATH
: Path to the MUST installation, to detect mustrun
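Putting the two together, a configure session might look like the following sketch. The install prefixes are hypothetical placeholders; adjust them to wherever CuSan and MUST are installed on your system:

```shell
# Hypothetical install prefixes -- adjust to your setup.
export CUSAN_PATH="$HOME/software/cusan"
export MUST_PATH="$HOME/software/must"
echo "CUSAN_PATH=$CUSAN_PATH"
echo "MUST_PATH=$MUST_PATH"

# With both variables exported, configure as described above:
# cd cusan-tests && mkdir build && cd build && cmake ..
```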
- System modules:
1) gcc/11.2.0 2) cuda/11.8 3) openmpi/4.1.6 4) git/2.40.0 5) python/3.10.10 6) clang/14.0.6
- CuSan: bootstrap
- MUST: bootstrap
- testsuite: llvm-lit, FileCheck
- GPU: Tesla T4 and Tesla V100 (arch=sm_70)
For the testsuite, running cmake .. in the build folder created at root level generates targets to build and run on the current node, see CMakeLists:

make check-cutests
: All race tests

make check-cutests-mpi-to-cuda
: Only MPI-to-CUDA races

make check-cutests-cuda-to-mpi
: Only CUDA-to-MPI races

make check-cutests-cuda-only
: Only CUDA races
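The target naming scheme can be made explicit with a tiny helper that maps a race class to its check target. The `target_for` function below is purely illustrative and not part of the repository; the target names come from the list above:

```shell
# Map a race class to its check target; target names are from the list above.
target_for() {
  case "$1" in
    all)         echo check-cutests ;;
    mpi-to-cuda) echo check-cutests-mpi-to-cuda ;;
    cuda-to-mpi) echo check-cutests-cuda-to-mpi ;;
    cuda-only)   echo check-cutests-cuda-only ;;
    *)           echo "unknown race class: $1" >&2; return 1 ;;
  esac
}

# Typical use from the build folder:
# make "$(target_for mpi-to-cuda)"
target_for cuda-only
```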
For the Jacobi mini-app, running cmake .. in the build folder created at root level generates targets to build and run on the current node, see CMakeLists:

make jacobi-all-build
: Builds the vanilla, vanilla-tsan, and cusan variants

make jacobi-run or make jacobi-vanilla-run
: Run on the current node

sbatch.sh
: To run on a compute node (requires make jacobi-all-build to be run first)

make jacobi-sbatch
: Submits the generated sbatch script to Slurm
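The compute-node workflow above can be summarized as a dry run that only prints the commands in order. The helper function is illustrative, not part of the repository, and assumes sbatch.sh has been generated by the build as described:

```shell
# Dry run: print the Jacobi compute-node workflow without executing it.
jacobi_cluster_steps() {
  echo "make jacobi-all-build"  # build vanilla, vanilla-tsan, and cusan variants first
  echo "sbatch sbatch.sh"       # submit the generated batch script (or: make jacobi-sbatch)
}
jacobi_cluster_steps
```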
For the TeaLeaf mini-app, running cmake .. in the build folder created at root level generates targets to build and run on the current node, see CMakeLists:

make tealeaf-all-build
: Builds the vanilla, vanilla-tsan, and cusan variants

make tealeaf-run or make tealeaf-vanilla-run
: Run on the current node

sbatch.sh
: To run on a compute node (requires make tealeaf-all-build to be run first)

make tealeaf-sbatch
: Submits the generated sbatch script to Slurm