CUAS-MPI

What is CUAS-MPI?

CUAS-MPI is an MPI-parallel implementation of the Confined-Unconfined Aquifer System model [CUAS18] for subglacial hydrology. The implementation relies on the parallel data structures and linear system solvers provided by the Portable, Extensible Toolkit for Scientific Computation (PETSc). The model uses a one-layer equivalent porous medium (EPM) approach to represent both efficient (channel-like) and inefficient (cavity-like) subglacial water transport. A two-dimensional Darcy-type groundwater flow equation with spatially and temporally varying hydraulic transmissivity is solved under both confined and unconfined aquifer conditions.
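In sketch form, following [CUAS18], the equation solved for the hydraulic head $h$ is of the type

$$ S \, \frac{\partial h}{\partial t} = \nabla \cdot \left( T \, \nabla h \right) + Q , $$

with a storage coefficient $S$, transmissivity $T$, and water source $Q$; the exact storage and transmissivity terms differ between the confined and unconfined regimes and are given in the paper.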

How to use it?

One of the integration tests can be used to generate a simple setup for exploring the modelling choices and command-line options. The example below needs ncks and ncap2 from the NCO toolkit to manipulate the NetCDF files.

# modify according to your installation
CUAS_BUILD_DIR=$CUAS_ROOT/CUAS-MPI/cmake-build-debug/

# number of MPI processes
NN=4

#
# generate simple input file from example integration test
#
exact=$CUAS_BUILD_DIR/test/cuascore/integration/test-exactSteadySolution
mpirun -n $NN $exact 1000.0 101 101 31 86400 out0.nc
# convert output to CUAS-MPI input and forcing file format
ncks -O -d time,-1 -v topg,bnd_mask,thk out0.nc input.nc
ncap2 -O -s "bmelt=watersource * 365*24*3600" out0.nc forcing.nc
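# optional: inspect the generated input and forcing files
# (ncdump ships with the NetCDF tools; this assumes the commands above succeeded)
ncdump -h input.nc
ncdump -h forcing.nc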

#
# run a simple experiment
#
cuas=$CUAS_BUILD_DIR/tools/cuas.exe

# set up the solver
TOL="-ksp_rtol 1e-7 -ksp_atol 1e-15 -ksp_max_it 10000 -ksp_converged_use_min_initial_residual_norm"
export PETSC_OPTIONS="-options_left -ksp_initial_guess_nonzero -pc_type bjacobi -ksp_type gmres $TOL"
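# Any other PETSc runtime options could be used instead; for example, a
# CG / algebraic-multigrid combination (only a sketch, not a tuned recommendation):
# export PETSC_OPTIONS="-options_left -ksp_initial_guess_nonzero -pc_type gamg -ksp_type cg $TOL"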

# make use of many options for this example
mpirun -n $NN  $cuas --totaltime '15 days' --dt '1 hour' --saveEvery 1 --verbose --outputSize large \
       --doChannels --Tmax 100  --Tmin 1.e-08 --initialHead Nzero  $opts \
       --conductivity 10 --layerThickness 0.1 \
       --flowConstant 3.4e-24 --cavityBeta 5e-4 --basalVelocityIce 1e-6 --supplyMultiplier 1.0 \
       --forcingFile forcing.nc  \
       input.nc output.nc
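After the run, the output can be post-processed with the same NCO and NetCDF tools used above; the sketch below simply keeps the last time step (which variables are written to output.nc depends on --outputSize):

# keep the final time step of the model output for a quick look
ncks -O -d time,-1 output.nc output_final.nc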

How to install?

Requirements

As the build commands below show, CUAS-MPI needs CMake, a C++ compiler, an MPI library, PETSc, and NetCDF; ncks and ncap2 from the NCO toolkit are only needed for the usage example above.

Build

Starting from the CUAS-MPI directory:

cmake -B build -DCMAKE_BUILD_TYPE=Release -DPETSC_DIR=<petsc-root-directory> -DNETCDF_DIR=<netcdf-root-directory>
cmake --build build
cmake --install build --prefix <prefix>

Alternatively, you can use the legacy CMake workflow to generate Makefiles and build CUAS-MPI:

mkdir build
cd build
cmake .. -DCMAKE_BUILD_TYPE=Release -DPETSC_DIR=<petsc-root-directory> -DNETCDF_DIR=<netcdf-root-directory> -DCMAKE_INSTALL_PREFIX=<prefix>
make
make install

CUAS-MPI supports the following CMake options:

Option             Default  Description
CUAS_ENABLE_TESTS  OFF      Enables targets for building the tests.
CUAS_ENABLE_DOCS   OFF      Enables targets for building the documentation.
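For example, the test targets can be enabled at configure time; the lines below are only a sketch (whether the tests are registered with CTest or run as individual executables, such as the integration test used above, depends on the build setup):

cmake -B build -DCMAKE_BUILD_TYPE=Release -DCUAS_ENABLE_TESTS=ON \
      -DPETSC_DIR=<petsc-root-directory> -DNETCDF_DIR=<netcdf-root-directory>
cmake --build build
ctest --test-dir build   # requires CMake >= 3.20; otherwise: cd build && ctest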

References

[CUAS18] Beyer, S., Kleiner, T., Aizinger, V., Rückamp, M., and Humbert, A.: A confined–unconfined aquifer model for subglacial hydrology and its application to the Northeast Greenland Ice Stream, The Cryosphere, 3931–3947, 2018.
[CUAS23] Fischler, Y., Kleiner, T., Bischof, C., Schmiedel, J., Sayag, R., Emunds, R., Oestreich, L. F., and Humbert, A.: A parallel implementation of the confined–unconfined aquifer system model for subglacial hydrology: design, verification, and performance analysis (CUAS-MPI v0.1.0), Geoscientific Model Development, 5305–5322, 2023.

CUAS-MPI Applications
