Installation on Expanse

Timo Heister edited this page Jan 1, 2021 · 7 revisions

Author: [email protected]

Installation notes for SDSC Expanse.

For details see the Expanse User Guide

Some notes:

  • To connect: ssh [email protected], followed by gsissh expanse.sdsc.xsede.org
  • Do not work or compile on the login node; instead, submit an interactive job:
    srun --partition=compute  --pty --account=ucd150 --nodes=1 --ntasks-per-node=1 --cpus-per-task=128 \
    --mem=248G -t 02:00:00 --wait=0 --export=ALL /bin/bash
    
    Note: you need to set --ntasks-per-node and --cpus-per-task as shown above; otherwise the job gets only a single core and compilation cannot run in parallel.
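Once the interactive job starts, you can sanity-check how many cores the shell actually has. SLURM_CPUS_PER_TASK is set by srun inside the job; nproc serves as a fallback outside one:

```shell
# Number of cores available for parallel builds: SLURM_CPUS_PER_TASK
# inside a Slurm job, nproc as a fallback on an ordinary machine.
CORES=${SLURM_CPUS_PER_TASK:-$(nproc)}
echo "cores available: $CORES"
```

With the srun line above you should see 128 here; a value of 1 means the allocation did not request the full node.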

Filesystems:

  • /home/<userid>/ - NFS, not for parallel runs, 100GB
  • /scratch/<userid>/job_<jobid>/ - fast, local to the node, wiped when the job ends
  • /expanse/lustre/projects/<projectnumber>/<userid>/ - parallel filesystem, good for parallel IO / large files.

Installation steps:

  1. Pick installation directory:
export DEST=/expanse/lustre/projects/<projectnumber>/<userid>/libs-v3
  2. Create and source an enable.sh script:
mkdir $DEST && cat > $DEST/enable.sh <<- EOM
  module reset
  module load slurm/expanse cpu gcc openmpi openblas cmake
  export CC=mpicc
  export CXX=mpicxx
  export FC=mpif77
EOM

. $DEST/enable.sh
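The enable.sh pattern above can be sketched in isolation: the heredoc writes the environment setup to a file once, and sourcing that file restores the compiler environment in any later shell. A minimal, self-contained sketch, using a temporary directory as a stand-in for the real DEST and omitting the module lines (they only exist on the cluster):

```shell
# Stand-in for the Lustre DEST path; mktemp keeps the sketch self-contained.
DEST=$(mktemp -d)

# Write the environment setup once.
cat > "$DEST/enable.sh" << EOM
export CC=mpicc
export CXX=mpicxx
export FC=mpif77
EOM

# Any later shell restores the same environment by sourcing the file.
. "$DEST/enable.sh"
echo "$CC $CXX $FC"   # → mpicc mpicxx mpif77
```

This is why later steps (and batch jobs) start with `. $DEST/enable.sh`: the build environment is reproduced instead of reconstructed by hand.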
  3. Install deal.II
git clone https://github.com/dealii/candi/
cd candi

cat >> candi.cfg <<- EOM
NATIVE_OPTIMIZATIONS=true
DEAL_CONFOPTS=" -D DEAL_II_COMPONENT_EXAMPLES=OFF "
TRILINOS_CONFOPTS=" -D TPL_BLAS_LIBRARIES=$OPENBLASHOME/lib/libopenblas.so -D TPL_LAPACK_LIBRARIES=$OPENBLASHOME/lib/libopenblas.so "
EOM

./candi.sh -j 64 -p $DEST --platform=deal.II-toolchain/platforms/supported/centos7.platform --packages="once:hdf5 once:p4est once:trilinos dealii"

echo ". $DEST/configuration/enable.sh" >> $DEST/enable.sh
  4. Build ASPECT
cd $HOME
. $DEST/enable.sh
git clone https://github.com/geodynamics/aspect.git
cd aspect
mkdir build
cd build
cmake ..
make release
make -j 50
  5. Running
mpirun -n 32 ./aspect bla.prm
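For production runs, the mpirun line normally goes into a Slurm batch script rather than an interactive shell. A sketch under the same settings as the srun line above (partition and account taken from there; the job name, time limit, core count, input file, and the hard-coded DEST path are placeholders you must adjust):

```shell
#!/bin/bash
#SBATCH --job-name=aspect            # placeholder job name
#SBATCH --partition=compute
#SBATCH --account=ucd150             # use your own allocation
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=128
#SBATCH -t 02:00:00
#SBATCH --export=ALL

# Hard-code the installation path here; environment variables from your
# login shell are not reliably available inside the batch job.
. /expanse/lustre/projects/<projectnumber>/<userid>/libs-v3/enable.sh

mpirun -n 128 ./aspect bla.prm
```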

Jobs

To check the queue, submit a job, or cancel a job:

squeue -u <username>
sbatch <script>
scancel <id>