
XUP-149732: DOC - Create documentation for how to set up 'shared' Spack instances #66

mkandes opened this issue Apr 13, 2023 · 9 comments

mkandes commented Apr 13, 2023

Create separate documentation for users and for SDSC HPC and CSSI team members who will need to use 'shared' Spack instances as part of their testing and validation process before submitting pull requests to the sdsc/spack repo's deployment branches.


mkandes commented Sep 22, 2023

First, you need to configure your shared Spack instance. In the transcript below, we log in to Expanse under a test account and run a configuration script that creates a private, user-level Spack configuration scope under ~/.spack, layered on top of the read-only shared instance via an upstream.

[mkandes@login02 ~]$ sudo -u jpg_test ssh [email protected]
PIN+Yubi: 
Welcome to Bright release         9.0

                                                         Based on Rocky Linux 8
                                                                    ID: #000002

--------------------------------------------------------------------------------

                                 WELCOME TO
                  _______  __ ____  ___    _   _______ ______
                 / ____/ |/ // __ \/   |  / | / / ___// ____/
                / __/  |   // /_/ / /| | /  |/ /\__ \/ __/
               / /___ /   |/ ____/ ___ |/ /|  /___/ / /___
              /_____//_/|_/_/   /_/  |_/_/ |_//____/_____/

--------------------------------------------------------------------------------

Use the following commands to adjust your environment:

'module avail'            - show available modules
'module add <module>'     - adds a module to your environment for this session
'module initadd <module>' - configure module to be loaded at every login

-------------------------------------------------------------------------------
Last login: Fri Sep 22 08:57:07 2023 from 198.202.100.14
[jpg_test@login02 ~]$ ls -lahtr ~/.spack
ls: cannot access '/home/jpg_test/.spack': No such file or directory
[jpg_test@login02 ~]$ md5sum configure-shared-spack-instance.sh 
f9e10917568e113db02af16ba7801650  configure-shared-spack-instance.sh
[jpg_test@login02 ~]$ cat configure-shared-spack-instance.sh 
#!/usr/bin/env bash
#
# Configure a shared Spack instance in your local ~/.spack directory.

declare -xr SHARED_SPACK_VERSION='0.17.3'
declare -xr SHARED_SPACK_INSTANCE_NAME='gpu'
declare -xr SHARED_SPACK_INSTANCE_VERSION='b'
declare -xr SHARED_SPACK_ROOT="/cm/shared/apps/spack/${SHARED_SPACK_VERSION}/${SHARED_SPACK_INSTANCE_NAME}/${SHARED_SPACK_INSTANCE_VERSION}"

declare -xr LOCAL_SPACK_NAMESPACE="${USER}"
declare -xr LOCAL_SPACK_TMPDIR='/tmp'
declare -xr LOCAL_SPACK_ROOT="${HOME}/.spack/${SHARED_SPACK_VERSION}/${SHARED_SPACK_INSTANCE_NAME}/${SHARED_SPACK_INSTANCE_VERSION}"

module reset
module list
. "${SHARED_SPACK_ROOT}/share/spack/setup-env.sh"
printenv

mkdir -p "${LOCAL_SPACK_ROOT}"

mkdir -p "${LOCAL_SPACK_ROOT}/var/spack/repos/${LOCAL_SPACK_NAMESPACE}/packages"
tee -a "${LOCAL_SPACK_ROOT}/var/spack/repos/${LOCAL_SPACK_NAMESPACE}/repo.yaml" << EOF
repo:
  namespace: ${LOCAL_SPACK_NAMESPACE}
EOF

mkdir -p "${LOCAL_SPACK_ROOT}/var/spack/stage"
mkdir -p "${LOCAL_SPACK_ROOT}/var/spack/cache"
mkdir -p "${LOCAL_SPACK_ROOT}/share/spack/modules"
mkdir -p "${LOCAL_SPACK_ROOT}/share/spack/lmod"
mkdir -p "${LOCAL_SPACK_ROOT}/opt/spack"

# The single-quoted values below are Spack projection placeholders; single
# quotes keep the shell from expanding them inside the heredoc that follows.
architecture='${ARCHITECTURE}'
compilername='${COMPILERNAME}'
compilerver='${COMPILERVER}'
package='${PACKAGE}'
version='${VERSION}'
hash='${HASH}'

mkdir -p "${LOCAL_SPACK_ROOT}/etc/spack"
tee -a "${LOCAL_SPACK_ROOT}/config.yaml" << EOF
config:
  install_tree:
    root: ${LOCAL_SPACK_ROOT}/opt/spack
    projections:
      all: ${architecture}/${compilername}-${compilerver}/${package}-${version}-${hash}
  template_dirs:
    - ${SHARED_SPACK_ROOT}/share/spack/templates
  module_roots:
    tcl: ${LOCAL_SPACK_ROOT}/share/spack/modules
    lmod: ${LOCAL_SPACK_ROOT}/share/spack/lmod
  build_stage:
    - ${LOCAL_SPACK_ROOT}/var/spack/stage
    - ${LOCAL_SPACK_TMPDIR}/${USER}/spack-stage
  source_cache: ${LOCAL_SPACK_ROOT}/var/spack/cache
  misc_cache: ~/.spack/cache
  connect_timeout: 10
  verify_ssl: true
  suppress_gpg_warnings: false
  install_missing_compilers: false
  checksum: true
  dirty: false
  build_language: C
  locks: true
  build_jobs: 1
  ccache: false
  db_lock_timeout: 3
  package_lock_timeout: null
  shared_linking: 'rpath'
  allow_sgid: true
EOF

tee -a "${LOCAL_SPACK_ROOT}/repos.yaml" << EOF
repos:
  - ${LOCAL_SPACK_ROOT}/var/spack/repos/${LOCAL_SPACK_NAMESPACE}
EOF

tee -a "${LOCAL_SPACK_ROOT}/upstreams.yaml" << EOF
upstreams:
  spack-instance-1:
    install_tree: ${SHARED_SPACK_ROOT}/opt/spack
EOF
[jpg_test@login02 ~]$ ./configure-shared-spack-instance.sh 
Resetting modules to system default. Reseting $MODULEPATH back to system default. All extra directories will be removed from $MODULEPATH.

Currently Loaded Modules:
  1) shared            3) slurm/expanse/21.08.8   5) DefaultModules
  2) cpu/0.17.3b (c)   4) sdsc/1.0

  Where:
   c:  built natively for AMD Rome

 

CONDA_SHLVL=0
LD_LIBRARY_PATH=/cm/shared/apps/slurm/current/lib64/slurm:/cm/shared/apps/slurm/current/lib64
LS_COLORS=rs=0:di=38;5;33:ln=38;5;51:mh=00:pi=40;38;5;11:so=38;5;13:do=38;5;5:bd=48;5;232;38;5;11:cd=48;5;232;38;5;3:or=48;5;232;38;5;9:mi=01;05;37;41:su=48;5;196;38;5;15:sg=48;5;11;38;5;16:ca=48;5;196;38;5;226:tw=48;5;10;38;5;16:ow=48;5;10;38;5;21:st=48;5;21;38;5;15:ex=38;5;40:*.tar=38;5;9:*.tgz=38;5;9:*.arc=38;5;9:*.arj=38;5;9:*.taz=38;5;9:*.lha=38;5;9:*.lz4=38;5;9:*.lzh=38;5;9:*.lzma=38;5;9:*.tlz=38;5;9:*.txz=38;5;9:*.tzo=38;5;9:*.t7z=38;5;9:*.zip=38;5;9:*.z=38;5;9:*.dz=38;5;9:*.gz=38;5;9:*.lrz=38;5;9:*.lz=38;5;9:*.lzo=38;5;9:*.xz=38;5;9:*.zst=38;5;9:*.tzst=38;5;9:*.bz2=38;5;9:*.bz=38;5;9:*.tbz=38;5;9:*.tbz2=38;5;9:*.tz=38;5;9:*.deb=38;5;9:*.rpm=38;5;9:*.jar=38;5;9:*.war=38;5;9:*.ear=38;5;9:*.sar=38;5;9:*.rar=38;5;9:*.alz=38;5;9:*.ace=38;5;9:*.zoo=38;5;9:*.cpio=38;5;9:*.7z=38;5;9:*.rz=38;5;9:*.cab=38;5;9:*.wim=38;5;9:*.swm=38;5;9:*.dwm=38;5;9:*.esd=38;5;9:*.jpg=38;5;13:*.jpeg=38;5;13:*.mjpg=38;5;13:*.mjpeg=38;5;13:*.gif=38;5;13:*.bmp=38;5;13:*.pbm=38;5;13:*.pgm=38;5;13:*.ppm=38;5;13:*.tga=38;5;13:*.xbm=38;5;13:*.xpm=38;5;13:*.tif=38;5;13:*.tiff=38;5;13:*.png=38;5;13:*.svg=38;5;13:*.svgz=38;5;13:*.mng=38;5;13:*.pcx=38;5;13:*.mov=38;5;13:*.mpg=38;5;13:*.mpeg=38;5;13:*.m2v=38;5;13:*.mkv=38;5;13:*.webm=38;5;13:*.ogm=38;5;13:*.mp4=38;5;13:*.m4v=38;5;13:*.mp4v=38;5;13:*.vob=38;5;13:*.qt=38;5;13:*.nuv=38;5;13:*.wmv=38;5;13:*.asf=38;5;13:*.rm=38;5;13:*.rmvb=38;5;13:*.flc=38;5;13:*.avi=38;5;13:*.fli=38;5;13:*.flv=38;5;13:*.gl=38;5;13:*.dl=38;5;13:*.xcf=38;5;13:*.xwd=38;5;13:*.yuv=38;5;13:*.cgm=38;5;13:*.emf=38;5;13:*.ogv=38;5;13:*.ogx=38;5;13:*.aac=38;5;45:*.au=38;5;45:*.flac=38;5;45:*.m4a=38;5;45:*.mid=38;5;45:*.midi=38;5;45:*.mka=38;5;45:*.mp3=38;5;45:*.mpc=38;5;45:*.ogg=38;5;45:*.ra=38;5;45:*.wav=38;5;45:*.oga=38;5;45:*.opus=38;5;45:*.spx=38;5;45:*.xspf=38;5;45:
CONDA_EXE=/home/jpg/anaconda3/bin/conda
LOCAL_SPACK_TMPDIR=/tmp
__LMOD_REF_COUNT_PATH=/cm/shared/apps/sdsc/1.0/bin:1;/cm/shared/apps/sdsc/1.0/sbin:1;/cm/shared/apps/slurm/current/sbin:1;/cm/shared/apps/slurm/current/bin:1;/home/jpg/anaconda3/condabin:1;/home/jpg_test/.local/bin:1;/home/jpg_test/bin:1;/usr/local/bin:1;/usr/bin:1;/usr/local/sbin:1;/usr/sbin:1
_ModuleTable002_=XT0yLHByb3BUPXthcmNoPXtbImNwdSJdPTEsfSx9LFsic3RhY2tEZXB0aCJdPTEsWyJzdGF0dXMiXT0iYWN0aXZlIixbInVzZXJOYW1lIl09ImNwdSIsfSxzZHNjPXtbImZuIl09Ii9jbS9zaGFyZWQvbW9kdWxlZmlsZXMvc2RzYy8xLjAubHVhIixbImZ1bGxOYW1lIl09InNkc2MvMS4wIixbImxvYWRPcmRlciJdPTQscHJvcFQ9e30sWyJzdGFja0RlcHRoIl09MSxbInN0YXR1cyJdPSJhY3RpdmUiLFsidXNlck5hbWUiXT0ic2RzYyIsfSxzaGFyZWQ9e1siZm4iXT0iL2NtL2xvY2FsL21vZHVsZWZpbGVzL3NoYXJlZCIsWyJmdWxsTmFtZSJdPSJzaGFyZWQiLFsibG9hZE9yZGVyIl09MSxwcm9wVD17fSxbInN0YWNrRGVwdGgiXT0xLFsic3RhdHVzIl09ImFjdGl2ZSIsWyJ1c2Vy
SSH_CONNECTION=198.202.100.14 57476 198.202.100.14 22
SPACK_PYTHON=/usr/bin/python3
LANG=en_US.UTF-8
HISTCONTROL=ignoredups
SDSC_BIN=/cm/shared/apps/sdsc/1.0/bin
SHARED_SPACK_ROOT=/cm/shared/apps/spack/0.17.3/gpu/b
SHARED_SPACK_INSTANCE_VERSION=b
HOSTNAME=login02
LMOD_SYSTEM_DEFAULT_MODULES=DefaultModules
__LMOD_REF_COUNT__LMFILES_=/cm/local/modulefiles/shared:1;/usr/share/modulefiles/cpu/0.17.3b.lua:1;/cm/local/modulefiles/slurm/expanse/21.08.8:1;/cm/shared/modulefiles/sdsc/1.0.lua:1;/usr/share/modulefiles/DefaultModules.lua:1
SDSC_DIR=/cm/shared/apps/sdsc/1.0
__LMOD_REF_COUNT_LD_LIBRARY_PATH=/cm/shared/apps/slurm/current/lib64/slurm:1;/cm/shared/apps/slurm/current/lib64:1
_ModuleTable004_=dWxlZmlsZXMiLCIvdXNyL3NoYXJlL01vZHVsZXMvbW9kdWxlZmlsZXMiLCIvY20vc2hhcmVkL21vZHVsZWZpbGVzIix9LFsic3lzdGVtQmFzZU1QQVRIIl09Ii9jbS9sb2NhbC9tb2R1bGVmaWxlczovY20vc2hhcmVkL2FwcHMvYWNjZXNzL21vZHVsZWZpbGVzOi9jbS9zaGFyZWQvbW9kdWxlZmlsZXM6L2V0Yy9tb2R1bGVmaWxlczovdXNyL3NoYXJlL21vZHVsZWZpbGVzOi91c3Ivc2hhcmUvTW9kdWxlcy9tb2R1bGVmaWxlcyIsfQ==
S_COLORS=auto
_CE_M=
which_declare=declare -f
XDG_SESSION_ID=146272
LOCAL_SPACK_ROOT=/home/jpg_test/.spack/0.17.3/gpu/b
USER=jpg_test
SHARED_SPACK_VERSION=0.17.3
__LMOD_REF_COUNT_MODULEPATH=/cm/shared/apps/spack/0.17.3/cpu/b/share/spack/lmod/linux-rocky8-x86_64/Core:1;/cm/local/modulefiles:1;/cm/shared/apps/access/modulefiles:1;/etc/modulefiles:1;/usr/share/modulefiles:1;/usr/share/Modules/modulefiles:1;/cm/shared/modulefiles:3
__LMOD_REF_COUNT_LOADEDMODULES=shared:1;cpu/0.17.3b:1;slurm/expanse/21.08.8:1;sdsc/1.0:1;DefaultModules:1
PWD=/home/jpg_test
ENABLE_LMOD=1
HOME=/home/jpg_test
CONDA_PYTHON_EXE=/home/jpg/anaconda3/bin/python
LMOD_COLORIZE=yes
LMOD_SYSHOST=expanse
SSH_CLIENT=198.202.100.14 57476 22
LMOD_VERSION=8.2.4
CPATH=/cm/shared/apps/slurm/current/include
LMOD_SETTARG_CMD=:
BASH_ENV=/usr/share/lmod/lmod/init/bash
SDSC_SPACK_STACK=cpu
_CE_CONDA=
__LMOD_REF_COUNT_LIBRARY_PATH=/cm/shared/apps/slurm/current/lib64/slurm:1;/cm/shared/apps/slurm/current/lib64:1
LIBRARY_PATH=/cm/shared/apps/slurm/current/lib64/slurm:/cm/shared/apps/slurm/current/lib64
LMOD_sys=Linux
_ModuleTable001_=X01vZHVsZVRhYmxlXz17WyJNVHZlcnNpb24iXT0zLFsiY19yZWJ1aWxkVGltZSJdPWZhbHNlLFsiY19zaG9ydFRpbWUiXT1mYWxzZSxkZXB0aFQ9e30sZmFtaWx5PXt9LG1UPXtEZWZhdWx0TW9kdWxlcz17WyJmbiJdPSIvdXNyL3NoYXJlL21vZHVsZWZpbGVzL0RlZmF1bHRNb2R1bGVzLmx1YSIsWyJmdWxsTmFtZSJdPSJEZWZhdWx0TW9kdWxlcyIsWyJsb2FkT3JkZXIiXT01LHByb3BUPXt9LFsic3RhY2tEZXB0aCJdPTAsWyJzdGF0dXMiXT0iYWN0aXZlIixbInVzZXJOYW1lIl09IkRlZmF1bHRNb2R1bGVzIix9LGNwdT17WyJmbiJdPSIvdXNyL3NoYXJlL21vZHVsZWZpbGVzL2NwdS8wLjE3LjNiLmx1YSIsWyJmdWxsTmFtZSJdPSJjcHUvMC4xNy4zYiIsWyJsb2FkT3JkZXIi
SLURM_CONF=/cm/shared/apps/slurm/var/etc/expanse/slurm.conf
LOADEDMODULES=shared:cpu/0.17.3b:slurm/expanse/21.08.8:sdsc/1.0:DefaultModules
__LMOD_REF_COUNT_MANPATH=/cm/shared/apps/slurm/current/man:2;/usr/share/lmod/lmod/share/man:1;/usr/local/share/man:1;/usr/share/man:1;/cm/local/apps/environment-modules/current/share/man:1
LOCAL_SPACK_NAMESPACE=jpg_test
_ModuleTable003_=TmFtZSJdPSJzaGFyZWQiLH0sc2x1cm09e1siZm4iXT0iL2NtL2xvY2FsL21vZHVsZWZpbGVzL3NsdXJtL2V4cGFuc2UvMjEuMDguOCIsWyJmdWxsTmFtZSJdPSJzbHVybS9leHBhbnNlLzIxLjA4LjgiLFsibG9hZE9yZGVyIl09Myxwcm9wVD17fSxbInN0YWNrRGVwdGgiXT0yLFsic3RhdHVzIl09ImFjdGl2ZSIsWyJ1c2VyTmFtZSJdPSJzbHVybSIsfSx9LG1wYXRoQT17Ii9jbS9zaGFyZWQvYXBwcy9zcGFjay8wLjE3LjMvY3B1L2Ivc2hhcmUvc3BhY2svbG1vZC9saW51eC1yb2NreTgteDg2XzY0L0NvcmUiLCIvY20vbG9jYWwvbW9kdWxlZmlsZXMiLCIvY20vc2hhcmVkL2FwcHMvYWNjZXNzL21vZHVsZWZpbGVzIiwiL2V0Yy9tb2R1bGVmaWxlcyIsIi91c3Ivc2hhcmUvbW9k
LMOD_ROOT=/usr/share/lmod
SSH_TTY=/dev/pts/298
MAIL=/var/spool/mail/jpg_test
LMOD_arch=x86_64
__Init_Default_Modules=1
CMD_WLM_CLUSTER_NAME=expanse
SPACK_ROOT=/cm/shared/apps/spack/0.17.3/gpu/b
SHELL=/bin/bash
TERM=xterm-256color
_ModuleTable_Sz_=4
__LMOD_REF_COUNT_CPATH=/cm/shared/apps/slurm/current/include:1
SHARED_SPACK_INSTANCE_NAME=gpu
SDSC_LIB=/cm/shared/apps/sdsc/1.0/lib
SHLVL=2
SDSC_SBIN=/cm/shared/apps/sdsc/1.0/sbin
MANPATH=/cm/shared/apps/slurm/current/man:/usr/share/lmod/lmod/share/man::/usr/local/share/man:/usr/share/man:/cm/local/apps/environment-modules/current/share/man
LMOD_PREPEND_BLOCK=normal
MODULEPATH=/cm/shared/apps/spack/0.17.3/cpu/b/share/spack/lmod/linux-rocky8-x86_64/Core:/cm/local/modulefiles:/cm/shared/apps/access/modulefiles:/etc/modulefiles:/usr/share/modulefiles:/usr/share/Modules/modulefiles:/cm/shared/modulefiles
LOGNAME=jpg_test
DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/527971/bus
XDG_RUNTIME_DIR=/run/user/527971
MODULEPATH_ROOT=/usr/share/modulefiles
LMOD_PACKAGE_PATH=/usr/share/lmod/etc/site/lmod
PATH=/cm/shared/apps/spack/0.17.3/gpu/b/bin:/cm/shared/apps/sdsc/1.0/bin:/cm/shared/apps/sdsc/1.0/sbin:/cm/shared/apps/slurm/current/sbin:/cm/shared/apps/slurm/current/bin:/home/jpg/anaconda3/condabin:/home/jpg_test/.local/bin:/home/jpg_test/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin
_LMFILES_=/cm/local/modulefiles/shared:/usr/share/modulefiles/cpu/0.17.3b.lua:/cm/local/modulefiles/slurm/expanse/21.08.8:/cm/shared/modulefiles/sdsc/1.0.lua:/usr/share/modulefiles/DefaultModules.lua
DEBUGINFOD_URLS=https://debuginfod.centos.org/ 
MODULESHOME=/usr/share/lmod/lmod
LMOD_SETTARG_FULL_SUPPORT=no
HISTSIZE=1000
SDSC_INC=/cm/shared/apps/sdsc/1.0/include
LMOD_PKG=/usr/share/lmod/lmod
LMOD_CMD=/usr/share/lmod/lmod/libexec/lmod
LESSOPEN=||/usr/bin/lesspipe.sh %s
LMOD_FULL_SETTARG_SUPPORT=no
LMOD_DIR=/usr/share/lmod/lmod/libexec
BASH_FUNC_which%%=() {  ( alias;
 eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@
}
BASH_FUNC_module%%=() {  eval $($LMOD_CMD bash "$@") && eval $(${LMOD_SETTARG_CMD:-:} -s sh)
}
BASH_FUNC_spack%%=() {  : this is a shell function from: /cm/shared/apps/spack/0.17.3/gpu/b/share/spack/setup-env.sh;
 : the real spack script is here: /cm/shared/apps/spack/0.17.3/gpu/b/bin/spack;
 _spack_shell_wrapper "$@";
 return $?
}
BASH_FUNC__spack_shell_wrapper%%=() {  for var in LD_LIBRARY_PATH DYLD_LIBRARY_PATH DYLD_FALLBACK_LIBRARY_PATH;
 do
 eval "if [ -n \"\${${var}-}\" ]; then export SPACK_$var=\${${var}}; fi";
 done;
 if [ -n "${ZSH_VERSION:-}" ]; then
 emulate -L sh;
 fi;
 _sp_flags="";
 while [ ! -z ${1+x} ] && [ "${1#-}" != "${1}" ]; do
 _sp_flags="$_sp_flags $1";
 shift;
 done;
 if [ -n "$_sp_flags" ] && [ "${_sp_flags#*h}" != "${_sp_flags}" ] || [ "${_sp_flags#*V}" != "${_sp_flags}" ]; then
 command spack $_sp_flags "$@";
 return;
 fi;
 _sp_subcommand="";
 if [ ! -z ${1+x} ]; then
 _sp_subcommand="$1";
 shift;
 fi;
 case $_sp_subcommand in 
 "cd")
 _sp_arg="";
 if [ -n "$1" ]; then
 _sp_arg="$1";
 shift;
 fi;
 if [ "$_sp_arg" = "-h" ] || [ "$_sp_arg" = "--help" ]; then
 command spack cd -h;
 else
 LOC="$(spack location $_sp_arg "$@")";
 if [ -d "$LOC" ]; then
 cd "$LOC";
 else
 return 1;
 fi;
 fi;
 return
 ;;
 "env")
 _sp_arg="";
 if [ -n "$1" ]; then
 _sp_arg="$1";
 shift;
 fi;
 if [ "$_sp_arg" = "-h" ] || [ "$_sp_arg" = "--help" ]; then
 command spack env -h;
 else
 case $_sp_arg in 
 activate)
 _a=" $@";
 if [ -z ${1+x} ] || [ "${_a#* --sh}" != "$_a" ] || [ "${_a#* --csh}" != "$_a" ] || [ "${_a#* -h}" != "$_a" ] || [ "${_a#* --help}" != "$_a" ]; then
 command spack env activate "$@";
 else
 stdout="$(command spack $_sp_flags env activate --sh "$@")" || return;
 eval "$stdout";
 fi
 ;;
 deactivate)
 _a=" $@";
 if [ "${_a#* --sh}" != "$_a" ] || [ "${_a#* --csh}" != "$_a" ]; then
 command spack env deactivate "$@";
 else
 if [ -n "$*" ]; then
 command spack env deactivate -h;
 else
 stdout="$(command spack $_sp_flags env deactivate --sh)" || return;
 eval "$stdout";
 fi;
 fi
 ;;
 *)
 command spack env $_sp_arg "$@"
 ;;
 esac;
 fi;
 return
 ;;
 "load" | "unload")
 _a=" $@";
 if [ "${_a#* --sh}" != "$_a" ] || [ "${_a#* --csh}" != "$_a" ] || [ "${_a#* -h}" != "$_a" ] || [ "${_a#* --list}" != "$_a" ] || [ "${_a#* --help}" != "$_a" ]; then
 command spack $_sp_flags $_sp_subcommand "$@";
 else
 stdout="$(command spack $_sp_flags $_sp_subcommand --sh "$@")" || return;
 eval "$stdout";
 fi
 ;;
 *)
 command spack $_sp_flags $_sp_subcommand "$@"
 ;;
 esac
}
BASH_FUNC_ml%%=() {  eval $($LMOD_DIR/ml_cmd "$@")
}
_=/usr/bin/printenv
repo:
  namespace: jpg_test
config:
  install_tree: 
    root: /home/jpg_test/.spack/0.17.3/gpu/b/opt/spack
    projections:
      all: ${ARCHITECTURE}/${COMPILERNAME}-${COMPILERVER}/${PACKAGE}-${VERSION}-${HASH}
  template_dirs:
    - /cm/shared/apps/spack/0.17.3/gpu/b/share/spack/templates
  module_roots:
    tcl: /home/jpg_test/.spack/0.17.3/gpu/b/share/spack/modules
    lmod: /home/jpg_test/.spack/0.17.3/gpu/b/share/spack/lmod
  build_stage:
    - /home/jpg_test/.spack/0.17.3/gpu/b/var/spack/stage
    - /tmp/jpg_test/spack-stage
  source_cache: /home/jpg_test/.spack/0.17.3/gpu/b/var/spack/cache
  misc_cache: ~/.spack/cache
  connect_timeout: 10
  verify_ssl: true
  suppress_gpg_warnings: false
  install_missing_compilers: false
  checksum: true
  dirty: false
  build_language: C
  locks: true
  build_jobs: 1
  ccache: false
  db_lock_timeout: 3
  package_lock_timeout: null
  shared_linking: 'rpath'
  allow_sgid: true
repos:
  - /home/jpg_test/.spack/0.17.3/gpu/b/var/spack/repos/jpg_test
upstreams:
  spack-instance-1:
    install_tree: /cm/shared/apps/spack/0.17.3/gpu/b/opt/spack
[jpg_test@login02 ~]$
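
To sanity-check the result (a quick verification, not part of the original transcript), you can list the tree the script just created and review the generated configuration files:

ls -R ~/.spack/0.17.3/gpu/b
cat ~/.spack/0.17.3/gpu/b/{config,repos,upstreams}.yaml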


mkandes commented Sep 22, 2023

Once a shared instance is configured, you can activate it and run spack commands from your own user account for debugging, testing, and preparing production-ready Spack specs and/or packages that you want to submit as pull requests to the sdsc/spack repo.

[jpg_test@login02 ~]$ spack --version
-bash: spack: command not found
[jpg_test@login02 ~]$ md5sum activate-shared-spack-instance.sh 
f1b55debed9c7dcfdddc91cc089d4e39  activate-shared-spack-instance.sh
[jpg_test@login02 ~]$ cat activate-shared-spack-instance.sh 
#!/usr/bin/env bash
#
# Activate a shared Spack instance together with your local ~/.spack configuration.

declare -xr SHARED_SPACK_VERSION='0.17.3'
declare -xr SHARED_SPACK_INSTANCE_NAME='gpu'
declare -xr SHARED_SPACK_INSTANCE_VERSION='b'
declare -xr SHARED_SPACK_ROOT="/cm/shared/apps/spack/${SHARED_SPACK_VERSION}/${SHARED_SPACK_INSTANCE_NAME}/${SHARED_SPACK_INSTANCE_VERSION}"

declare -xr LOCAL_SPACK_NAMESPACE="${USER}"
declare -xr LOCAL_SPACK_TMPDIR='/tmp'
declare -xr LOCAL_SPACK_ROOT="${HOME}/.spack/${SHARED_SPACK_VERSION}/${SHARED_SPACK_INSTANCE_NAME}/${SHARED_SPACK_INSTANCE_VERSION}"

. "${SHARED_SPACK_ROOT}/share/spack/setup-env.sh"
module use "${LOCAL_SPACK_ROOT}/share/spack/lmod/linux-rocky8-x86_64"

alias spack="spack --config-scope ${LOCAL_SPACK_ROOT}"
[jpg_test@login02 ~]$ source activate-shared-spack-instance.sh 
[jpg_test@login02 ~]$ spack --version
0.17.3
[jpg_test@login02 ~]$ spack info dftbplus
MakefilePackage:   dftbplus

Description:
    DFTB+ is an implementation of the Density Functional based Tight Binding
    (DFTB) method, containing many extensions to the original method.

Homepage: https://www.dftbplus.org

Externally Detectable: 
    False

Tags: 
    None

Preferred version:  
    19.1    https://github.com/dftbplus/dftbplus/archive/19.1.tar.gz

Safe versions:  
    19.1    https://github.com/dftbplus/dftbplus/archive/19.1.tar.gz

Deprecated versions:  
    None

Variants:
    Name [Default]     When    Allowed values    Description
    ===============    ====    ==============    ===========================================================================================================================

    arpack [off]       --      on, off           Use ARPACK for excited state DFTB functionality
    dftd3 [off]        --      on, off           Use DftD3 dispersion library (if you need this dispersion model)
    elsi [off]         --      on, off           Use the ELSI library for large scale systems. Only has any effect if you build with '+mpi'
    gpu [off]          --      on, off           Use the MAGMA library for GPU accelerated computation
    mpi [on]           --      on, off           Build an MPI-paralelised version of the code.
    sockets [off]      --      on, off           Whether the socket library (external control) should be linked
    transport [off]    --      on, off           Whether transport via libNEGF should be included. Only affects parallel build. (serial version is built without
                                                 libNEGF/transport)

Installation Phases:
    edit    build    install

Build Dependencies:
    arpack-ng  blas  dftd3-lib  elsi  lapack  magma  mpi  scalapack

Link Dependencies:
    arpack-ng  blas  dftd3-lib  elsi  lapack  magma  mpi  scalapack

Run Dependencies:
    None

Virtual Packages: 
    None

[jpg_test@login02 ~]$
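
To confirm that the local configuration scope is actually being layered on top of the shared instance's settings, spack config blame annotates each setting with the file it came from; with the alias above in effect, the settings written by the configuration script should be attributed to files under ~/.spack/0.17.3/gpu/b:

spack config blame config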


mkandes commented Oct 6, 2023

Let's work through an example use case: building and testing a software package from your user account and HOME directory using the shared-instance configuration, prior to deploying it into one of the production instances on Expanse.

For this example, we will use an open user request for a specific software package: a quantum chemistry code called dftbplus, which, as you can see above, already has a Spack package.

The request came in via the Access Ticketing System (ATS) a few weeks ago; the ticket is available here: ATS-2938. After an initial investigation to determine whether the request could be fulfilled using Spack and deployed into the expanse/0.17.3/cpu/b production instance, a GitHub issue (#109) was created to document and track progress.

The request is from an SDSC user who previously ran this code on Comet. She has requested some specific optional dependencies to be included in the build of the package. Since this package was not available via Rocks when she ran on Comet, we previously provided her with custom build scripts that compile the package with these optional dependencies. Unfortunately, while Spack does have a dftbplus package that supports the requested optional dependencies, it only supports up to dftbplus v19.1, which is now quite old. Moreover, the latest versions of dftbplus have changed quite a bit and have re-implemented how some of the optional components requested by the user are supported. As such, we will have to update the Spack package to support the latest versions in order to fulfill this request using Spack.
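
To see at a glance which versions the packaged recipe currently supports versus what is available upstream, spack versions queries both (the remote listing requires network access; run this within an activated instance):

spack versions dftbplus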

Despite the challenges with the current Spack packaging, it may be worthwhile to build and test dftbplus v19.1 anyway, to compare against the latest versions, which we will first build for the user via custom build script(s). So, let's get started ...


mkandes commented Oct 6, 2023

If you need to deploy a new application and/or update an existing one using Spack in one of SDSC's production instances, the first step is to check if we already have an existing spec (*.sh) build job script for a different variant of the application in the target deployment branch to use as a starting point. The easiest way to make this determination is to search the sdsc/spack GitHub repository for the name of the application. For example, searching for dftbplus returns the following results:

https://github.com/search?q=repo%3Asdsc%2Fspack%20dftbplus&type=code

As you can see, there are a number of spec build scripts we can use as a starting point for our build test using the shared Spack instance configuration. Note, however, that none of these build specs were deployed into production on Expanse --- we only have a copy in the older cpu/0.15.4 instance at this time. So, there may have been an issue with these build specs, or maybe I simply chose not to redeploy dftbplus until we had more interest from users; I don't recall which is the case at this time.
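
If you prefer the command line to the GitHub web UI, a local clone can be searched with git grep; a minimal sketch (this searches the currently checked-out branch, so check out the deployment branch of interest first):

git clone https://github.com/sdsc/spack.git
cd spack
git grep -l 'dftbplus' -- etc/spack/sdsc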

[mkandes@login01 ~]$ module spider dftbplus

----------------------------------------------------------------------------
  dftbplus: dftbplus/19.1-openblas
----------------------------------------------------------------------------

    You will need to load all module(s) on any one of the lines below before the "dftbplus/19.1-openblas" module is available to load.

      cpu/0.15.4  intel/19.1.1.217  intel-mpi/2019.8.254
 
    Help:
      DFTB+ is an implementation of the Density Functional based Tight Binding
      (DFTB) method, containing many extensions to the original method.

[mkandes@login01 ~]$

dftbplus is not currently deployed on TSCC2 either.

[mkandes@login1 ~]$ module spider dftbplus
Lmod has detected the following error:  Unable to find: "dftbplus".



[mkandes@login1 ~]$


mkandes commented Oct 6, 2023

Let's go ahead and select this spec build script as our starting point, downloading it to our HOME directory.

wget https://raw.githubusercontent.com/sdsc/spack/1297919e45f9dcc11e626ee3cccd4d5181b0a5e1/etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc%4010.2.0/openmpi%404.1.3/dftbplus%4019.1.sh


mkandes commented Oct 6, 2023

And for reference, this is the current state of the spec build script, which may well change as part of this test build process.

[mkandes@login02 ~]$ wget https://raw.githubusercontent.com/sdsc/spack/1297919e45f9dcc11e626ee3cccd4d5181b0a5e1/etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc%4010.2.0/openmpi%404.1.3/dftbplus%4019.1.sh
--2023-10-06 14:49:51--  https://raw.githubusercontent.com/sdsc/spack/1297919e45f9dcc11e626ee3cccd4d5181b0a5e1/etc/spack/sdsc/expanse/0.17.3/cpu/b/specs/gcc%4010.2.0/openmpi%404.1.3/dftbplus%4019.1.sh
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.111.133, 185.199.110.133, 185.199.108.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.111.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2313 (2.3K) [text/plain]
Saving to: ‘[email protected][email protected]    100%[===================>]   2.26K  --.-KB/s    in 0s      

2023-10-06 14:49:52 (58.8 MB/s) - ‘[email protected]’ saved [2313/2313]

[mkandes@login02 ~]$ vi [email protected] 
[mkandes@login02 ~]$ cat [email protected] 
#!/usr/bin/env bash

#SBATCH [email protected]
#SBATCH --account=use300
#SBATCH --reservation=rocky8u7_testing
#SBATCH --partition=ind-shared
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --cpus-per-task=16
#SBATCH --mem=32G
#SBATCH --time=00:30:00
#SBATCH --output=%x.o%j.%N

declare -xr LOCAL_TIME="$(date +'%Y%m%dT%H%M%S%z')"
declare -xir UNIX_TIME="$(date +'%s')"

declare -xr LOCAL_SCRATCH_DIR="/scratch/${USER}/job_${SLURM_JOB_ID}"
declare -xr TMPDIR="${LOCAL_SCRATCH_DIR}"

declare -xr SYSTEM_NAME='expanse'

declare -xr SPACK_VERSION='0.17.3'
declare -xr SPACK_INSTANCE_NAME='cpu'
declare -xr SPACK_INSTANCE_VERSION='b'
declare -xr SPACK_INSTANCE_DIR="/cm/shared/apps/spack/${SPACK_VERSION}/${SPACK_INSTANCE_NAME}/${SPACK_INSTANCE_VERSION}"

declare -xr SLURM_JOB_SCRIPT="$(scontrol show job ${SLURM_JOB_ID} | awk -F= '/Command=/{print $2}')"
declare -xr SLURM_JOB_MD5SUM="$(md5sum ${SLURM_JOB_SCRIPT})"

declare -xr SCHEDULER_MODULE='slurm'

echo "${UNIX_TIME} ${SLURM_JOB_ID} ${SLURM_JOB_MD5SUM} ${SLURM_JOB_DEPENDENCY}" 
echo ""

cat "${SLURM_JOB_SCRIPT}"

module purge
module load "${SCHEDULER_MODULE}"
module list
. "${SPACK_INSTANCE_DIR}/share/spack/setup-env.sh"

declare -xr SPACK_PACKAGE='[email protected]'
declare -xr SPACK_COMPILER='[email protected]'
declare -xr SPACK_VARIANTS='+arpack +dftd3 ~elsi ~gpu +mpi ~sockets ~transport'
declare -xr SPACK_DEPENDENCIES="^[email protected]/$(spack find --format '{hash:7}' [email protected] % ${SPACK_COMPILER}) ^[email protected]/$(spack find --format '{hash:7}' [email protected] % ${SPACK_COMPILER} ^[email protected])"
declare -xr SPACK_SPEC="${SPACK_PACKAGE} % ${SPACK_COMPILER} ${SPACK_VARIANTS} ${SPACK_DEPENDENCIES}"

printenv

spack config get compilers
spack config get config  
spack config get mirrors
spack config get modules
spack config get packages
spack config get repos
spack config get upstreams

time -p spack spec --long --namespaces --types "${SPACK_SPEC}"
if [[ "${?}" -ne 0 ]]; then
  echo 'ERROR: spack concretization failed.'
  exit 1
fi

time -p spack install --jobs "${SLURM_CPUS_PER_TASK}" --fail-fast --yes-to-all "${SPACK_SPEC}"
if [[ "${?}" -ne 0 ]]; then
  echo 'ERROR: spack install failed.'
  exit 1
fi

#spack module lmod refresh --delete-tree -y

#sbatch --dependency="afterok:${SLURM_JOB_ID}" ''

sleep 30
[mkandes@login02 ~]$
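
Note how the SPACK_DEPENDENCIES line in the script above pins each dependency to an exact installation by hash, so the concretizer reuses the builds already present in the shared instance. With the cpu/b instance's setup-env.sh sourced, you can run the embedded queries by hand to see what they resolve to:

spack find --format '{hash:7}' [email protected] % [email protected]
spack find --format '{hash:7}' [email protected] % [email protected] ^[email protected]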


mkandes commented Oct 6, 2023

To run the spec build job script in the shared Spack instance configuration, you only need to make minor modifications, roughly sketched below. Most importantly, you need to source the activation script.
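
Relative to the original script, the changes amount to the following (a unified-diff-style sketch; the reservation change is incidental to this example):

-#SBATCH --reservation=rocky8u7_testing
+##SBATCH --reservation=root_73
-. "${SPACK_INSTANCE_DIR}/share/spack/setup-env.sh"
+#. "${SPACK_INSTANCE_DIR}/share/spack/setup-env.sh"
+shopt -s expand_aliases
+source activate-shared-spack-instance.sh

The shopt -s expand_aliases line is needed because the activation script wraps spack in a shell alias, and bash does not expand aliases in non-interactive scripts by default.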

[mkandes@login02 ~]$ cat [email protected] 
#!/usr/bin/env bash

#SBATCH [email protected]
#SBATCH --account=use300
##SBATCH --reservation=root_73
#SBATCH --partition=ind-shared
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --cpus-per-task=16
#SBATCH --mem=32G
#SBATCH --time=00:30:00
#SBATCH --output=%x.o%j.%N

declare -xr LOCAL_TIME="$(date +'%Y%m%dT%H%M%S%z')"
declare -xir UNIX_TIME="$(date +'%s')"

declare -xr LOCAL_SCRATCH_DIR="/scratch/${USER}/job_${SLURM_JOB_ID}"
declare -xr TMPDIR="${LOCAL_SCRATCH_DIR}"

declare -xr SYSTEM_NAME='expanse'

declare -xr SPACK_VERSION='0.17.3'
declare -xr SPACK_INSTANCE_NAME='cpu'
declare -xr SPACK_INSTANCE_VERSION='b'
declare -xr SPACK_INSTANCE_DIR="/cm/shared/apps/spack/${SPACK_VERSION}/${SPACK_INSTANCE_NAME}/${SPACK_INSTANCE_VERSION}"

declare -xr SLURM_JOB_SCRIPT="$(scontrol show job ${SLURM_JOB_ID} | awk -F= '/Command=/{print $2}')"
declare -xr SLURM_JOB_MD5SUM="$(md5sum ${SLURM_JOB_SCRIPT})"

declare -xr SCHEDULER_MODULE='slurm'

echo "${UNIX_TIME} ${SLURM_JOB_ID} ${SLURM_JOB_MD5SUM} ${SLURM_JOB_DEPENDENCY}" 
echo ""

cat "${SLURM_JOB_SCRIPT}"

module purge
module load "${SCHEDULER_MODULE}"
module list
#. "${SPACK_INSTANCE_DIR}/share/spack/setup-env.sh"
shopt -s expand_aliases
source activate-shared-spack-instance.sh

declare -xr SPACK_PACKAGE='[email protected]'
declare -xr SPACK_COMPILER='[email protected]'
declare -xr SPACK_VARIANTS='+arpack +dftd3 ~elsi ~gpu +mpi ~sockets ~transport'
declare -xr SPACK_DEPENDENCIES="^[email protected]/$(spack find --format '{hash:7}' [email protected] % ${SPACK_COMPILER}) ^[email protected]/$(spack find --format '{hash:7}' [email protected] % ${SPACK_COMPILER} ^[email protected])"
declare -xr SPACK_SPEC="${SPACK_PACKAGE} % ${SPACK_COMPILER} ${SPACK_VARIANTS} ${SPACK_DEPENDENCIES}"

printenv

spack config get compilers
spack config get config  
spack config get mirrors
spack config get modules
spack config get packages
spack config get repos
spack config get upstreams

time -p spack spec --long --namespaces --types "${SPACK_SPEC}"
if [[ "${?}" -ne 0 ]]; then
  echo 'ERROR: spack concretization failed.'
  exit 1
fi

time -p spack install --jobs "${SLURM_CPUS_PER_TASK}" --fail-fast --yes-to-all "${SPACK_SPEC}"
if [[ "${?}" -ne 0 ]]; then
  echo 'ERROR: spack install failed.'
  exit 1
fi

#spack module lmod refresh --delete-tree -y

#sbatch --dependency="afterok:${SLURM_JOB_ID}" ''

sleep 30
[mkandes@login02 ~]$
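
Assuming the modified script is saved as [email protected], as shown above, it is submitted like any other batch job, with its output landing in the file named by the --output pattern:

sbatch [email protected]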


mkandes commented Oct 6, 2023

Unfortunately, while the supporting dftd3-lib library was built and installed successfully, the build of dftbplus itself failed when the spec build job script was run.

Concretized
--------------------------------
==> Bootstrapping clingo from pre-built binaries
==> Warning: Skipping package at /cm/shared/apps/spack/0.17.3/cpu/b/var/spack/repos/sdsc/packages/amber.configure. "amber.configure" is not a valid Spack module name.
uftonhm  [    ]  [email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" +arpack+dftd3~elsi~gpu+mpi~sockets~transport arch=linux-rocky8-zen2
ccwydfp  [bl  ]      ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~mpi+shared arch=linux-rocky8-zen2
fgk2tlu  [bl  ]          ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~bignuma~consistent_fpcsr~ilp64+locking+pic+shared threads=none arch=linux-rocky8-zen2
3slbnms  [bl  ]      ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  arch=linux-rocky8-zen2
pywku55  [bl  ]      ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~ipo+pic+shared build_type=Release patches=1c9ce5fee1451a08c2de3cc87f446aeda0b818ebbce4ad0d980ddf2f2a0b2dc4,f2baedde688ffe4c20943c334f580eb298e04d6f35c86b90a1f4e8cb7ae344a2 arch=linux-rocky8-zen2
oq3qvsv  [bl  ]          ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java+legacylaunchers+lustre~memchecker+pmi+pmix+romio~rsh~singularity+static+vt+wrapper-rpath cuda_arch=none fabrics=ucx schedulers=slurm arch=linux-rocky8-zen2
7rqkdv4  [bl  ]              ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~cairo~cuda~gl~libudev+libxml2~netloc~nvml~opencl+pci~rocm+shared arch=linux-rocky8-zen2
ykynzrw  [bl  ]                  ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  arch=linux-rocky8-zen2
mgovjpj  [bl  ]                  ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~python arch=linux-rocky8-zen2
zduoj2d  [bl  ]                      ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  libs=shared,static arch=linux-rocky8-zen2
paz7hxz  [bl  ]                      ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~pic libs=shared,static arch=linux-rocky8-zen2
ws4iari  [bl  ]                      ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" +optimize+pic+shared arch=linux-rocky8-zen2
5lhvslt  [bl  ]                  ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~symlinks+termlib abi=none arch=linux-rocky8-zen2
bimlmtn  [bl  ]              ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~openssl arch=linux-rocky8-zen2
fy2cjdg  [bl  ]              ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  arch=linux-rocky8-zen2
ckhyr5e  [bl  ]              ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native"  patches=4e1d78cbbb85de625bad28705e748856033eaafab92a66dffd383a3d7e00cc94,62fc8a8bf7665a60e8f4c93ebbd535647cebf74198f7afafec4c085a8825c006,ff37630df599cfabf0740518b91ec8daaf18e8f288b19adaae5364dc1f6b2296 arch=linux-rocky8-zen2
dpvrfip  [bl  ]              ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~docs+pmi_backwards_compatibility~restful arch=linux-rocky8-zen2
4kvl3fd  [bl  ]              ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~gtk~hdf5~hwloc~mariadb~pmix+readline~restd sysconfdir=PREFIX/etc arch=linux-rocky8-zen2
dnpjjuc  [bl  ]              ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~assertions~cm+cma~cuda+dc~debug+dm~gdrcopy+ib-hw-tm~java~knem~logging+mlx5-dv+optimizations~parameter_checking+pic+rc~rocm+thread_multiple+ud~xpmem cuda_arch=none arch=linux-rocky8-zen2
xjr3cuj  [bl  ]                  ^[email protected]%[email protected] cflags="-O2 -march=native" cxxflags="-O2 -march=native" fflags="-O2 -march=native" ~ipo build_type=RelWithDebInfo arch=linux-rocky8-zen2

real 43.55
user 20.39
sys 1.89
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/openblas-0.3.18-fgk2tlu7bbymzbnkresz4zt5tyfmnbbs
==> Installing dftd3-lib-0.9.2-3slbnms6dkjz42qdo7q55zhi36xcqork
==> No binary for dftd3-lib-0.9.2-3slbnms6dkjz42qdo7q55zhi36xcqork found: installing from source
==> Warning: Skipping package at /cm/shared/apps/spack/0.17.3/cpu/b/var/spack/repos/sdsc/packages/amber.configure. "amber.configure" is not a valid Spack module name.
==> Fetching https://mirror.spack.io/_source-cache/archive/41/4178f3cf2f3e7e982a7084ec66bac92b4fdf164537d9fc0ada840a11b784f0e0.tar.gz
==> No patches needed for dftd3-lib
==> dftd3-lib: Executing phase: 'edit'
==> dftd3-lib: Executing phase: 'build'
==> dftd3-lib: Executing phase: 'install'
==> dftd3-lib: Successfully installed dftd3-lib-0.9.2-3slbnms6dkjz42qdo7q55zhi36xcqork
  Fetch: 0.13s.  Build: 11.21s.  Total: 11.35s.
[+] /home/mkandes/.spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/dftd3-lib-0.9.2-3slbnms6dkjz42qdo7q55zhi36xcqork
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/libpciaccess-0.16-ykynzrw4owqljbwyhyouig4rbf7hxdqz
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/libiconv-1.16-zduoj2duq26hlfta4shqtafuq42gp3rq
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/xz-5.2.5-paz7hxzjnp6khsfch7dm66ytubmw5v5j
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/zlib-1.2.11-ws4iari52j2lphd52i7kd72yj37o32zt
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/ncurses-6.2-5lhvsltgtbpsak4szzorveqvxke32sog
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/libevent-2.1.8-bimlmtn2x74wxpfxjy6yioltrzjdmeio
[+] /usr (external lustre-2.15.2-fy2cjdg3vpb7jgbukx3jn7zdrmv76qug)
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/numactl-2.0.14-ckhyr5ezo2f4eykrtm22265ygawjthpu
[+] /cm/shared/apps/slurm/21.08.8 (external slurm-21.08.8-4kvl3fdg7wli3u5r5yxonwfgfbsy7uzd)
[+] /usr (external rdma-core-43.0-xjr3cujqzl4uw2vb2jt6lst6z3gnjhqo)
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/arpack-ng-3.8.0-ccwydfpt6l4cxgkad2voxxhs5maisgas
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/libxml2-2.9.12-mgovjpj47znwb45ityck57ufrgu57mcc
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/ucx-1.10.1-dnpjjucppo5hjn4wln4bbekczzk7covs
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/hwloc-2.6.0-7rqkdv4vgf63waqaftjer77mqpbwrrok
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/pmix-3.2.1-dpvrfipkueh55vsqz2k6z2bmumrwy4s5
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/openmpi-4.1.3-oq3qvsvt5mywjzy7xzrfeh6eebiujvbm
[+] /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-10.2.0/netlib-scalapack-2.1.0-pywku55redapqs6qxqofuvl67kz5ppks
==> Installing dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk
==> No binary for dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/4d/4d07f5c6102f06999d8cfdb1d17f5b59f9f2b804697f14b3bc562e3ea094b8a8.tar.gz
==> Fetching https://mirror.spack.io/_source-cache/archive/bd/bd191b3d240c1a81a8754a365e53a78b581fc92eb074dd5beb8b56a669a8d3d1.tar.gz
==> Moving resource stage
	source: /home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/resource-slakos-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/
	destination: /home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/external/slakos/testparams-dftbplus-18.2
==> No patches needed for dftbplus
==> dftbplus: Executing phase: 'edit'
==> dftbplus: Executing phase: 'build'
==> Error: ProcessError: Command exited with status 2:
    'make' '-j16'

6 errors found in build log:
     44        FXX="/cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-z
           en2/gcc-10.2.0/openmpi-4.1.3-oq3qvsvt5mywjzy7xzrfeh6eebiujvbm/bin/mp
           if90" FXXOPT="-O2 -funroll-all-loops -fopenmp -fall-intrinsics" \
     45                M4="m4" M4OPT="" SRCDIR="/home/mkandes/.spack/0.17.3/cpu
           /b/var/spack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44o
           kp2sr2njk/spack-src/external/scalapackfx/origin/src"
     46    make -C /home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spack-stag
           e-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/_build/ex
           ternal/mpifx \
     47              -f /home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spack
           -stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/exte
           rnal/mpifx/make.dpbuild \
     48              ROOT=/home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spa
           ck-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src BU
           ILDROOT=/home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spack-stag
           e-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/_build
     49    make[2]: Entering directory '/home/mkandes/.spack/0.17.3/cpu/b/var/s
           pack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2nj
           k/spack-src/_build/external/scalapackfx'
  >> 50    make[2]: /home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spack-sta
           ge-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/external
           /scalapackfx/origin/src/Makefile.lib: No such file or directory
     51    make[1]: Entering directory '/home/mkandes/.spack/0.17.3/cpu/b/var/s
           pack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2nj
           k/spack-src/_build/external/mpifx'
     52    make -C /home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spack-stag
           e-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/_build/ex
           ternal/mpifx -f /home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/sp
           ack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/e
           xternal/mpifx/origin/src/Makefile.lib \
     53        FXX="/cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-z
           en2/gcc-10.2.0/openmpi-4.1.3-oq3qvsvt5mywjzy7xzrfeh6eebiujvbm/bin/mp
           if90" FXXOPT="-O2 -funroll-all-loops -fopenmp -fall-intrinsics" INCL
           UDES="" \
     54                M4="m4" M4OPT="" SRCDIR="/home/mkandes/.spack/0.17.3/cpu
           /b/var/spack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44o
           kp2sr2njk/spack-src/external/mpifx/origin/src"
  >> 55    make[2]: *** No rule to make target '/home/mkandes/.spack/0.17.3/cpu
           /b/var/spack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44o
           kp2sr2njk/spack-src/external/scalapackfx/origin/src/Makefile.lib'.  
           Stop.
     56    make[2]: Leaving directory '/home/mkandes/.spack/0.17.3/cpu/b/var/sp
           ack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk
           /spack-src/_build/external/scalapackfx'
  >> 57    make[1]: *** [/home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spac
           k-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/ext
           ernal/scalapackfx/make.dpbuild:21: libscalapackfx] Error 2
     58    make[1]: Leaving directory '/home/mkandes/.spack/0.17.3/cpu/b/var/sp
           ack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk
           /spack-src/_build/external/scalapackfx'
     59    make: *** [makefile:120: external_scalapackfx] Error 2
     60    make: *** Waiting for unfinished jobs....
     61    make[2]: Entering directory '/home/mkandes/.spack/0.17.3/cpu/b/var/s
           pack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2nj
           k/spack-src/_build/external/mpifx'
  >> 62    make[2]: /home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spack-sta
           ge-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/external
           /mpifx/origin/src/Makefile.lib: No such file or directory
  >> 63    make[2]: *** No rule to make target '/home/mkandes/.spack/0.17.3/cpu
           /b/var/spack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44o
           kp2sr2njk/spack-src/external/mpifx/origin/src/Makefile.lib'.  Stop.
     64    make[2]: Leaving directory '/home/mkandes/.spack/0.17.3/cpu/b/var/sp
           ack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk
           /spack-src/_build/external/mpifx'
  >> 65    make[1]: *** [/home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spac
           k-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/ext
           ernal/mpifx/make.dpbuild:21: libmpifx] Error 2
     66    make[1]: Leaving directory '/home/mkandes/.spack/0.17.3/cpu/b/var/sp
           ack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk
           /spack-src/_build/external/mpifx'
     67    make: *** [makefile:120: external_mpifx] Error 2
     68    /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-1
           0.2.0/openmpi-4.1.3-oq3qvsvt5mywjzy7xzrfeh6eebiujvbm/bin/mpif90 -O2 
           -funroll-all-loops -fopenmp -fall-intrinsics -o converters.o -c /hom
           e/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spack-stage-dftbplus-1
           9.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/external/xmlf90/conve
           rters.f90
     69    /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-1
           0.2.0/openmpi-4.1.3-oq3qvsvt5mywjzy7xzrfeh6eebiujvbm/bin/mpif90 -O2 
           -funroll-all-loops -fopenmp -fall-intrinsics -o wxml_core.o -c /home
           /mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spack-stage-dftbplus-19
           .1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/external/xmlf90/wxml_c
           ore.f90
     70    /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-1
           0.2.0/openmpi-4.1.3-oq3qvsvt5mywjzy7xzrfeh6eebiujvbm/bin/mpif90 -O2 
           -funroll-all-loops -fopenmp -fall-intrinsics -o reader.o -c /home/mk
           andes/.spack/0.17.3/cpu/b/var/spack/stage/spack-stage-dftbplus-19.1-
           uftonhmduenlibag2hyy44okp2sr2njk/spack-src/external/xmlf90/reader.f9
           0
     71    /cm/shared/apps/spack/0.17.3/cpu/b/opt/spack/linux-rocky8-zen2/gcc-1
           0.2.0/openmpi-4.1.3-oq3qvsvt5mywjzy7xzrfeh6eebiujvbm/bin/mpif90 -O2 
           -funroll-all-loops -fopenmp -fall-intrinsics -o dictionary.o -c /hom
           e/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spack-stage-dftbplus-1
           9.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-src/external/xmlf90/dicti
           onary.f90

See build log for details:
  /home/mkandes/.spack/0.17.3/cpu/b/var/spack/stage/spack-stage-dftbplus-19.1-uftonhmduenlibag2hyy44okp2sr2njk/spack-build-out.txt

==> Error: Terminating after first install failure: ProcessError: Command exited with status 2:
    'make' '-j16'
real 169.81
user 25.20
sys 5.29
ERROR: spack install failed.
[mkandes@login02 ~]$
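
When a build fails like this, Spack normally cleans up the staged source tree, and the spack-build-out.txt log is what remains. To poke around in the partially built tree itself, one option (a sketch, not from the original transcript, reusing the SPACK_SPEC definition from the job script) is to re-run the install with the stage kept and then jump into it:

spack install --keep-stage --fail-fast --yes-to-all "${SPACK_SPEC}"
spack cd -s dftbplus    # requires the spack shell function from setup-env.sh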


mkandes commented Aug 13, 2024

Is there anything we're missing when deploying multiple shared instances within a user's HOME directory?

[mkandes_test@login02]~% source activate-shared-spack-instance.sh 
[mkandes_test@login02]~% module avail

---------------------------------- /home/mkandes_test/.spack/0.17.3/gpu/b/share/spack/lmod/linux-rocky8-x86_64 ----------------------------------
   Core/cmake/3.21.4/ooshpqx    nvhpc/21.9/hdf5/1.10.7/g4km7fa    nvhpc/21.9/netcdf-c/4.8.1/3x7nkr7    nvhpc/21.9/netcdf-fortran/4.5.3/vuvf7ah

--------------------------------- /cm/shared/apps/spack/0.17.3/cpu/b/share/spack/lmod/linux-rocky8-x86_64/Core ----------------------------------
   anaconda3/2021.05/q4munrg             gcc/10.2.0/npcyll4        intel/19.1.3.304/6pv46so     pigz/2.6/bgymyil             ucx/1.10.1/wla3unl
   aocc/3.2.0/io3s466                    gh/2.0.0/mkz3uxl          matlab/2022b/lefe4oq         rclone/1.56.2/mldjorr
   aria2/1.35.0/q32jtg2                  git-lfs/2.11.0/kmruniy    mercurial/5.8/qmgrjvl        sratoolkit/2.10.9/rn4humf
   entrezdirect/10.7.20190114/6pkkpx2    git/2.31.1/ldetm5y        parallel/20210922/sqru6rr    subversion/1.14.0/qpzq6zs

------------------------------------------------------------- /cm/local/modulefiles -------------------------------------------------------------
   shared (L)    singularitypro/3.11 (D)    singularitypro/4.1.2    slurm/expanse/23.02.7 (L)

------------------------------------------------------ /cm/shared/apps/access/modulefiles -------------------------------------------------------
   accessusage/0.5-1    cue-login-env

------------------------------------------------------------ /usr/share/modulefiles -------------------------------------------------------------
   DefaultModules (L)    cpu/0.17.3b (c,L,D)    gpu/0.17.3b    (g,D)    nostack/0.17.3b (e,D)
   cpu/0.15.4     (c)    gpu/0.15.4  (g)        nostack/0.15.4 (e)

------------------------------------------------------------ /cm/shared/modulefiles -------------------------------------------------------------
   AMDuProf/3.4.475    default-environment    sdsc/1.0 (L)    slurm/expanse/current    slurm/expanse/23.02.7 (D)

  Where:
   L:  Module is loaded
   c:  built natively for AMD Rome
   e:  not architecture specific
   g:  built natively for Intel Skylake
   D:  Default Module

Module defaults are chosen based on Find First Rules due to Name/Version/Version modules found in the module tree.
See https://lmod.readthedocs.io/en/latest/060_locating.html for details.

Use "module spider" to find all possible modules and extensions.
Use "module keyword key1 key2 ..." to search for all possible modules matching any of the "keys".


[mkandes_test@login02]~%
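
One quick check when multiple instances (and their module trees) are in play is to inspect which directories are on the module search path, one per line, and which modules are currently loaded:

echo "${MODULEPATH}" | tr ':' '\n'
module list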
