
ELM on Mac OSX: environments, building, and running

Fengming Yuan edited this page Jul 7, 2021 · 1 revision

E3SM Land Model (ELM) on MacOS/Linux: Environments, Building, and Running

**** NOTE: Users must install their own GCC compiler, MPICH (or OpenMPI) (https://github.com/fmyuan/E3SM/wiki/GCC-Installation-on-Mac), and HDF5 and NetCDF (https://github.com/fmyuan/E3SM/wiki/HDF5-and-NETCDF4-Installation-on-Mac), and then manually modify the relevant directories below to match their installs. ****

I. Environments

I-1. Compilers

The following are assumed:

GCC_ROOT /usr/local/gcc-5.3.0
MPI_ROOT /usr/local/mpich-3.2-gcc53
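
As a quick sanity check, the assumed toolchain can be verified from a shell. This is a sketch: the GCC_ROOT and MPI_ROOT paths mirror the examples above and must be adjusted to your own installs.

```shell
# Sketch: verify the assumed GCC/MPICH installs are on PATH.
# GCC_ROOT and MPI_ROOT mirror the example paths above -- adjust as needed.
GCC_ROOT=/usr/local/gcc-5.3.0
MPI_ROOT=/usr/local/mpich-3.2-gcc53
export PATH="$MPI_ROOT/bin:$GCC_ROOT/bin:$PATH"

# Each check is guarded so a missing tool is reported rather than fatal:
for tool in gcc gfortran mpicc mpif90; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $(command -v "$tool")"
  else
    echo "MISSING: $tool -- check GCC_ROOT/MPI_ROOT"
  fi
done
```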

CMake is required:

/usr/local/CMake.App/Contents/bin/cmake

GMAKE is needed, as below; alternatively, plain 'make' can be used:

/usr/local/bin/gmake

I-2. NETCDF (and HDF5), built for both C and Fortran.

  • NETCDF4
/usr/local/netcdf-4.4-mpich32-gcc53
  • HDF5
/usr/local/hdf5-1.8.16-mpich32-gcc53
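
The NetCDF install can be checked with the nc-config/nf-config utilities it ships; the paths below mirror the example directories above and are assumptions, not requirements.

```shell
# Sketch: query the assumed NetCDF install with its own config scripts.
# NETCDF_ROOT mirrors the example path above -- adjust to your install.
NETCDF_ROOT=/usr/local/netcdf-4.4-mpich32-gcc53

if [ -x "$NETCDF_ROOT/bin/nc-config" ]; then
  "$NETCDF_ROOT/bin/nc-config" --version   # C library version
  "$NETCDF_ROOT/bin/nf-config" --flibs     # Fortran link flags (used by config_compilers.xml below)
else
  echo "nc-config not found under $NETCDF_ROOT -- check the install"
fi
```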

I-3. ELM codes & Input data

$HOME/mygithub/E3SM

This is a full clone of this repository. The code is almost the same as that on github.com, but includes modifications so it works on a Mac. To create a new case, go to this model directory and pass the '-mach mymac' option (see below).
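
A typical case workflow with the 'mymac' machine entry might look like the following sketch. The case name, resolution, and compset are illustrative placeholders, not values taken from this page; check your E3SM version for the exact create_newcase flag spelling.

```shell
# Sketch: create, set up, build, and run an ELM case on 'mymac'.
# CASE_DIR, --res, and --compset are illustrative placeholders.
E3SM_ROOT=$HOME/mygithub/E3SM
CASE_DIR=$HOME/project_acme/cases/ELM_test

cd "$E3SM_ROOT/cime/scripts" 2>/dev/null || { echo "E3SM clone not found at $E3SM_ROOT"; exit 0; }
./create_newcase --case "$CASE_DIR" --mach mymac --compiler gnu \
                 --res f19_g16 --compset I1850CLM45CN

cd "$CASE_DIR" || exit 0
./case.setup
./case.build
./case.submit    # BATCH_SYSTEM is 'none', so this runs interactively
```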

E3SM machine files for mac (DEMO)

Add the following to config_machines.xml:


<machine MACH="mymac">
    <DESC>Mac OS/X workstation or laptop</DESC>
    <NODENAME_REGEX></NODENAME_REGEX>
    <TESTS>acme_user</TESTS>
    <OS>Darwin</OS>
    <COMPILERS>gnu</COMPILERS>
    <MPILIBS>mpich,mpi-serial</MPILIBS>
    <RUNDIR>$ENV{HOME}/project_acme/scratch/$CASE/run</RUNDIR>
    <EXEROOT>$ENV{HOME}/project_acme/scratch/$CASE/bld</EXEROOT>
    <DIN_LOC_ROOT>$ENV{HOME}/project_acme/cesm-inputdata</DIN_LOC_ROOT>    
    <DIN_LOC_ROOT_CLMFORC>$ENV{HOME}/project_acme/cesm-inputdata/atm/datm7</DIN_LOC_ROOT_CLMFORC>
    <DOUT_S_ROOT>$ENV{HOME}/project_acme/scratch/archive/$CASE</DOUT_S_ROOT>
    <DOUT_L_MSROOT>csm/$CASE</DOUT_L_MSROOT>
    <CESMSCRATCHROOT>$ENV{HOME}/project_acme/scratch</CESMSCRATCHROOT>
    <CCSM_BASELINE>$ENV{HOME}/project_acme/baselines</CCSM_BASELINE>
    <CCSM_CPRNC>$CCSMROOT/tools/cprnc/build/cprnc</CCSM_CPRNC>
    <SUPPORTED_BY>YOUR_NAME</SUPPORTED_BY>
    <GMAKE>gmake</GMAKE>
    <GMAKE_J>1</GMAKE_J>
    <MAX_TASKS_PER_NODE>1</MAX_TASKS_PER_NODE>
    <MAX_MPITASKS_PER_NODE>1</MAX_MPITASKS_PER_NODE>
    <PES_PER_NODE>1</PES_PER_NODE>
    <BATCH_SYSTEM>none</BATCH_SYSTEM>
    <BATCHQUERY></BATCHQUERY>
    <BATCHSUBMIT></BATCHSUBMIT>
    <mpirun mpilib="default">
      <executable compiler="gnu">/usr/local/mpich-3.2-gcc53/bin/mpirun</executable>
      <arguments>
        <arg name="num_tasks"> -np {{ num_tasks }}</arg>
      </arguments>
    </mpirun>
    <module_system type="none"></module_system>   
    <environment_variables>
      <env name="PATH">/usr/local/CMake.App/Contents/bin/:/usr/local/bin/:$PATH</env>   <!--This is for Cmake, needed by PIO -->
      <env name="GCC_ROOT" compiler="gnu">/usr/local/gcc-5.3.0</env>
      <env name="MPI_ROOT" compiler="gnu">/usr/local/mpich-3.2-gcc53</env>
      <env name="HDF5_ROOT" compiler="gnu">/usr/local/hdf5-1.8.16-mpich32-gcc53</env>
      <env name="NETCDF_ROOT" compiler="gnu">/usr/local/netcdf-4.4-mpich32-gcc53</env>
      <env name="PETSC_DIR" compiler="gnu">/usr/local/petsc-mpich32-gcc53</env>      
      <env name="PFLOTRAN_COUPLED_MODEL"></env>
    </environment_variables>       
</machine>

Add the following to config_compilers.xml:

<compiler COMPILER="gnu" MACH="mymac">
  <NETCDF_PATH> $(NETCDF_ROOT)</NETCDF_PATH>
  <ADD_LDFLAGS compile_threaded="true"> -L $(GCC_ROOT)/lib/gcc/x86_64-apple-darwin15.4.0/5.3.0 -fopenmp </ADD_LDFLAGS>
  <ADD_LDFLAGS>-framework Accelerate</ADD_LDFLAGS>
  <ADD_SLIBS>$(shell $(NETCDF_PATH)/bin/nf-config --flibs) -framework Accelerate</ADD_SLIBS>
  <SFC>$(GCC_ROOT)/bin/gfortran</SFC>
  <SCC>$(GCC_ROOT)/bin/gcc</SCC>
  <SCXX>$(GCC_ROOT)/bin/g++</SCXX>
  <MPICC>$(MPI_ROOT)/bin/mpicc</MPICC>
  <MPICXX>$(MPI_ROOT)/bin/mpicxx</MPICXX>
  <MPIFC>$(MPI_ROOT)/bin/mpif90</MPIFC>
  <!-- hacking of mach/compiler generated 'Macros' for using PETSC with pflotran -->
  <!-- ideally it should go with CLM configuration, but it's very general if machine settings are properly done -->
  <ADD_FFLAGS PFLOTRAN="TRUE" MODEL="clm"> -I$(PETSC_DIR)/include -I$(PETSC_DIR)/$(PETSC_ARCH)/include -I$(PFLOTRAN_COUPLED_MODEL)/src/clm-pflotran </ADD_FFLAGS>
  <ADD_CPPDEFS PFLOTRAN="TRUE" MODEL="clm"> -DCLM_PFLOTRAN </ADD_CPPDEFS>
  <ADD_CPPDEFS PFLOTRAN="TRUE" MODEL="clm" COLUMN_MODE="TRUE"> -DCOLUMN_MODE </ADD_CPPDEFS>
  <ADD_LDFLAGS PFLOTRAN="TRUE" MODEL="driver"> -L$(PFLOTRAN_COUPLED_MODEL)/src/clm-pflotran -lpflotran $(PETSC_LIB)</ADD_LDFLAGS>
  <!-- end of hacking 'Macros' for using PETSC with pflotran -->
</compiler>
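
If the PFLOTRAN coupling is used, the environment variables referenced by the Macros entries above must point at real builds. A sketch of the assumed environment follows; both paths are illustrative and must be replaced with your own build locations.

```shell
# Sketch: environment assumed by the PFLOTRAN-related Macros entries above.
# Both paths are illustrative placeholders -- point them at your own builds.
export PETSC_DIR=/usr/local/petsc-mpich32-gcc53
export PFLOTRAN_COUPLED_MODEL=$HOME/models/pflotran-coupled   # hypothetical location

# The driver's ADD_LDFLAGS above expects the coupled library here:
echo "$PFLOTRAN_COUPLED_MODEL/src/clm-pflotran (should contain libpflotran)"
```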

INPUT DATA

$HOME/project_acme/cesm-inputdata/

Users will need, at a minimum, a full copy of the input data required for a single-point simulation.
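
CIME provides a script for reporting and fetching missing input files, run from inside a created case directory. The case path below is a placeholder for illustration.

```shell
# Sketch: check (and optionally download) required input data for a case.
# The case directory is a placeholder -- use your own case path.
cd "$HOME/project_acme/cases/ELM_test" 2>/dev/null || { echo "case directory not found -- create a case first"; exit 0; }
./check_input_data --download   # downloads missing files into DIN_LOC_ROOT
```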