UFS/dev PR#216 #527

Merged 28 commits on Oct 21, 2024
Commits
6792ae3
point to develop branch of ccpp-framework
grantfirl Aug 20, 2024
1a3bb72
fix chap_quick.rst that was not merged correctly from previous PR
grantfirl Aug 27, 2024
1d2f8c2
add info regarding the save_comp option in UFS_case_gen.py
grantfirl Aug 28, 2024
6ebd581
update chap_cases.rst
grantfirl Aug 28, 2024
a302767
add optional arguments for UFS_case_gen examples
grantfirl Aug 28, 2024
2ef6259
fix formatting error
grantfirl Aug 28, 2024
1e402c7
fix formatting error
grantfirl Aug 28, 2024
923d032
update ccpp/physics submodule pointer
grantfirl Sep 5, 2024
75f63b8
Merge branch 'release/public-v7' into update_ug_gjf
grantfirl Sep 5, 2024
02d490a
Add yaml script for python environment used in online tutorial.
dustinswales Sep 5, 2024
361ab11
Renamed to match online tutorial
dustinswales Sep 5, 2024
3197aee
Merge pull request #507 from grantfirl/update_ug_gjf
grantfirl Sep 5, 2024
999ab88
update twpice_all_suites.ini to use supported suites for v7
grantfirl Sep 5, 2024
c85a9af
Merge pull request #509 from dustinswales/hotfix/py_env_for_tutorial
grantfirl Sep 5, 2024
5f747c9
Adding documentation on run_scm.py --mpi_command argument
scrasmussen Sep 6, 2024
b044346
Fix to Derecho script so it will run run_scm.py in the directory the …
scrasmussen Sep 6, 2024
8cb6603
additions to tutorial yaml
Sep 7, 2024
3119089
change conversion_factors
grantfirl Sep 9, 2024
b245602
Merge pull request #510 from grantfirl/tut_fix_20240905
grantfirl Sep 9, 2024
d31b8f7
Merge pull request #512 from scrasmussen/release/public-v7-mpi_comman…
grantfirl Sep 9, 2024
bff5e62
Merge pull request #513 from hertneky/yaml_fix
grantfirl Sep 9, 2024
5542079
update contrib scripts to grab v7.0.0 release assets
grantfirl Sep 9, 2024
d939f51
Update Dockerfile to use release branch
mkavulich Sep 9, 2024
fa957f1
Merge branch 'release/public-v7' into ufs-dev-PR216
grantfirl Oct 10, 2024
ed58eba
use main branch of ccpp-framework
grantfirl Oct 10, 2024
7f0ff67
use PR branch of ccpp/physics
grantfirl Oct 10, 2024
ff1737f
add option in RT comparison script to turn off plots (only report dif…
grantfirl Oct 15, 2024
9352a5c
update ccpp/physics after merge
grantfirl Oct 21, 2024
Files changed
2 changes: 1 addition & 1 deletion ccpp/physics
Submodule physics updated 173 files
2 changes: 1 addition & 1 deletion contrib/get_aerosol_climo.sh
@@ -20,7 +20,7 @@ data_files=("FV3_aeroclim1" "FV3_aeroclim2" "FV3_aeroclim3" "FV3_aeroclim_optics
cd $BASEDIR/scm/data/physics_input_data/
for file in "${data_files[@]}"; do
echo "Retrieving $file.tar.gz"
wget https://github.com/NCAR/ccpp-scm/releases/download/v7.0.0-beta/${file}.tar.gz
wget https://github.com/NCAR/ccpp-scm/releases/download/v7.0.0/${file}.tar.gz
tar -xvf ${file}.tar.gz
rm -f ${file}.tar.gz
done
2 changes: 1 addition & 1 deletion contrib/get_all_static_data.sh
@@ -21,7 +21,7 @@ for file in "${data_files[@]}"; do
mkdir -p $BASEDIR/scm/data/$file
cd $BASEDIR/scm/data/$file
echo "Retrieving $file"
wget https://github.com/NCAR/ccpp-scm/releases/download/v7.0.0-beta/${file}.tar.gz
wget https://github.com/NCAR/ccpp-scm/releases/download/v7.0.0/${file}.tar.gz
tar -xf ${file}.tar.gz
rm -f ${file}.tar.gz
done
2 changes: 1 addition & 1 deletion contrib/get_mg_inccn_data.sh
@@ -16,7 +16,7 @@ BASEDIR=$MYDIR/..

# Change to directory containing the physics input data, download and extract archive
cd $BASEDIR/scm/data/physics_input_data/
wget https://github.com/NCAR/ccpp-scm/releases/download/v7.0.0-beta/MG_INCCN_data.tar.gz
wget https://github.com/NCAR/ccpp-scm/releases/download/v7.0.0/MG_INCCN_data.tar.gz
tar -xvf MG_INCCN_data.tar.gz
rm -f MG_INCCN_data.tar.gz
cd $BASEDIR/
2 changes: 1 addition & 1 deletion contrib/get_thompson_tables.sh
@@ -15,7 +15,7 @@ BASEDIR=$MYDIR/..

# Change to directory containing the physics input data, download and extract archive
cd $BASEDIR/scm/data/physics_input_data/
wget https://github.com/NCAR/ccpp-scm/releases/download/v7.0.0-beta/thompson_tables.tar.gz
wget https://github.com/NCAR/ccpp-scm/releases/download/v7.0.0/thompson_tables.tar.gz
tar -xvf thompson_tables.tar.gz
rm -f thompson_tables.tar.gz
cd $BASEDIR/
2 changes: 1 addition & 1 deletion docker/Dockerfile
@@ -74,7 +74,7 @@ ENV w3emc_ROOT=/comsoftware/nceplibs
# Obtain CCPP SCM source code, build code, and download static data
RUN if [ -z "$PR_NUMBER" ]; then \
cd /comsoftware \
&& git clone --recursive -b main https://github.com/NCAR/ccpp-scm; \
&& git clone --recursive -b release/public-v7 https://github.com/NCAR/ccpp-scm; \
else \
cd /comsoftware \
&& git clone https://github.com/NCAR/ccpp-scm \
9 changes: 9 additions & 0 deletions environment-scm_analysis.yml
@@ -0,0 +1,9 @@
name: env_scm_analysis

dependencies:
- conda-forge::python=3.7
- conda-forge::netcdf4
- conda-forge::f90nml
- conda-forge::configobj
- conda-forge::matplotlib
- conda-forge::pandas
36 changes: 31 additions & 5 deletions scm/doc/TechGuide/chap_cases.rst
@@ -297,6 +297,20 @@ Activate environment:
.. code:: bash

> conda activate env_ufscasegen

Note that conda may fail to solve the environment when using the yml file. If that happens, it
should still be possible to create the same environment manually:

.. code:: bash

> conda create --name env_ufscasegen
> conda install -n env_ufscasegen --channel=conda-forge python=3.8.5
> conda install -n env_ufscasegen --channel=conda-forge netcdf4
> conda install -n env_ufscasegen --channel=conda-forge f90nml
> conda install -n env_ufscasegen --channel=conda-forge xarray
> conda install -n env_ufscasegen --channel=conda-forge numpy
> conda install -n env_ufscasegen --channel=conda-forge shapely
> conda install -n env_ufscasegen --channel=conda-forge xesmf

.. _`ufscasegen`:

@@ -422,13 +436,24 @@ appreciably different than the calculated geostrophic winds), this often leads t
with time. An option exists within the script to assume that the mean three-dimensional winds are, in fact, identical to the
geostrophic winds as well. Using this option eliminates any spurious turning.

Writing UFS Comparison Data
~~~~~~~~~~~~~~~~~~~~~~~~~~~

The ``--save_comp`` (or ``-sc``) option writes out the UFS data for the chosen column in NetCDF format. Profiles of the state variables
``u``, ``v``, ``T``, and ``q_v`` are written out for the given point at each history file time, along with a collection of other
diagnostics such as physics tendency profiles and scalar surface variables. Any variable provided in the UFS history files can be
included, although the specific variables are hard-coded in ``UFS_case_gen.py`` and changing them requires editing that script.
The comparison data file is automatically written to the ``scm/data/comparison_data`` directory; this location is controlled by the
``COMPARISON_DATA_DIR`` global variable in ``UFS_case_gen.py``. The filename is a concatenation of the case name (specified by the
``--case_name (-n)`` argument) and ``_comp_data.nc``.
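
As a quick sanity check, the contents of a comparison file can be inspected with standard netCDF command-line
tools. A minimal sketch, assuming the default output directory and a hypothetical case named ``control_c192``:

.. code:: bash

   # List the dimensions and variables written by UFS_case_gen.py --save_comp;
   # the case name "control_c192" is only an example
   ncdump -h scm/data/comparison_data/control_c192_comp_data.nc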

.. _`ufsforcingensemblegenerator`:

UFS_forcing_ensemble_generator.py
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

There is an additional script in ``scm/etc/scripts/UFS_forcing_ensemble_generator.py`` to create UFS-caseGen case(s) starting
with output from UFS Weather Model (UWM) Regression Tests (RTs).
with output from UFS Weather Model (UWM) Regression Tests (RTs). This script provides a wrapper for ``UFS_case_gen.py`` for
generating multiple cases at once.

.. code:: bash

Expand Down Expand Up @@ -489,11 +514,12 @@ staged UWM RTs located at:
Example 1: UFS-caseGen for single point
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

UFS regression test, ``control_c192``, for single point.
UFS regression test, ``control_c192``, for a single point using calculated horizontal advective tendencies,
supplying the vertical velocity for the vertical advective terms and nudging the horizontal winds:

.. code:: bash

./UFS_forcing_ensemble_generator.py -d [path_to_regression_tests_output]/control_c192_intel/ -sc --C_RES 192 -dt 360 -n control_c192 -lons 300 -lats 34
./UFS_forcing_ensemble_generator.py -d [path_to_regression_tests_output]/control_c192_intel/ -sc --C_RES 192 -dt 360 -n control_c192 -lons 300 -lats 34 -fm 2 -vm 2 -wn

Upon successful completion of the script, the command to run the case(s)
will print to the screen. For example,
@@ -509,11 +535,11 @@ The file ``scm_ufsens_control_c192.py`` is created in ``ccpp-scm/scm/bin/``, whe
Example 2: UFS-caseGen for list of points
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

UFS regression test, ``control_c384``, for multiple points.
UFS regression test, ``control_c384``, for multiple points assuming the same forcing method as above:

.. code:: bash

./UFS_forcing_ensemble_generator.py -d /glade/derecho/scratch/epicufsrt/ufs-weather-model/RT/NEMSfv3gfs/develop-20240607/control_c384_intel/ -sc --C_RES 384 -dt 225 -n control_c384 -lons 300 300 300 300 -lats 34 35 35 37
./UFS_forcing_ensemble_generator.py -d /glade/derecho/scratch/epicufsrt/ufs-weather-model/RT/NEMSfv3gfs/develop-20240607/control_c384_intel/ -sc --C_RES 384 -dt 225 -n control_c384 -lons 300 300 300 300 -lats 34 35 35 37 -fm 2 -vm 2 -wn

Upon successful completion of the script, the command to run the case(s)
will print to the screen. For example,
20 changes: 7 additions & 13 deletions scm/doc/TechGuide/chap_quick.rst
@@ -216,13 +216,6 @@ Unified Forecast System and related applications, only a minority of which are r
install libraries manually if they wish, but they will need to make sure the appropriate environment variables
are set to the correct values so that the build system can find them, as described in the following paragraphs.


<<<<<<< HEAD
Setting up compilation environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

For users on a pre-configured platform, the spack-stack environment can be loaded via one of the provided modules in ``scm/etc/modules/`` as described in :numref:`Section %s <use_preconfigured_platforms>`.
=======
For users on a pre-configured platform, you can load the spack-stack environment via one of the provided modules in ``scm/etc/modules/``.
For example, users on the NSF NCAR machine Derecho who wish to use Intel compilers can do the following:

@@ -231,7 +224,6 @@ For example, users on the NSF NCAR machine Derecho who wish to use Intel compile
cd [path/to/ccpp-scm/]
module use scm/etc/modules/
module load derecho_intel
>>>>>>> feature/modulefile_updates

Additionally, users who have installed spack-stack on their own macOS or Linux machine can use the provided ``macos_clang``
or ``linux_gnu`` modules.
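
A minimal sketch of the analogous steps on a personal machine, assuming spack-stack has already been installed and
following the same pattern as the Derecho example above:

.. code:: bash

   cd [path/to/ccpp-scm/]
   module use scm/etc/modules/
   # choose the module matching your platform
   module load macos_clang    # or: module load linux_gnu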
@@ -602,6 +594,12 @@ If using the main branch, you should run the above command to ensure you have th
- Use this to specify the timestep to use (if different than the
default specified in ``../src/suite_info.py``).

- ``--mpi_command``

- Use this to specify the MPI command that will be invoked (the default is ``mpirun -np 1``).
Note that running on a Derecho login node requires the empty argument ``--mpi_command ''``.
A short example is given after this option list.

- ``--verbose [-v]``

- Use this option to see additional debugging output from the run
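
A minimal example of overriding the MPI launcher for the ``twpice`` case (on a Derecho login node the launcher can be
suppressed entirely):

.. code:: bash

   # explicit MPI command
   ./run_scm.py -c twpice --mpi_command 'mpirun -np 1'
   # Derecho login node: pass an empty MPI command
   ./run_scm.py -c twpice --mpi_command ''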
@@ -613,11 +611,7 @@ configuration files located in ``../etc/case_config`` (*without the .nml extensi
specifying a suite other than the default, the suite name used must
match the value of the suite name in one of the suite definition files
located in ``../../ccpp/suites`` (Note: not the filename of the suite definition file). As
<<<<<<< HEAD
part of the CCPP SCM v7.0.0 release, the following suite names are supported:
=======
part of the seventh CCPP release, the following suite names are supported:
>>>>>>> feature/modulefile_updates

#. SCM_GFS_v16

@@ -888,7 +882,7 @@ Running the Docker image

#. To run the SCM, you can run the Docker container that was just
created and give it the same run commands as discussed in :numref:`Section %s <singlerunscript>`
**Be sure to remember to include the ``-d`` and ``--mpi_command "mpirun -np 1 --allow-run-as-root"``
options for all run commands**. For example,

.. code:: bash
2 changes: 1 addition & 1 deletion scm/etc/scm_qsub_example.py
@@ -34,7 +34,7 @@
WALLTIME = "walltime=00:20:00"
PROCESSORS = "select=1:ncpus=1"
QUEUE = "develop"
COMMAND = "./run_scm.py -c twpice"
COMMAND = "cd $PBS_O_WORKDIR; ./run_scm.py -c twpice"
EMAIL_ADDR = MY_EMAIL
SERIAL_MEM = "512M"
WORKING_DIR = os.path.dirname(os.path.abspath(__file__))
13 changes: 8 additions & 5 deletions scm/etc/scripts/plot_configs/twpice_all_suites.ini
@@ -1,5 +1,5 @@
scm_datasets = output_twpice_SCM_GFS_v16/output.nc, output_twpice_SCM_GFS_v17_p8/output.nc, output_twpice_SCM_RAP/output.nc, output_twpice_SCM_RRFS_v1beta/output.nc, output_twpice_SCM_WoFS_v0/output.nc, output_twpice_SCM_HRRR/output.nc
scm_datasets_labels = GFSv16, GFSv17p8, RAP, RRFSv1b, WoFSv0, HRRR
scm_datasets = output_twpice_SCM_GFS_v16/output.nc, output_twpice_SCM_GFS_v16_RRTMGP/output.nc, output_twpice_SCM_GFS_v17_p8_ugwpv1/output.nc, output_twpice_SCM_WoFS_v0/output.nc, output_twpice_SCM_HRRR_gf/output.nc
scm_datasets_labels = GFSv16, GFSv16-GP, GFSv17p8-ugwpv1, WoFSv0, HRRR-gf
plot_dir = plots_twpice_all_suites/
obs_file = ../data/raw_case_input/twp180iopsndgvarana_v2.1_C3.c1.20060117.000000.cdf
obs_compare = True
@@ -23,24 +23,27 @@ time_series_resample = True
y_log = False
y_min_option = min #min, max, val (if val, add y_min = float value)
y_max_option = max #min, max, val (if val, add y_max = float value)
conversion_factor = 1000.0, 1000.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0
conversion_factor = 1000.0, 1000.0, 1.0, 86400.0, 86400.0, 86400.0, 86400.0, 86400.0

[[profiles_mean_multi]]
[[[T_forcing]]]
vars = T_force_tend, dT_dt_pbl, dT_dt_conv, dT_dt_micro, dT_dt_lwrad, dT_dt_swrad
vars_labels = 'force', 'PBL', 'Conv', 'MP', 'LW', 'SW'
x_label = 'K/day'
conversion_factor = 86400.0
[[[conv_tendencies]]]
vars = dT_dt_deepconv, dT_dt_shalconv
vars_labels = 'deep', 'shallow'
x_label = 'K/day'
conversion_factor = 86400.0

[[profiles_instant]]

[[time_series]]
vars = 'pres_s','lhf','shf','tprcp_rate_inst'
vars = 'pres_s','lhf','shf','tprcp_rate_accum'
vars_labels = 'surface pressure (Pa)','latent heat flux ($W$ $m^{-2}$)','sensible heat flux ($W$ $m^{-2}$)','surface rainfall rate ($mm$ $hr^{-1}$)'

conversion_factor = 1.0, 1.0, 1.0, 3600000.0

[[contours]]
vars = qv,
vars_labels = 'Water Vapor ($g$ $kg^{-1}$)',
20 changes: 13 additions & 7 deletions test/cmp_rt2bl.py
@@ -14,20 +14,22 @@

#
parser = argparse.ArgumentParser()
parser.add_argument('-drt', '--dir_rt', help='Directory containing SCM RT output', required=True)
parser.add_argument('-dbl', '--dir_bl', help='Directory containing SCM RT baselines', required=True)
parser.add_argument('-drt', '--dir_rt', help='Directory containing SCM RT output', required=True)
parser.add_argument('-dbl', '--dir_bl', help='Directory containing SCM RT baselines', required=True)
parser.add_argument('-np', '--no_plots', help='flag to turn off generation of difference plots', required=False, action='store_true')

#
def parse_args():
args = parser.parse_args()
dir_rt = args.dir_rt
dir_bl = args.dir_bl
return (dir_rt, dir_bl)
no_plots = args.no_plots
return (dir_rt, dir_bl, no_plots)

#
def main():
#
(dir_rt, dir_bl) = parse_args()
(dir_rt, dir_bl, no_plots) = parse_args()

#
error_count = 0
@@ -38,14 +40,17 @@ def main():
com = "cmp "+file_rt+" "+file_bl+" > logfile.txt"
result = os.system(com)
if (result != 0):
print("Output for "+run["case"]+"_"+run["suite"]+ " DIFFERS from baseline. Difference plots will be created")
message = "Output for "+run["case"]+"_"+run["suite"]+ " DIFFERS from baseline."
if (not no_plots):
message += " Difference plots will be created."
print(message)
error_count = error_count + 1
else:
print("Output for "+run["case"]+"_"+run["suite"]+ " is IDENTICAL to baseline")
# end if

# Create plots between RTs and baselines (only if differences exist)
if (result != 0):
if (result != 0 and not no_plots):
plot_files = plot_results(file_bl, file_rt)

# Setup output directories for plots.
@@ -71,7 +76,8 @@ def main():
# end for

# Create tarball with plots.
result = os.system('tar -cvf scm_rt_out.tar scm_rt_out/*')
if (not no_plots):
result = os.system('tar -cvf scm_rt_out.tar scm_rt_out/*')

#
if error_count == 0:
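
The new ``--no_plots`` switch can be exercised directly from the command line. A minimal sketch, with placeholder
paths for the RT output and baseline directories:

.. code:: bash

   # Compare SCM RT output against baselines, reporting differences only
   # and skipping both the difference plots and the plot tarball
   ./cmp_rt2bl.py --dir_rt /path/to/rt_output --dir_bl /path/to/baselines --no_plots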