Ultra low resolution configuration for testing #2508
Comments
@danholdaway - Please let me know if ultra-low wave configurations are also required and I'm happy to help provide those parts. (Also fine to turn off waves for various tests too.)
Thank you @JessicaMeixner-NOAA, I think it would be useful to include waves in this effort. We aren't currently testing with waves turned on, but we should start to include that.
@danholdaway Our team will look into atm/ocn/ice components for the test you requested. May I ask some background questions:
Thanks for looking into this @junwang-noaa, we very much appreciate the help.
@danholdaway, we do have a 10deg grid for MOM6. Something would need to be done for CICE6 as well.
We create the CICE6 fix files from the MOM6 supergrid and mask. We'd need a MOM_input consistent w/ a 10deg MOM6 physics.
@danholdaway Thanks for the information.
Let me see what we have. We did that work ages ago and I don't remember if we went beyond being able to use it within soca/jedi.
@DeniseWorthen, @junwang-noaa, here's a link to the jcsda soca repo with hopefully all the needed bits and pieces:
To utilize the cpld_gridgen utility, we need an ocean_hgrid.nc file, a topography file, and a mask file for MOM6. The grid_spec file you point to seems to already be at the target resolution (17x36x35). I see that MOM_input is asking for the super grid file
Yes, sorry, I thought these were in one of the sub-dirs but apparently not. I'll get back to you when I find these files.
@DeniseWorthen, the files are on MSU:
Do you need the MOM.res.nc restart as well?
@yangfanglin do you have a physics package for the ultra-low resolutions (~800km/400km) that we can use to set up the C12/C24 tests with 32 vertical levels? Thanks
I don't need restarts to generate the fix files. Are you setting the land mask = 0 where depth = 0 (unless you have a mask file somewhere)?
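For illustration only, here is one way such a mask could be derived with NCO; the file and variable names are assumptions, not the actual fix-file inputs:

```bash
# Hypothetical sketch: derive a wet/dry mask from the topography, with
# land (depth = 0) set to 0 and ocean (depth > 0) set to 1.
# "depth", "ocean_topog.nc", and "ocean_mask.nc" are placeholder names.
ncap2 -O -s 'wet=(depth > 0)' ocean_topog.nc ocean_mask.nc
```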
Note: UFS_UTILS requires some slight updates to create low-resolution grids. This is being worked on here:
@guillaumevernieres I'm not making sense of the
Establishing a 10-deg ocean resolution is going to play havoc w/ the naming convention and have knock-on impacts from fix file generation down through the regression test scripts and inputs etc. This is because we currently use a 3-character string to denote the ocean/ice resolution: for example, mx050 is 1/2 deg, mx100 is one deg, etc. Creating a 10-deg resolution will require a 4-character string, so the 1/2 deg would need to be mx0050, the 5-deg mx0500, and the 10-deg mx1000. I wonder if a 9-deg ocean/ice resolution would be a better idea. That would be a 40x20 grid (vs 10deg = 36x17) but would avoid the issues with file naming and file regeneration, RT modifications, etc.
Thanks for reporting this @DeniseWorthen. Probably not worth changing the entire naming convention for this, and I think 9 degrees would suffice.
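For illustration, a minimal shell sketch (a hypothetical helper, not the actual fix-file scripts) of how such a resolution tag could be formed, showing why 10 deg breaks the 3-character assumption:

```bash
#!/bin/bash
# Hypothetical sketch: encode the ocean/ice resolution (degrees) as mxNNN,
# i.e. 100 * degrees, zero-padded to (at least) 3 digits.
for deg in 0.5 1 5 9 10; do
  tag=$(printf "mx%03d" "$(echo "$deg * 100 / 1" | bc)")
  echo "$deg deg -> $tag"
done
# 0.5 -> mx050, 1 -> mx100, 5 -> mx500, 9 -> mx900,
# but 10 -> mx1000: four digits, which breaks the 3-character convention.
```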
I've been able to create a 9-deg configuration for ocean and produce the associated CICE grid files using the ufs-utils/cpld_gridgen utility. I'll also document this in the ufs-utils PR I'll open. I have not tried to run this yet, but I'll try w/ a DATM-OCN-ICE configuration next. The ufs-utils repo has some of the required fre-nctools available to build, but not the make_topog tool, which is also required. I found the tools were installed system-wide on Gaea-C5, so I was able to use those to do the following:
Great progress, thank you @DeniseWorthen
Well, by golly, it ran! I completed 24 hours using the DATM coupled to the 9deg MOM6/CICE6: /work2/noaa/stmp/dworthen/CPLD_GRIDGEN/rt_349784/datm.mx900
You rock @DeniseWorthen 🎉 🎉 Thanks for doing this!
The next step is to generate the mapped ocean masks for the new ATM resolutions. These are used by chgres to create the oro-data and ICs. It sounds like you definitely want C12+9deg and C18/24+5deg, but there's some question of what ATM configuration (# levels, physics) will work? George has created the required low-res C-grids, and I've created the mapped ocean masks (e.g.
It seems like the best next step is to try to create the ATM ICs and see if any of them run?
Yeah, my plan is to try using global_hyblev.l28.txt for vertical levels to create ATM ICs from existing chgres_cube regression test input GFS data once we have C12.mx900 orography files. It looks like George was able to create C12mx100 files on Hera (
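For context, a rough sketch of the chgres_cube config namelist pieces involved in that plan; the paths, file names, and input_type below are placeholders, and the variable names should be checked against the UFS_UTILS chgres_cube documentation:

```
&config
  mosaic_file_target_grid = "C12_mosaic.nc"               ! target C12 grid mosaic (placeholder name)
  orog_dir_target_grid    = "./fix/C12.mx900"             ! directory with the new C12.mx900 oro_data files (placeholder)
  orog_files_target_grid  = "C12.mx900_oro_data.tile1.nc" ! one entry per tile; naming assumed
  vcoord_file_target_grid = "global_hyblev.l28.txt"       ! the 28-level vertical coordinate mentioned above
  data_dir_input_grid     = "./input_data"                ! existing chgres_cube regression-test GFS data (placeholder)
  input_type              = "gaussian_nemsio"             ! depends on the format of that input data
  cycle_year = 2021, cycle_mon = 3, cycle_day = 22, cycle_hour = 6
  convert_atm = .true.
  convert_sfc = .true.
/
```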
OK, thanks. After generating the mapped ocean masks, I'm not well-versed on the process, so I won't be much help.
Yes. I built them with the coupled grid files, so they should work.
These are for the GWD scheme, so they should be unnecessary.
@LarissaReames-NOAA Great, that is what I thought, but wanted to ask...
I have an initial cpld case which starts up, but then dies at
so I know I haven't gotten the configuration right yet. My branch is https://github.com/DeniseWorthen/ufs-weather-model/tree/feature/ultralow if anyone wants to take a stab at exporting all the right values.
@LarissaReames-NOAA I've updated my ultralow feature branch to the top of develop. This contains a test. The input data is staged in a special directory on Hera.
Thanks for getting the ICs staged. It looks like the mx900 CICE ICs are missing? I think they should be in /scratch2/NCEPDEV/stmp3/Denise.Worthen/input-data-20240501/CICE_IC/900/2021032206/ but the only thing in there is another directory, '900', containing invalid.nc. Is the RT workflow pointing to the wrong place for your directory structure?
@LarissaReames-NOAA I don't have an IC to use for CICE, so I just put a random file there for now so that the RT scripts would work and we could get it running. In the cpld_control_c12, I set
Okay, I have a working coupled C12 test. I pushed the test (cpld_control_c12) and the settings changes needed to my branch feature/lowres. There's also a lowres_rt.conf to run the test. I created a new namelist, based on the one I used in the original ATM tests, to make this work.
I'd like to try testing out the C24mx500 configuration now that C12mx900 is working. When you're available, @DeniseWorthen can you create those input directories in |
@LarissaReames-NOAA It should just be a matter of updating the scripting, which currently pairs the 5-deg only w/ C48. There are currently 3 tests for the C48mx500 configuration. Are you thinking that all three will be replaced with the C24?
My understanding is that the goal was a C12mx900 for GEFS testing and a C24mx500 for GFS testing, so I was aiming to test that second configuration, not replace any existing regression tests.
I think then I need to understand whether the same warmstart tests will be needed. The original issue is that we were asked to provide a configuration which would produce a MOM6 restart at hour=1 (I believe this was for Marine DA purposes). But because of the lagged startup and the timesteps for MOM6, we can't produce an hour=1 restart for the really low resolution cases. Lagged startup means that for initial runs (i.e., starting from T,S), MOM6 does not advance until the second coupling period. At the second coupling period, it advances two timesteps. After that, all timesteps/coupling are as specified. So since MOM6 doesn't actually advance until hour=2, we can't produce an hour=1 restart. That meant we needed a special warmstart/restart test. Those are set up to actually run from 032306, 24 hours later than all other tests. The input they use is created by running the control test for 24 hours and copying the warmstart files into the input-data directory.
Ah, I see your point, thanks for explaining. Since the goal is to avoid running the C48 tests, but we need these restart files, we have some choices to make. Could we change the time step of MOM6? Or do we need to create that special warmstart/restart test for the other ocean resolutions as well? Does the DA team want these restart files for both resolutions, or would they be okay with just one?
I don't think we want MOM6 to be any slower (i.e., cut the timestep) since right now I have the initial 9deg configuration running with a 1-hour timestep, and it could probably be much larger. I'll just set up the same control/warmstart/restart configurations for C12mx900 and we'll add/subtract the desired configurations before commit.
@DeniseWorthen with 9x9 super-coarse resolution, you can use at least 4 hr as its time step. BTW, where did you get MOM_input?
@jiandewang Understood, but then we can't produce an hour=1 restart at all. EDIT: The 9-deg is the 5-deg w/ the resolution changed and (right now) using a z-initialization. The original 5-deg has a MOM (and CICE) restart generated from the Marine DA side.
@DeniseWorthen thanks for the explanation. So a 1-hr time step for the ocean will be the max; otherwise you can't generate the hour=1 restart file.
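To make the timing concrete, here is a small sketch of the lagged cold start, assuming a 1-hour coupling period and a 1-hour MOM6 timestep (both assumed values, matching the configuration discussed above):

```bash
#!/bin/bash
# Hypothetical timeline sketch: on a lagged cold start MOM6 skips the first
# coupling period and takes two periods' worth of steps at the second one,
# so the earliest advanced ocean state (and useful restart) is at hour 2.
cpl_hr=1   # coupling period in hours (assumed)
dt_hr=1    # MOM6 timestep in hours (assumed)
ocean_hr=0
for cpl in 1 2 3 4; do
  if (( cpl == 1 )); then
    steps=0                            # lagged start: no advance yet
  elif (( cpl == 2 )); then
    steps=$(( 2 * cpl_hr / dt_hr ))    # catch up with two periods of steps
  else
    steps=$(( cpl_hr / dt_hr ))
  fi
  ocean_hr=$(( ocean_hr + steps * dt_hr ))
  echo "model hour $(( cpl * cpl_hr )): ocean advanced to hour $ocean_hr"
done
# model hour 1: ocean still at hour 0 -> no hour=1 restart is possible
```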
@LarissaReames-NOAA I'm looking at the configuration in this directory:
In input.nml and model_configure, I see these settings:
ATM should then need 2x1x6 + 2x1 = 14 tasks, but in ufs.configure you have only 12?
@DeniseWorthen Ah, for that run I'd turned off quilting, hence only 12 PETs used. It looks like I had 14 in the job_card, but I think it ignores extra processes assigned to the job at the top level.
Does quilting=false turn off the write-grid component?
Yes.
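For reference, a sketch of where those counts come from; the file and variable names are the usual ones, but the specific values shown are assumptions for a C12 run with a 2x1 layout and one write group of 2 tasks:

```
# input.nml        &fv_core_nml  layout = 2,1             -> 2*1 ranks/tile * 6 tiles = 12 compute PETs
# model_configure                quilting:              .true.
#                                write_groups:           1
#                                write_tasks_per_group:  2     -> 1*2 = 2 write PETs
# ufs.configure                  ATM_petlist_bounds:     0 13  -> 14 ATM PETs total
#
# With quilting=.false. the write component is not created, so only the 12 compute PETs are used.
```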
@LarissaReames-NOAA I've run into an issue trying w/ your low res branch and mine. My job kept turning up a seg-fault with the error
and I couldn't figure out why you weren't getting the same error. I can see your lowres_rt is trying to do a debug build (
@DeniseWorthen Good catch. I'd forgotten to change the field_table for the coupled regression test. I changed it to
As an aside: something we might consider is modifying the build script to not overwrite -DCMAKE_BUILD_TYPE if -DDEBUG isn't set, since the former is a standard CMake option. Currently, if -DDEBUG isn't set, it defaults to OFF and sets -DCMAKE_BUILD_TYPE=Release even if -DCMAKE_BUILD_TYPE=Debug was already specified.
I'm not sure setting a CMAKE_BUILD_TYPE in the actual rt.conf is expected. Maybe that's applicable for build.sh? @DusanJovic-NOAA would know. It seems the issue w/ CICE is because of land mismatches between ATM/ICE. Looking at the regridStatus field from CMEPS (set write_dstatus = true in the MED attributes), I can see that there are unmapped points (the pink ones), which is probably sending something invalid to the ICE. Let me look at the fractional land mask for ATM. It should be mapping from all points which are <1.
Currently, we only use the DEBUG flag in rt.conf to specify whether to build the executable in debug mode or not. It is there for historical reasons; the same flag was used even before we used cmake as a build system. We could get rid of it and switch to using the cmake-specific values, 'Debug' or 'Release', and set CMAKE_BUILD_TYPE directly in rt.conf, if that's what people prefer.
I don't have any problem with continuing the use of -DDEBUG, but I think having the build system not override a user-provided -DCMAKE_BUILD_TYPE would be a very tiny but useful change. Or am I misunderstanding how it works and the two are not interchangeable?
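As a rough sketch of that idea (hypothetical logic, not the current build.sh), the build script could apply its default only when no build type was supplied:

```bash
#!/bin/bash
# Hypothetical sketch: honor a user-supplied -DCMAKE_BUILD_TYPE and only fall
# back to the DEBUG-driven default when no build type was given explicitly.
CMAKE_FLAGS=${CMAKE_FLAGS:-}

if [[ "${DEBUG:-OFF}" == "ON" ]]; then
  CMAKE_FLAGS+=" -DCMAKE_BUILD_TYPE=Debug"
elif [[ "${CMAKE_FLAGS}" != *"-DCMAKE_BUILD_TYPE="* ]]; then
  # Default to Release only when the user did not already choose a build type.
  CMAKE_FLAGS+=" -DCMAKE_BUILD_TYPE=Release"
fi

echo "cmake ${CMAKE_FLAGS} ..."   # placeholder for the real configure step
```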
@LarissaReames-NOAA We're missing a
@DeniseWorthen I thought it might be something like that. Thanks for debugging. After the discussion on Slack about export_fv3_v16 going away soonish, I changed the RT to use export_fv3 and modified the scheme/test accordingly. The main change was just switching to Thompson MP. I compiled the RT with -DDEBUG=ON and now have a successful C12 run here: I'm guessing that the frac_grid option is probably set correctly in export_fv3.
You'll need a place to slot the variable into the
Description
In order to test changes to the global-workflow system, we have to run the cycled system. Currently this is done with a C48 atmosphere and a 127-level / 5-degree ocean model configuration. However, this is not fast enough for the testing and is limiting the number of pull requests that can be merged into global-workflow. Additionally, the ensemble and deterministic forecasts are both run using C48, meaning we are not replicating the dual-resolution setup that is used in production.
Note that the configuration does not have to produce anything scientifically sound; it just needs to run reliably in order to test the connections between tasks of the workflow.
Solution
Note that the ocean can already be run at 5 degrees.
The data assimilation group at EMC will work on enabling these configurations with GSI and JEDI.