make_ics (chgres_cube) fails with RAP initial conditions #15
Replies: 16 comments · 4 replies
-
Hello @mefrediani Could you let us know the following information (or as much of it as possible) so that we can look into this for you?
While you are waiting for us to get back to you, you could perhaps try HRRR IC data. I have often seen HRRR used for initial conditions and RAP used for boundary conditions with the SRW App. Best,
-
Hi @gspetro-NOAA, thank you for helping me with this issue. I'm running the SRW App on Cheyenne. I compiled the develop branch bd80d942 (First iteration of overhauling WE2E tests (#686)). My shell is bash and the modules loaded are the following:
I'm using a small grid over Colorado, and I also tried the pre-configured grid over Indiana (SUBCONUS_Ind_3km) that comes with the App. The issue occurs in the data pre-processing stage, at task make_ics (when running the executable chgres_cube). The reason I'm not using HRRR data is that it isn't available for the 2012 case we need to run. Indeed, it looks like the LBCs are created without any issues with RAP and NAM. If you have access to glade, you can see my configuration and log files here: Thank you so much!
-
Hi Maria @mefrediani, I remember running into similar issues using RAP data with WRF/WPS and having to supplement with GFS data for the soil state. While chgres_cube doesn't have the ability to specify that individual fields come from different sources like that, you can process the atmosphere and surface data separately, so you could use RAP data for the atmosphere and GFS data for the surface and soil state. The workflow doesn't allow that, but you could certainly do it manually. Does that sound like a workable solution? Larissa
-
Hi Maria, if this solution works for you, you could either run chgres_cube in a standalone fashion, or run the App but manually modify the chgres_cube namelist so that it processes only surface or atmospheric information when run:
convert_atm - Set to 'true' to process atmospheric fields
The NAM grib2 files contain soil information for those cases, so you can use them, but we'd need to help you with the Q2M field. That shouldn't require much modification to the chgres_cube code (we've done it many times before to account for different variable naming conventions in the grib2 files). The easiest solution would be the GFS+RAP combination, though. Also, the 'set_to_fill' option in the VARMAP table won't work for soil variables, since those fields are required by the FV3 dycore at initialization.
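As a sketch of the namelist toggle described above: the `convert_atm`/`convert_sfc` flags are real chgres_cube namelist options, but the fragment below is illustrative only; a real fort.41 also needs the grid, path, and date entries from the App-generated namelist, which are omitted here.

```shell
# Illustrative sketch only: write a fort.41 fragment asking chgres_cube to
# process surface fields and skip atmospheric ones. A real run also needs the
# grid/path/date entries -- copy those from the App-generated fort.41.
cat > fort.41 <<'EOF'
&config
 convert_atm = .false.   ! skip atmospheric fields (take these from RAP separately)
 convert_sfc = .true.    ! process surface/soil fields (e.g., from GFS)
/
EOF
cat fort.41
```

For the atmosphere-only pass, flip the two flags and point the input entries at the RAP file instead.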
-
@LarissaReames-NOAA and @JeffBeck-NOAA thanks for your suggestions! I modified fort.41 to process only surface fields, but when I try to run chgres_cube, it crashes with the same error I got while using RAP:
However, when I use wgrib2 -s to list the file contents, the soil variables are there: If it is helpful, my namelist is here: /glade/work/frediani/Projects/UFS-fire/UFS/esmf-repo/expt_dirs/2012_CO-GFS/2012062518/tmp_ICS Thanks!
-
@LarissaReames-NOAA and @JeffBeck-NOAA, Do you have any suggestions to fix the issue with the GFS data? Thanks
-
@mefrediani @LarissaReames-NOAA @JeffBeck-NOAA
The relevant chgres code is here: sfc_input_data:1849-1859. wgrib2 to the rescue! You can manually change these entries to rename the variable, which also modifies the PDT and discipline values.
Note that the inventory numbers are different. This is because the original inventory file has subfields, e.g.
These get flattened out when running through wgrib2. Someone with a little more expertise may know how to keep them, but I don't think it matters. Just be mindful of this when using the flattened file.
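A concrete sketch of the renaming step, under stated assumptions: `OLDNAME`, `NEWNAME`, and the file names below are hypothetical placeholders, not from this thread; substitute the entries identified in your own `wgrib2 -s` inventory.

```shell
# Sketch: rename a grib2 record with wgrib2 so chgres_cube recognizes it.
# OLDNAME, NEWNAME, and the file names are placeholders.
IN=gfs.sfc.grb2
OUT=gfs.sfc.renamed.grb2
# -if/-fi limit -set_var to the matching records; -grib writes all records out.
CMD="wgrib2 $IN -if ':OLDNAME:' -set_var NEWNAME -fi -grib $OUT"
echo "$CMD"
# Run only where wgrib2 is installed:
if command -v wgrib2 >/dev/null 2>&1; then eval "$CMD"; fi
```

Because `-set_var` rewrites the product definition to match the new name, the PDT/discipline values change along with it, which is what chgres_cube keys on.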
-
@mefrediani For that, you may need to write some code; it is probably easier to hack the Fortran code instead. Looking at lines 5368-5395 of input_data.F90, make the following changes:
That should set all of your snow depths to 0. Something else you may run into is that this old GFS file doesn't have vegetation type, soil type, and a few other parameters that this subroutine looks for. These are all optional, but for a few, chgres needs to be told to assume climatology. It looks like these are climatology by default, but it's possible we may need to set an environment variable in a script if chgres chokes on one of these.
-
Probably obvious, but I should add that after you make the change, you will need to recompile. You can do this with:
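The exact rebuild command was lost from this comment. As a hedged sketch, the command below follows the ufs-srweather-app `devbuild.sh` convention; the checkout path and flags are assumptions, so use whatever build procedure you used originally (e.g., re-running make in your existing build directory to relink chgres_cube).

```shell
# Sketch: rebuild after editing input_data.F90. Script name, path, and flags
# are assumptions from the ufs-srweather-app layout; adjust to your setup.
APP_DIR=$HOME/ufs-srweather-app      # hypothetical checkout path
BUILD_CMD="./devbuild.sh --platform=cheyenne --compiler=intel"
echo "cd $APP_DIR && $BUILD_CMD"
```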
-
@DavidHuber-NOAA @LarissaReames-NOAA @JeffBeck-NOAA Now I have the surface fields from GFS (out.sfc.tile7.nc) and the atmospheric fields from RAP (out.atm.tile7.nc). I also have two files with the same names from each set of ICs: gfs.bndy.nc and gfs_ctrl.nc. What would be the best way to run the forecast from here - should I tweak the workflow or configure something "manually"? How do I configure the model to read the sfc and atm ICs?
-
Hi @DavidHuber-NOAA, @mefrediani, my apologies as I have been out on vacation. I'm really glad to hear you were able to process the RAP and GFS files separately to generate the atmospheric and surface files! In order for this to run in the App, and if you're running a single simulation for now, I would recommend recreating your experiment and simply bypassing the chgres_cube tasks (make_ics and make_lbcs). You would then place files in your YYYYMMDDHH/INPUT directory where you're running the experiment as follows: out.sfc.tile7.nc as sfc_data.nc. Then you'll need to manually boot the run_fcst task (e.g., with rocotoboot) to start the simulation. Hopefully that'll work!
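A minimal sketch of the file placement described above, using a throwaway directory. The experiment path, the atm-side mapping (out.atm.tile7.nc as gfs_data.nc), and the rocotoboot arguments are assumptions; check the file names your own make_ics/make_lbcs tasks would have produced.

```shell
# Sketch only: stage chgres_cube output under INPUT/ with the names the model
# expects. Paths and the atm mapping are assumptions, not from this thread.
EXPTDIR=$(mktemp -d)/2012062518          # hypothetical experiment cycle dir
mkdir -p "$EXPTDIR/INPUT"
touch out.sfc.tile7.nc out.atm.tile7.nc  # stand-ins for the chgres_cube output
ln -sf "$PWD/out.sfc.tile7.nc" "$EXPTDIR/INPUT/sfc_data.nc"
ln -sf "$PWD/out.atm.tile7.nc" "$EXPTDIR/INPUT/gfs_data.nc"   # assumed mapping
ls -l "$EXPTDIR/INPUT"
# Then boot the forecast task (workflow file/cycle names are hypothetical):
# rocotoboot -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -c 201206251800 -t run_fcst
```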
-
@JeffBeck-NOAA the model is crashing with the following message:
I had to make some assumptions for the file names to run it through the SRW App, as the naming conventions were slightly different. I wonder if I made a mistake there or if it's something else. In the INPUT directory I manually created the following symlinks:
and the SRW App created the following:
Basically,
Do you have any thoughts about what could be wrong?
-
@mefrediani, a couple of things: 1) did you confirm that the x,y arrays are the same in the sfc_data.nc file as in the gfs_data.nc file, and that they match what is expected in the domain you're trying to run? 2) GFS data uses Noah (four soil layers). What LSM are you using when running the model? Can you provide your fort.41 namelist file for chgres_cube and your input.nml file for the FV3-LAM? Thanks!
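For check (1), one way to compare the horizontal dimensions of the two IC files is with `ncdump`; this is a sketch that assumes the files sit in the current directory and that the netCDF utilities are on your PATH, and it skips gracefully otherwise.

```shell
# Sketch: compare geolon/geolat dimensions in the two IC files (requires the
# actual gfs_data.nc / sfc_data.nc plus the netCDF 'ncdump' utility).
for f in gfs_data.nc sfc_data.nc; do
  if [ -f "$f" ] && command -v ncdump >/dev/null 2>&1; then
    echo "== $f =="
    ncdump -h "$f" | grep -E 'geolon|geolat'   # print coord variable shapes
  else
    echo "skipping $f (file or ncdump not available here)"
  fi
done
```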
-
Hi @JeffBeck-NOAA,
When I use ncview to investigate the fields, I see that gfs_data.nc and sfc_data.nc each have a geolon and geolat field. The field ranges match, but the dimensions don't: in gfs_data, geolon is 209 x 199, whereas in sfc_data it is 210 x 200. In gfs_data there are also variables called geolon_s and geolat_s (s for surface, perhaps?), but the sizes still don't match: geolon_s and geolat_s are 209 x 200. I'm using the same target grid to create both of these files, so I don't know if there's anything wrong there.
Here's my fort.41 namelist for sfc_data.nc: And for gfs_data.nc (which is generated from RAP for atm levels):
In input.nml, I can see that the LSM is actually RUC, with 9 soil layers. I tried to change lsm, lsoil, and lsoil_lsm to 1, 4, and 4, respectively. I didn't rerun the pre-processing tasks of the workflow, though; I changed the values directly in input.nml and ran the run_fcst_mem000 task with rocotorun. It crashes with
It doesn't seem that I can change lsoil and lsoil_lsm by themselves. I tried to keep lsm = 3 and change lsoil and lsoil_lsm to 4, but it crashes with
Could I change something in fort.41 for chgres_cube to write 9 soil layers? It doesn't really matter what LSM this simulation runs with. My original input.nml file is attached (I thought it was too long to paste here). Thanks!
-
Hi @mefrediani, the easiest option is to rerun the model with Noah LSM. Remove the kice namelist entry, set lsm=1, and set lsoil and lsoil_lsm to 4. Then give that a try.
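As a sketch, the edits above might look like the fragment below. The `&gfs_physics_nml` group name is where these entries usually live in an FV3 input.nml, but verify against your own file (and remember to delete the existing `kice` line rather than just commenting it).

```shell
# Sketch only: the Noah LSM settings suggested above, written to a scratch
# file for illustration. Confirm the namelist group against your input.nml.
cat > noah_lsm_fragment.txt <<'EOF'
&gfs_physics_nml
  lsm       = 1    ! Noah LSM
  lsoil     = 4    ! four soil layers
  lsoil_lsm = 4
/
EOF
cat noah_lsm_fragment.txt
```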
-
@mefrediani, are you using a new CCPP suite definition file (SDF) that has Noah in it? If not, you might want to recreate an experiment that uses GFS physics (for example).
-
I'm trying to run a 2012 case using RAP data but the workflow crashes during make_ics because it doesn't find soil data. I also tried it with a random date in 2016 to see if it was a RAP version issue but it still crashes.
The message is this:
I downloaded the RAP data from https://www.ncei.noaa.gov/has/HAS.FileAppRouter?datasetname=RAP130&subqueryby=STATION&applname=&outdest=FILE
which is the only place I found files for 2012-06-25 (rap_130_20160625_0000_*.grb2)
I also tried to change the parameter for soilw and soilt in GSDphys_var_map.txt to "set_to_fill" but the code still tried to read the number of soil levels, so it still crashed.
I found a related post in the old forum (https://forums.ufscommunity.org/threads/using-operational-hrrr-data-initialize-ufs) but I don't think I can find a different RAP product to simulate this case.
I'm wondering if there's a way I could use different data sources, similar to what is done in WRF (WPS, ungrib), where we can process ICs from multiple sources when the primary source doesn't have all the fields. If not, what are my options? I don't need to use a specific LSM scheme, but having soil levels is important for what I'm trying to do.
My constraint is the simulation resolution. I need to run UFS at a resolution of ~ 3km and that's why I'm using RAP.
Any suggestion would be appreciated. Thanks!
Update: I also tried to simulate this case using NAM but it fails when reading Q2M.
I'm developing a component for UFS and we have this case specified in our proposal. I appreciate any sort of help on this. Thanks!