Airs converter docs #639
Conversation
A new perfect model obs experimental capability is added to HydroDART. The python and shell scripts use a user-defined truth run to sample pseudo-observations and run OSSEs. The following scripts have been tested in regional hydroDART runs such as the DRB domain.
1. gen_truth.sh: Starting from a list of wrf-hydro restarts (the truth), strip out streamflow information at all available routelink gauges or at a subset of gauges specified by the user. The resulting netcdf files are stored in a new directory.
2. gauges.sh: A script to handle the gauges in the domain. It can be used to identify a "desired" subset of gauges or to retrieve all gauges available in wrf_hydro's Routelink file.
3. pmo_osse.py: Iterates over the truth files obtained by running 'gen_truth.sh' and builds a set of obs sequence files. These are then used by the hydroDART framework to conduct an OSSE. Credit to Ben Johnson, who built initial versions of this script.
PS. This could have been done by building a wrapper around DART's own perfect_model_obs routine. The reasons why we went this route are unknown at this time.
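As a purely illustrative sketch, the three scripts might be chained like the lines below; the arguments, flags, and file names are placeholders, not the scripts' actual interfaces (check each script's header for real usage):
./gauges.sh Route_Link.nc > desired_gauges.txt                    # placeholder args: list all gauges or a desired subset
./gen_truth.sh truth_restarts.list desired_gauges.txt truth/      # placeholder args: strip streamflow from the truth restarts
python pmo_osse.py --truth-dir truth/ --obs-dir obs_seq/          # placeholder flags: build obs_seq files for the OSSE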
The streamflow obs converter is modified to allow for better diagnostics. The changes allow DART to compute obs space diagnostics on all gauges from the Routelink (not only the ones specified by the user). This essentially mimics evaluate_only functionality for non-identity obs. There is also a change to check for bad (inf/nan) streamflow obs. A final change allows processing a large number of obs-seq files at the same time. The python scripts include bug fixes for file movement. They also allow calling the converter from the yaml file using fat memory nodes.
Changes to the model_mod and noah_hydro_mod enhance performance when running a full CONUS domain. Various parts of the noah_hydro_mod code are rearranged to avoid extra computations that are not needed when the LSM is turned off. Also, the link tree structure is modified so that only the individual upstream links from any reach are stored. We also limit the number of upstream links to 5. This helps alleviate issues in the Routelink file (which can make the stream network unphysical). This change makes our Along-The-Stream (ATS) localization runs a lot faster. Changes in the model_mod make the code faster when distributing the close-by reaches to each individual task. There is also a change that allows reading climatology files when a hybrid EnKF variant is invoked.
Modifications to hydroDART's diagnostic routine now allow the following:
1. Save the hydrographs in a high-resolution pdf
2. Handle hybrid DA components (weight mean and sd)
3. Better parsing of the netcdf files
4. Allow the openloop to have a different ensemble size (compared to the regular DA runs)
5. Better time handling
6. Collect more information about the openloop run
7. The openloop may have different gauges than the DA runs
8. Better usage of matlab functions (removing obsolete ones)
9. Separate plots for the hybrid statistics
Build instructions are available from hdfeos.org
The HPSS was decommissioned in 2021
Only the two options are needed for the converter. rttov12 and rttov13 have different namelist options, so running rttov13 with the whole rttov12 namelist causes an error when reading the rttov namelist.
Removed redundant documentation that was duplicated between the readme and the individual converter docs. Removed the various HDF5 vs HDF4 discussion and references to using a DART-provided script to build hdfeos (get instructions directly from hdfeos)
and removed a dangling heading
Todo: add fix for #609 to this pull request.
Note to reviewers, the documentation build for this pull request is available here:
@braczka can you take a look at this pull request and see if it makes sense for a user. Cheers,
non-working code.
@hkershaw-brown Looking through this a bit today -- for reference, could you provide the AMSU/AIRS data file(s) you used for obs converter testing? I could download them myself, but the link at least for AMSU is broken, and I want to make sure I am looking at the correct data files.
Hi Brett, this is a good review comment. We need to point to where the data is, this is the broken link, right? Here is some data Nick was using (original reporter of the hdfeos library-dart bug):
Thanks Helen -- just some clarification on the review -- did you want me to stick with the software side only, i.e. make sure I can build and use the converters without any issues? The existing science/data documentation (i.e. discussion of satellites, L1-L3 products) that we inherited from the existing documentation could be streamlined and updated, but that would not necessarily have any bearing on the software functionality.
Hi Brett, definitely the documentation, that is what this pull request is about. https://dart-documentation--639.org.readthedocs.build/en/639/observations/obs_converters/AIRS/README.html
@hkershaw-brown I committed a bunch of docs changes already to this PR. Sorry -- didn't mean to be so heavy-handed. I actually meant to request permission before pushing those commits. I think you will be OK with them, as they are primarily related to scientific documentation only, not software. I am having trouble with the hdf4 to netcdf conversion step for the convert_amsu_L1 converter. See my comment there -- otherwise things look good.
2. Compile the h4tonccf_nc4 tool following the instructions provided on the hdf-eos site. Compiling requires both HDF4 and HDF-EOS2 libraries. For Derecho the HDF4 libraries are accessed through the ``module load hdf`` command. HDF-EOS2 libraries are provided at ``/glade/campaign/cisl/dares/libraries/``.
I need more guidance on building the h4tonccf_nc4 tool to perform the hdf-eos to netcdf conversion step. On the website, the binary is provided for the Linux CentOS7 and macOS operating systems, but I am not sure whether this is appropriate for Derecho with SUSE. It also requires an additional configuration file to build on the autoconf system, so I want to make sure I am headed down the correct path.
I downloaded the CentOS7 binary,
wget https://hdfeos.org/software/h4cflib/bin/linux/v1.3/CentOS7/h4tonccf_nc4
Ran it on Derecho, seems ok:
chmod +x h4tonccf_nc4
./h4tonccf_nc4
Usage: h4tonccf [hdf file name] or h4tonccf [hdf file name] [netcdf file name]
Note: If the netcdf file name is not specified, the hdf file name will be used instead with .nc as extension.
./h4tonccf_nc4 AIRS.2022.01.01.120.L2.RetStd_IR.v6.0.33.0.G22235231529.hdf
Done with writing netcdf file AIRS.2022.01.01.120.L2.RetStd_IR.v6.0.33.0.G22235231529.nc.
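If there are many granules to convert, looping over them with the same one-argument usage shown above should work (the glob pattern is just illustrative):
for f in AIRS.*.hdf; do ./h4tonccf_nc4 "$f"; done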
Sorry, I confused myself by knowing just the wrong amount about the connection between AIRS and RTTOV.
Co-authored-by: Brett Raczka <[email protected]>
Co-authored-by: Brett Raczka <[email protected]>
Co-authored-by: Brett Raczka <[email protected]>
Co-authored-by: Brett Raczka <[email protected]>
@mjs2369 Do you know why the doc build failed for a few of the latest syntax fixes?
It's because I triggered the builds all at once by pushing the commits in succession in 10 seconds or less. So the doc builds for the first three commits didn't have time to finish building and were cancelled when the next one was triggered. If you look at the details of the checks for the docs, it shows the error message that says 'The user has cancelled this build.' So anytime you push multiple commits within the time it takes for the docs to finish building (usually 1-2 mins), this will happen and it is nothing to worry about. That is also why the last commit in that series had all checks pass successfully. You can always see the details of the actions checks by clicking on the little checkmark or red x next to the commit and then clicking Details on the check you're interested in. In this case, doing so for the failed doc builds should take you to a page like this: https://readthedocs.org/projects/dart-documentation/builds/23879246/
OK -- sounds like it's pretty normal, although when I do check out the details all it says is that the user cancelled the build, which is misleading. The only reason I ask is that @kdraeder is reviewing right now, and it could be a little confusing in that the latest doc build won't include those syntax changes (but they will be included in the file changes).
I can understand how it might be a little misleading since I didn't directly 'cancel' the builds. For ReadTheDocs, pushing another commit before the build finishes effectively cancels it. The latest doc build (which finished successfully) will include the syntax changes from all the previous commits as well. So there shouldn't be any inconsistencies between the file changes and the latest build of the documentation. I checked this earlier to make sure everything formatted correctly on the website and all the changes were present.
Condensed the description of the vertical levels and layers. Added the field names in the AIRS files which define them. Changed 'mb' to 'hPa'. Removed warnings in the namelist table about directory names in l2_files names. Example contents have ../data/file_name. Utilities_mod.f90 says "!> We agreed to support filenames/pathnames up to 256." I found no checks on whether there's a directory in the pathname, and no directory name is supplied to add to the l2_files elements. Updated the namelist to specify the default version to be 7.
Co-authored-by: kdraeder <[email protected]>
…e of the program is convert_amsu_L1, not convert_amsua_L1
Create citation.cff for DART
Recent WRF-Hydro Developments
All ready to go. Good work on this everyone!!
Description:
The AIRS and AMSU-A converters had some out-of-date build suggestions for HDF-EOS, and some scripts that reference the HPSS (decommissioned in 2021).
DART requires a patch to the HDFEOS library to successfully run the converters.
This pull request updates the documentation for building the converters using the patched library, updates the mkmf.templates, and points people to the DART-built hdfeos libraries on Derecho.
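A rough, non-authoritative sketch of that build flow on Derecho; the template file name is illustrative, the library path is the one documented in this PR, and the exact mkmf.template edits are left to the updated documentation:
cd DART/build_templates
cp mkmf.template.intel.linux.x86_64 mkmf.template       # choose a compiler template (name illustrative)
# point the HDF4/HDF-EOS library paths at /glade/campaign/cisl/dares/libraries/
cd ../observations/obs_converters/AIRS/work
./quickbuild.sh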
I've removed the unnecessary entries from obs_def_rttov_nml in the input.nml. Only the two options are needed for the converter. rttov12 and rttov13 have different namelist options, so running rttov13 with the whole rttov12 namelist causes an error when reading the obs_def_rttov_nml namelist.
L1_AMSUA_to_netcdf.f90 does not work. I've removed references to it. I think the code should be removed. Edit: I have removed it.
DART/observations/obs_converters/AIRS/L1_AMSUA_to_netcdf.f90
Lines 60 to 62 in 77bb8c2
Fixes issue
Fixes #590
Fixes #534
Types of changes
Documentation changes needed?
Tests
Please describe any tests you ran to verify your changes.
Checklist for merging
Checklist for release
Testing Datasets