Add a job to prepare ioda format dumps for use in atmospheric UFS-DA #1826
Conversation
I think we need more information about what the exscript is actually doing, especially with the $COM_OBS directory. In operations, that directory does NOT fall under GFS (it is an obsproc directory). If the script is placing output there (and I don't see any other COM directories being defined), that is probably a no-go. It might be okay temporarily for development, but we need to start thinking about making all this new JEDI stuff ops-ready.
A comment was added to issue #1820 briefly describing the exscript. GDASApp PR #575 provides additional details regarding bufr2ioda processing. I believe you are correct: the bufr to ioda processing in this PR is a temporary step for UFS-DA prototyping. Please contact @CoryMartin-NOAA, @emilyhcliu, and @ilianagenkova for long-term (i.e., operational) plans for processing observations using UFS-DA software. Is the above sufficient for the time being? You make a valid point, @WalterKolczynski-NOAA, regarding the relationship between ObsProc and GFS. g-w has long exercised pieces of ObsProc via …
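To make the directory concern concrete, here is a minimal, hypothetical sketch of keeping the obsproc-owned dump directory read-only while writing converted ioda files into a workflow-owned location. Every path and variable name below is an assumption for illustration, not the actual exscript in this PR.

```bash
#! /usr/bin/env bash
# Illustrative sketch only (assumed names, not the PR's exscript):
# read bufr dumps from the obsproc-owned directory, write ioda output
# into a directory owned by the global-workflow COM tree.
set -eu

ROTDIR=${ROTDIR:-${PWD}/comrot}     # workflow COM root (placeholder)
RUN=${RUN:-gdas} PDY=${PDY:-20210814} cyc=${cyc:-18}

# obsproc-owned dump directory: read-only from the workflow's point of view
COM_OBS="${ROTDIR}/${RUN}.${PDY}/${cyc}/obs"

# workflow-owned directory for converted ioda files (hypothetical name)
COM_OBS_IODA="${ROTDIR}/${RUN}.${PDY}/${cyc}/atmos/obs_ioda"
mkdir -p "${COM_OBS_IODA}"

# The GDASApp bufr2ioda converters (see GDASApp PR #575) would read the
# satwnd bufr dump from ${COM_OBS} and write ioda output into
# ${COM_OBS_IODA}; the converter invocation itself is omitted here.
```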
I have several questions on this PR:
- see inline specific comments
- how does the ioda processing for atmospheric observations in general fit within the scope of the global-workflow? Without knowing what that scope or expanse is, it is hard to provide a robust review; this work may also overlap with similar activities that fall under "prep ioda observations for JEDI applications"
Thank you @aerorahul for your feedback. @CoryMartin-NOAA and @emilyhcliu will need to reply to some of your questions. I took their development and added hooks to exercise it from g-w. You ask excellent design questions which need to go back to the source developers.
@aerorahul as previously discussed, much of what is here is intended as a stopgap for prototype development and not intended for the final operational implementation. If you would prefer that we take all observation processing out of the workflow and just have the workflow handle ln/cp, we can do so.
That is only partially true, @CoryMartin-NOAA.
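As a point of reference, the "ln/cp only" alternative mentioned above would amount to something like the following sketch, in which the workflow merely stages ioda files that were converted elsewhere. The paths and file patterns are assumptions for illustration.

```bash
#! /usr/bin/env bash
# Hypothetical staging-only job: link (or copy) pre-converted ioda files
# into the job run directory instead of running the conversion inside g-w.
set -eu

IODA_SRC=${IODA_SRC:-${PWD}/preconverted_ioda}   # produced outside g-w (placeholder)
DATA=${DATA:-${PWD}/rundir}                      # job run directory (placeholder)
mkdir -p "${DATA}/obs"

for f in "${IODA_SRC}"/*.nc; do
    [[ -e "${f}" ]] || continue                  # nothing to stage
    ln -sf "${f}" "${DATA}/obs/" || cp -p "${f}" "${DATA}/obs/"
done
```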
Rerun 20210814 18 gdas |
Note: Once GDASApp PR #601 is merged into GDASApp …
…H in prepatmiodaobs.sh (NOAA-EMC#1820)
Test GDASApp branch: The last update I am aware of for this PR is to update the GDASApp hash that g-w points at. Does this PR require any additional revisions besides the updated GDASApp hash?
GDASApp PR #601 was merged into GDASApp. g-w now points at the updated GDASApp. Done at 339c6a0.
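For anyone checking the result, a small sketch (not part of the PR) of confirming which GDASApp commit a g-w clone points at; the sorc/gdas.cd checkout path is an assumption here.

```bash
# Assumed layout: GDASApp checked out under sorc/gdas.cd of a g-w clone.
cd /path/to/global-workflow              # placeholder path
git -C sorc/gdas.cd log -1 --oneline     # should show the merged GDASApp commit
git log -1 --oneline                     # g-w commit recording the update (e.g. 339c6a0)
```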
Hera and Orion tests: Work for this PR is done apart from possible requests from reviewers. Re-reviews requested. @WalterKolczynski-NOAA, do I need to trigger any of the Orion or Hera CI tests using labels?
This is an action we reserve for the g-w code managers. We'll take care of that if/when necessary.
Got it. Thank you @WalterKolczynski-NOAA for the clarification. Is there anything you or @aerorahul need me to do for this PR? I'm done with the changes I intend to make.
Looks fine to me. Until CI exercises the UFS-DA jobs, I don't think there is much point in CI testing this PR.
The changes look good, with the exception that all scripts should be part of the global-workflow if they are executed in the global applications. This has been established for a while now.
Please advise as to the proper action to take to move this PR forward. Tagging @CoryMartin-NOAA, @emilyhcliu, and @guillaumevernieres for awareness.
@RussTreadon-NOAA
Thank you @aerorahul for approving this PR. Thank you @WalterKolczynski-NOAA for merging this PR into develop.
My input probably won't matter much by now, but I think both Cory and Rahul are right. Some observation pre-processing might fall within the g-w, but it has not been discussed in detail. Specifically for marine data, some decisions are still to be made. Perhaps a meeting at some point would answer some questions.
Description
Add a job to convert bufr format GDA dumps to ioda format for use in atmospheric UFS-DA.
Resolves #1820
Depends on
Type of change
This PR adds j-job JGLOBAL_PREP_IODA_OBS and rocoto driver prepiodaobs.sh to the suite of cycled gfs jobs. Changes are made in workflow scripts to generate the xml required to run prepiodaobs. The new job is added to config.resources along with the addition of a new config file, config.prepiodaobs. At present GDASApp only converts bufr satwnd dumps to ioda format. Additional bufr dump types will be added in the future.
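For readers unfamiliar with the g-w configuration layout, the sketch below suggests the general shape of a per-job config file and its matching resource entry; the settings, names, and values are illustrative assumptions, not the actual contents of config.prepiodaobs or config.resources in this PR.

```bash
#! /usr/bin/env bash
# Hypothetical sketch of a per-job config and its resource settings
# (placeholder values, not the PR's actual files).

# --- config.prepiodaobs (sourced by the j-job) ---
echo "BEGIN: config.prepiodaobs"
# Task-specific resources would normally come from config.resources, e.g.:
# . "${EXPDIR}/config.resources" prepiodaobs
echo "END: config.prepiodaobs"

# --- matching resource block, as it might appear in config.resources ---
step="prepiodaobs"
if [[ "${step}" = "prepiodaobs" ]]; then
    export wtime_prepiodaobs="00:10:00"   # wall-clock limit (placeholder)
    export npe_prepiodaobs=1              # MPI tasks (placeholder)
    export nth_prepiodaobs=1              # threads per task (placeholder)
fi
```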
Change characteristics
How has this been tested?
Checklist