
ROMS-CICE coupling for Bering Sea #5

Closed
saeed-moghimi-noaa opened this issue Oct 2, 2023 · 64 comments
Labels: enhancement (New feature or request)

Comments

@saeed-moghimi-noaa

saeed-moghimi-noaa commented Oct 2, 2023

@SmithJos13, @sdurski, @kurapov-noaa, @uturuncoglu, @pvelissariou1, @hga007

Description

OSU and OCS are working together toward coupling CICE and ROMS using the UFS-Coastal code base.

Required configurations

  • CDEPS-CICE
  • CDEPS-ROMS-CMEPS-CICE
@saeed-moghimi-noaa added the enhancement (New feature or request) label on Oct 2, 2023
@saeed-moghimi-noaa

FYI @BahramKhazaei-NOAA

@saeed-moghimi-noaa

Hi @uturuncoglu

Please add the steps that you suggested in the meeting for @SmithJos13 to look at.

Thanks,

@uturuncoglu

uturuncoglu commented Oct 2, 2023

@saeed-moghimi-noaa @SmithJos13 Thanks for creating this issue. Here are the items we need to check:

  • datm+cice configuration:

    • There are a couple of DATM-forced configurations under https://github.com/oceanmodeling/ufs-coastal/tree/develop/tests/tests, such as https://github.com/oceanmodeling/ufs-coastal/blob/develop/tests/tests/datm_cdeps_bulk_cfsr. That test also includes another component (MOM6). Using it as a reference, we need to create one for the Bering Sea (see the sketch after this list).
    • This requires setting up an offline CICE configuration that works for this domain.
    • Then, using the input files of the offline configuration, the new case can be built from scratch.
    • Some questions:
      • Which forcing will be used? Do we have an existing data mode on the CDEPS side to support it?
      • Which fields need to be passed to the coupled CICE configuration? The example case also includes MOM6, but we won't have it in the initial configuration, so any fields CICE requires from an active ocean need to be provided by a data model. Maybe we would need two data components, DATM and DOCN, for this.
  • datm+cice+roms configuration:

    • I know ROMS uses its internal CICE cap for coupling, but this application will use the external CICE and its cap. So, we need to find out the main differences between those caps.
    • Maybe we could start with the CICE cap provided by ufs-coastal and try to run with that. If we need any additional feature from the ROMS/CICE cap, it needs to be ported to the main CICE cap found in the UFS Weather Model and merged into the authoritative repository.
    • Again, we would not want any custom development under UFS Coastal; all CICE development needs to go into its authoritative repository. It is hard to work with multiple NUOPC caps and maintain them, and we would not want to break interoperability among the components.
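
A rough sketch of how the Bering Sea case could be seeded from the existing test (the new test name and the exact steps are assumptions and should be checked against the RT layout under tests/):

cd ufs-coastal/tests
cp tests/datm_cdeps_bulk_cfsr tests/datm_cdeps_bering_cice   # hypothetical new test name
# Edit the new test file to point at the Bering Sea CICE grid, initial conditions,
# and the chosen forcing, drop the MOM6 pieces, and register the test in rt.conf.
# Then run it the same way rt.sh is invoked later in this thread:
./rt.sh -k -n datm_cdeps_bering_cice intel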

@hga007

hga007 commented Oct 2, 2023

@saeed-moghimi-noaa @uturuncoglu, Yes, I agree with Ufuk. It is much better to use the CICE component available under the UFS. I am unsure what version of CICE it is, but I assume it will be easier to update to CICE6.

We want to add a test case for the CICE-ROMS coupling and ROMS native sea-ice model. I already built the grid for Lake Erie.

@uturuncoglu

@hga007 JFYI, it's CICE6 and compatible with CMEPS.

@saeed-moghimi-noaa

@uturuncoglu @pvelissariou1 @SmithJos13

Hi Ufuk

Do you have any recommendation for Joe on where he can start to compile CICE plus other components needed for the initial test?

Thanks,
-Saeed

@uturuncoglu

@saeed-moghimi-noaa At this point, we have no application under UFS-Coastal that builds CICE, CMEPS, CDEPS, and ROMS together. Let me create that. At least that will let us create the executable, which can then be used for different setups. I suggest the following steps: (1) start with DATM+CICE first (I also need to update CMEPS to allow this coupling), (2) then include ROMS, like DATM+CICE+ROMS but without any connection between CICE and ROMS, and (3) create the connection between CICE and ROMS. This allows us to build the configuration from simple to more advanced, see the missing pieces, and fill them in.

@SmithJos13

Hi @uturuncoglu,

I have started trying to get the most recent version of the ESMF library compiled on a local machine that we have at OSU, and I have been running into some issues. The system I'm trying to compile it on is an Intel machine running Linux. The Fortran/C++/C compilers I am working with are Intel v19.1.0 (they are all from the same release), and I'm trying to compile ESMF using OpenMPI 3.1.3.

I have set the ESMF_DIR variable to the directory where I downloaded the git repository, and I have set ESMF_OS=linux, ESMF_COMPILER=intel, and ESMF_COMM=openmpi.

I have set the paths to all my compilers under the openmpi heading in '~/build_config/Linux.intel.default/build_rules.mk'.
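
For reference, the overall sequence I have been attempting is basically the standard ESMF source build (the install prefix and job count below are placeholders for my setup):

export ESMF_DIR=$HOME/esmf            # top of the cloned ESMF repository (placeholder)
export ESMF_COMPILER=intel
export ESMF_COMM=openmpi
export ESMF_BOPT=O                    # optimized build; ESMF_OS is normally auto-detected
export ESMF_INSTALL_PREFIX=$HOME/esmf-install
make -j8                              # build the library
make check                            # optional: run the self-tests
make install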

The issue I have been having is that the makefile does not seem able to locate an object file called 'binary.o'. The error is:

'g++: error: minimal: No such file or directory'

I have checked in the folder and there is no file by that name there, which I suppose makes sense given the error I am getting. I'm wondering if there is another variable I need to set or enable when trying to compile with OpenMPI, or whether something isn't getting copied that should be? I'd appreciate any suggestions you have on how I can fix this.

Since I was having some issues with building the OpenMPI version of the library, I tried to compile the mpiuni version as a test case to make sure all my compilers were in order, and I was able to build that successfully. However, I ran into some errors during the verification step of the installation. When I run

'make ESMF_TESTEXHAUSTIVE=ON unit_tests_uni'

I get the following error:

/home/server/pi/homes/smithj28/esmf/src/Infrastructure/FieldBundle/src/ESMF_FieldBundle.F90:92.46:

character(len=:), allocatable:: encodeName! storage for packed FB

I'll need to look into this one a bit further to determine what is going on here. Maybe you have some ideas as well?

I look forward to hearing what you think might be going on. Hopefully I didn't bombard you with too many questions...

Thanks,
Joseph

@hga007

hga007 commented Nov 1, 2023

@SmithJos13, @uturuncoglu. In the past, we had a lot of issues when ifort and g++ (or gcc) were combined. We needed to get the appropriate versions. Do you have access to icc? Also, OpenMPI complicates matters. Nowadays, we prefer spack-stack, but it takes experience to install it.

@pvelissariou1

@SmithJos13 If possible, use Intel > 20. As @hga007 mentioned, we had similar problems using Intel 19.x versions on certain HPC clusters, both during compilation and at run time.

@uturuncoglu

@SmithJos13 You might also want to check the following web page: https://oceanmodeling.github.io/ufs-coastal-app/versions/main/html/porting.html It mainly explains how to install spack-stack and use it to run UFS Coastal. It would be a good starting point for porting the model to your platform. If you try it and have any issues, let me know.

@saeed-moghimi-noaa

Hi @SmithJos13

Would you please post an update here on where you are with the ROMS-CICE coupling?

Thanks,
-Saeed

@sdurski

sdurski commented Dec 19, 2023

Hi Saeed,

Joey is away this week. So I will try to answer.

We have not had success building the NUOPC coupling layer on our local compute server. The rather dated OS on this server precludes installation of proper versions of some prerequisites for NUOPC. We are currently splitting the server in two so that the OS can be upgraded (this week) on one of its boxes; then current versions of all the dependencies can be installed. We expect to be able to use the suggested spack-stack approach to get everything built. Our computer support staff have already looked into that portion of the process and built components successfully.

We've asked about access to NOAA resources where NUOPC is already built to speed this process, but have not heard back yet. Running the coupled ROMS-CICE for the Bering Sea application will very likely require more resources than the single box of our compute server can handle, so I think getting that access will be important either way.

We have a standalone version of CICE 'fully optimized' for the Bering Sea application, and it is producing quite good results (when forced from above with ERA5 reanalysis and from below with ROMS Bering Sea model output). I suspect that running a coupled ROMS-CICE using the MCT coupler in COAWST would be straightforward, as we already have COAWST, with that old style of coupling, up and running on a NASA machine we have access to. But I understand that NUOPC is a central piece of what we're trying to test here.

I hope this helps. Please let me know if you have any other questions.

Scott

@saeed-moghimi-noaa

@sdurski

Hi Scott,

Thanks for the update. I have requested HPC access for you and followed up on that several times. I will try again today.

Thanks,
-Saeed

@SmithJos13

Hi @uturuncoglu,

Our system here at OSU has finally been upgraded, which means I have been able to try installing the UFS model dependencies with Docker + spack-stack. I've followed the directions as laid out in the link you sent (https://oceanmodeling.github.io/ufs-coastal-app/versions/main/html/porting.html); however, these seem to build the UFS model with GCC rather than the Intel compilers. I was under the impression that we wanted to build the UFS model with Intel compilers (maybe I misunderstood). Will we be able to adapt this method to Intel compilers if we want to, or are there already instructions for that?

Anyway, I have followed the directions as they are laid out, and I was unsuccessful in building the model. I've run into an issue with building the dependencies; see the following two screenshots for the exact error:

(Screenshots: Screenshot 2024-01-04 at 10 55 27 AM, Screenshot 2024-01-04 at 10 56 05 AM)

Do you have any idea what might be the issue?

Also, I think the following line in the directions might have a typo. In Section 3, in the note about the proj error and how to fix it, there is a line that reads

git cherry-pick upsream 149d194 (and resolve conflicts)

I think that it should be

git cherry-pick 149d194 (and resolve conflicts)

The terminal didn't seem too happy about 'upsream' being there.

Thanks,
Joey

@pvelissariou1

Hi @SmithJos13 Joseph, for the argument mismatch in gcc/gfortran versions >= 10, you need to pass the flags "-fallow-argument-mismatch -fallow-invalid-boz" to the compiler until the offending Fortran code is fixed. I don't know why you need to build MAPL anyway.
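
If a patch ends up not being needed, one hedged alternative is to attach the flags to the package's spec, since Spack's compiler wrappers should then inject them when that package is built (mapl is just the package reported in your screenshots):

spack install mapl fflags="-fallow-argument-mismatch -fallow-invalid-boz"
# In a spack-stack environment the same fflags can be added to the mapl entry
# in the environment's spec list before concretizing.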

@SmithJos13

Thanks for the quick reply! Yeah, I'm not sure about that either; I think it might be a dependency of a dependency or something like that. So is there a particular place in spack where I need to feed this argument?

@pvelissariou1

@SmithJos13 I went quickly through the spack-stack code tree and found that other packages had the same issue. You might need to create a patch for MAPL. If you look at spack/var/spack/repos/builtin/packages/parallelio/gfortran.patch you might get a clue on how to incorporate the patch. I cannot help more; I haven't worked with spack/spack-stack yet and I don't know the details. @uturuncoglu can surely help more.

@hga007

hga007 commented Jan 4, 2024

I always work with ifort. For ROMS-CICE coupling, we need to enhance the export/import states to include sea-ice field exchanges in the ROMS standalone NUOPC cap module for the UFS (cmeps_roms.h) and add more entries to the coupling YAML file (roms_cmeps_*.yaml). I added plenty of documentation in https://github.com/myroms/roms_test/blob/main/IRENE/Coupling/roms_data_cmeps/Readme.md

@pvelissariou1

@hga007 Hi Hernan, I was trying to build ROMS and the other models in UFS-Coastal, and I encountered the following error (a name conflict between libraries):

/apps/oneapi/compiler/2022.0.2/linux/compiler/lib/intel64_lin/libifport.a(abort.o): In function `abort_':
abort.c:(.text+0x20): multiple definition of `abort_'
ROMS-interface/ROMS/libROMS.a(abort.f90.o):/scratch2/STI/coastal_temp/save/Panagiotis.Velissariou/ufs-coastal-coastal_app/tests/build_fv3_coastal/ROMS-interface/ROMS/f90/abort.f90:1: first defined here

I don't know if you have seen this before, but it might be a good idea to rename "SUBROUTINE abort" in ./Utility/abort.F to something like roms_abort to avoid conflicts?

@SmithJos13

@pvelissariou1 I was looking at the file you pointed me toward, and it seems like some of the flags you listed are already incorporated in that file. I've tried adding "-fallow-invalid-boz" to the other flags as well, and I'm still running into the same issue. Is there some particular modification I need to make to this file to get it to recognize these flags? (I know you just said you haven't worked with spack-stack before, so maybe @uturuncoglu can shed some light on this.)

@SmithJos13

Actually, I was poking around in some of the files in spack/var/spack/repos/builtin/packages/mapl/, and in spack/var/spack/repos/builtin/packages/mapl/package.py it looks like there is already some form of a patch to handle the gfortran argument mismatch; see the screenshot below:

(Screenshot 2024-01-04 at 12 36 29 PM)

So maybe there is something else causing the issue?

@uturuncoglu

@SmithJos13 First of all, I have not tested a custom spack-stack installation with the Intel compiler yet. I have a version of the Dockerfile that creates the Docker image and is able to run the RTs, but that is again with GNU. I could push that file to the ufs-coastal-app repo so you could see the commands I am using, but installing on an existing system could require some changes. If I remember correctly, I also have a custom spack-stack installation that works on Orion. But again, it is hard to automate everything and make it work on every custom platform; there will always be system-specific issues. I could also try to extract the information from the Dockerfile and create a new dependency installation script, but I need to work on it.

@uturuncoglu

@SmithJos13 BTW, spack-stack 1.5.1 is working for me (this will come with ESMF 8.5.0). They are also planning to release 1.6.0 to fix some issues, but I am not sure about its timeline.

@hga007

hga007 commented Jan 4, 2024

For us, spack-stack 1.5.1 with JEDI and UFS support works well on my Linux box using Intel 2021.8.0 and ESMF 8.5.0. In our JEDI meeting today, they started talking about spack-stack 1.6.0. Dave Robertson is working to install 1.5.1 on our other computers, which is not trivial, and he has lots of experience. I hope that spack-stack will be more stable in the future and require less frequent updates.

@uturuncoglu

@hga007 Maybe I could try to create a script to automate the installation. Then you could try it on your side with the help of Dave Robertson. If we need some change, we could incorporate it and update the script. Do you think that can be done? I know there might always be some manual intervention needed in the script for a specific GNU or Intel version, but we could do our best.

@SmithJos13

Thanks @uturuncoglu @hga007. I suppose I'll try to work on installing the 1.5.1 version of spack-stack. I'm a little new to using GitHub... so if I want to do that, do I just have to check out the 1.5.1 version of spack-stack, run "git submodule update --init --recursive", and then proceed with the install instructions?
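
In other words, something like the following? (I'm assuming the release tag is named spack-stack-1.5.1; I still need to double-check that.)

git clone --recursive https://github.com/JCSDA/spack-stack.git
cd spack-stack
git checkout spack-stack-1.5.1            # assumed tag name for the 1.5.1 release
git submodule update --init --recursive   # re-sync the submodules for that tag
source setup.sh                           # then continue with the install instructions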

@uturuncoglu

@SmithJos13 Yes. Let me know if you need help.

@uturuncoglu

@SmithJos13 The installation process at https://oceanmodeling.github.io/ufs-coastal-app/versions/main/html/porting.html is not up to date and has a couple of minor typos, so please stick to the Dockerfile approach. In that case, I am using Lmod.

@SmithJos13

@uturuncoglu Okay, I won't look at the instructions at that link. Setting aside the Tcl-vs-Lmod issue, I think I have installed the dependencies using spack-stack properly. Running the command spack module lmod refresh returns the following:

==> You are about to regenerate lmod module files for:

-- linux-ubuntu22.04-zen2 / [email protected] --------------------------
qrlj37n [email protected]         gkbs7qp [email protected]         uxz7xo7 [email protected]    4dz5yiq [email protected]
as3bga3 [email protected]      4rtleko [email protected]           x2vc3ui [email protected]          zbq53zl [email protected]
vots4p7 [email protected]         7vwz2xv [email protected]          7ogdbbl [email protected]         bksx7tp [email protected]
wpwhcsg [email protected]      n4ju5jy [email protected]         ebu7o6q [email protected]           aqdjzwj [email protected]
ervjwhi [email protected]        kwrd27a [email protected]  ddft6li [email protected]           y7ynxfj [email protected]
4er464d [email protected]          samj44u [email protected]       b3mwaoe [email protected]          pbm3x2h [email protected]
iyenx7h [email protected]_emc  azt6nfe [email protected]            biqw4cd [email protected]  dre7ujh [email protected]
glxzbwd [email protected]          rmih7z7 [email protected]         hug3ikc [email protected]       npora6d [email protected]
kr7idsa [email protected]          aguvrlm [email protected]         nmdrd6j [email protected]             zgkz6ze [email protected]
bkrk5p3 [email protected]         lwofrkt [email protected]   5c6shhb [email protected]             quztabs [email protected]
wy6wzsw [email protected]     ayvxyiw [email protected]           sc5nq7w [email protected]       6iyjv5p [email protected]
x2e3jnb [email protected]      mmmtzee [email protected]     fmsa4r2 [email protected]              jsn5ytq [email protected]
xfqo3m5 [email protected]            v752wfl [email protected]         ubzllqd [email protected]     cgmzjgp [email protected]
o2y6hf4 [email protected]       ct5ujzg [email protected]      skkztha [email protected]       cqpjsma [email protected]
5m2u4mi [email protected]           3z6g2o5 [email protected]      3ekcbll [email protected]       5sd3vbs [email protected]
graz6k4 [email protected]      x7n3gcf [email protected]        kllgbjx [email protected]     vuimzqi [email protected]
3nmv5ub [email protected]         oglwmae [email protected]         ppdprun [email protected]         yvlksui [email protected]
rvqsfku [email protected]   xi4kgpz [email protected]             oal2zan [email protected]            oal2e5r [email protected]
zw7asim [email protected]          gdcpskb [email protected]             2whb7v2 [email protected]         rcywnkc [email protected]
gxmpvaf [email protected]       fo4azyx [email protected]           5dkshvs [email protected]     s7gfo5q [email protected]
j5elfyu [email protected]         jtlneao [email protected]         gczw7pf [email protected]        4jezwnc [email protected]
ongf6u4 [email protected]         n4r5nrz [email protected]           insttai [email protected]        phod4zk [email protected]
xfutlqu [email protected]            6u2x5ao [email protected]        ihlj53n [email protected]         e4f27s5 [email protected]
==> Do you want to proceed? [y/n] y
==> Regenerating lmod module files

which appears to be a complete list of all the dependencies that I need in order to start porting the UFS model. The command spack stack setup-meta-modules returns:

Configuring basic directory information ...
  ... script directory: /opt/spack-stack/spack/lib/jcsda-emc/spack-stack/stack
  ... base directory: /opt/spack-stack/spack/lib/jcsda-emc/spack-stack
  ... spack directory: /opt/spack-stack/spack
Configuring active spack environment ...
  ... environment directory: /opt/spack-stack/envs/ufs.local
Parsing spack environment main config ...
  ... install directory: /opt/ufs.local
Parsing spack environment modules config ...
  ... configured to use lmod modules
  ... module directory: /opt/ufs.local/modulefiles
Parsing spack environment package config ...
  ... list of possible compilers: '['[email protected]', 'gcc', 'intel', 'pgi', 'clang', 'xl', 'nag', 'fj', 'aocc']'
  ... list of possible mpi providers: '['[email protected]', 'openmpi', 'mpich']'
['openmpi', 'module-index.yaml', 'gcc', 'Core']
 ... stack compilers: '{'gcc': ['11.4.0', '11.4.0']}'
 ... stack mpi providers: '{'openmpi': {'4.1.5': {'gcc': ['11.4.0', '11.4.0']}}}'
  ... core compilers: ['[email protected]']
Preparing meta module directory ...
  ... meta module directory : /opt/ufs.local/modulefiles/Core
Creating compiler modules ...
  ... configuring stack compiler [email protected]
  ... ... CC  : /usr/bin/gcc
  ... ... CXX : /usr/bin/g++
  ... ... F77 : /usr/bin/gfortran
  ... ... FC' : /usr/bin/gfortran
  ... ... COMPFLAGS: 
  ... ... MODULELOADS: 
  ... ... MODULEPREREQS: 
  ... ... MODULEPATH  : /opt/ufs.local/modulefiles/gcc/11.4.0
  ... writing /opt/ufs.local/modulefiles/Core/stack-gcc/11.4.0.lua
  ... configuring stack mpi library [email protected] for compiler [email protected]
  ... ... MODULELOADS: load("openmpi/4.1.5")
  ... ... MODULEPREREQS: prereq("openmpi/4.1.5")
  ... ... MODULEPATH  : /opt/ufs.local/modulefiles/openmpi/4.1.5/gcc/11.4.0
  ... writing /opt/ufs.local/modulefiles/gcc/11.4.0/stack-openmpi/4.1.5.lua
  ... configuring stack mpi library [email protected] for compiler [email protected]
  ... ... MODULELOADS: load("openmpi/4.1.5")
  ... ... MODULEPREREQS: prereq("openmpi/4.1.5")
  ... ... MODULEPATH  : /opt/ufs.local/modulefiles/openmpi/4.1.5/gcc/11.4.0
  ... writing /opt/ufs.local/modulefiles/gcc/11.4.0/stack-openmpi/4.1.5.lua
  ... using spack-built python version 3.10.8
 ... stack python providers: '{'python': ['3.10.8']}'
  ... configuring stack python interpreter [email protected] for compiler [email protected]
  ... writing /opt/ufs.local/modulefiles/gcc/11.4.0/stack-python/3.10.8.lua

which is successful. Then I'm trying to load the module files using Environment Modules [module --version returns Modules Release 5.3.1 (2023-06-27)]. I've tried the following command to point module at the directory where these module files are stored: module use opt/ufs.local/modulefiles/gcc/11.4.0/ and then ran module refresh. Typing module avail shows that there are no available modules from that directory (see below):

----------------------------------- /usr/local/Modules/modulefiles ----------------------
dot  module-git  module-info  modules  null  use.own  

Key:
modulepath  

which hasn't changed since I installed Environment Modules in Docker.

I'm struggling to figure out how to load the modulefiles I have created using spack so that I can use Environment Modules inside Docker.

Are there further steps I need to take to load the modules? Do I need to use spack to load them? Also, are there any special instructions I need to follow in order to port the UFS model to my local Linux box, or should I be able to follow the steps listed at the following site: https://ufs-weather-model.readthedocs.io/en/latest/BuildingAndRunning.html
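
For context, my current understanding (which may be wrong) is that the generated meta-modules are meant to be consumed roughly like this, using the paths from the setup-meta-modules output above and assuming an Lmod-style module command:

module use /opt/ufs.local/modulefiles/Core
module load stack-gcc/11.4.0
module load stack-openmpi/4.1.5
module load stack-python/3.10.8
module avail    # the spack-built packages should now show up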

Thanks for all your help!

@uturuncoglu

@SmithJos13 That is great, really good progress. Since this is a custom machine, you need to create a set of module files. Are you planning to run the regression tests too? If so, you will also need to make a couple of changes to run them. There are some commits related to the land component, which I made for the AMS short course, at the following link for running RTs under a Docker container:

https://github.com/uturuncoglu/ufs-weather-model/commits/feature/ams_course/

Please look at the commits starting with Dec. 5. They should give you some guidance. If you cannot make it work, then we could have a call and try it together.

@uturuncoglu

@rjdave I was looking at your instructions and I realized that you are not using the system-provided MPI. Am I wrong? It seems that you are building OpenMPI with the Intel compiler (system-provided module). If so, these instructions will also need an additional step to add the system-provided MPI to the spack externals YAML file. Anyway, I am very close to finalizing the script for dependency installation. Once I have tested it on the existing platforms, I'll let you know.

@hga007

hga007 commented Jan 10, 2024

Dave lost electricity at his house because of yesterday's storm, so he doesn't have internet. Yes, we use the OpenMPI versions on our computers to be consistent with all the other libraries that depend on them. When updating spack-stack, I believe he first tries to update the needed modules; we did that recently with atlas 0.35.0. I assume he does the same with OpenMPI. Please remember that MPI libraries are usually tuned to the particular architecture; therefore, we shouldn't build them from scratch.

@SmithJos13

SmithJos13 commented Jan 10, 2024

@uturuncoglu

So does everything look like it was successful, based on the information I provided?

Also, do you have any examples of how these module files were created for other environments that I could work from? If there are any resources you recommend I look at for doing this, I'd appreciate it. I think once these module files are built, I can proceed with trying to install the UFS model!

I'm a fairly novice Unix user, so I've been learning a lot going through this process (hence all the questions).

Thanks!

@uturuncoglu

@SmithJos13 Here is an example that I am using under the Docker container: https://github.com/uturuncoglu/ufs-weather-model/blob/feature/ams_course/modulefiles/ufs_local.gnu.lua You could create a new one for your system.

In my case, I am running the regression tests under Docker (it has Slurm and also a spack-stack installation) like the following:

export MACHINE_ID="local"
export USER="root"
cd /opt
git clone -b feature/ams_course --recursive https://github.com/uturuncoglu/ufs-weather-model.git
cd ufs-weather-model/tests
source /etc/profile.d/lmod.sh
./rt.sh -a debug -k -n datm_cdeps_lnd_era5 gnu

Of course, this is for a different configuration not related to the coastal app, and it also uses the feature/ams_course fork of the ufs-weather-model, which has extra changes related to the RT system, but the idea is the same.

You might be able to compile one of the coastal-specific configurations like the following (if you name your module file ufs_local.gnu.lua), but I think you will still need to adapt the RT system to make it work.

./compile.sh "ufs_local " "-DAPP=CSTLF -DCOORDINATE_TYPE=SPHERICAL -DWET_DRY=ON" coastal gnu NO NO

@pvelissariou1 @saeed-moghimi-noaa I am not sure, but we might raise an issue with the ufs-weather-model developers to get a more flexible RT system that allows bringing in new machines. Anyway, we could discuss it more in our next meeting.

@SmithJos13

SmithJos13 commented Jan 11, 2024

@uturuncoglu Thanks for the information.
I think I've made a little more progress on this. I've been able to clone a clean version of the ufs-weather-model into a working directory on my local machine, i.e. git clone --recursive https://github.com/ufs-community/ufs-weather-model.git ufs-weather-model. I then made a module file for my local machine in modulefiles/ufs_local.gnu.lua, which contains all the information in the link that you supplied (https://github.com/uturuncoglu/ufs-weather-model/blob/feature/ams_course/modulefiles/ufs_local.gnu.lua).

Then I sourced source /etc/profile.d/lmod.sh so that I have the Lua version of environment modules (Lmod) running (I might add this to my .bashrc in the future so I can skip this step). I then specified the folder where the modules are located using module use /opt/ufs-weather-model/modulefiles. Then calling module avail yields:

---------------------------------------------------------- /usr/share/lmod/lmod/modulefiles ----------------------------------------------------------
   Core/lmod/6.6    Core/settarg/6.6

Use "module spider" to find all possible modules.
Use "module keyword key1 key2 ..." to search for all possible modules matching any of the "keys".

root@8a7cc2fd8a3a:/# module use /opt/
/opt/intel              /opt/modules-5.3.1      /opt/scratch            /opt/spack-stack        /opt/ufs-weather-model  /opt/ufs.local
root@8a7cc2fd8a3a:/# module use /opt/ufs-weather-model/
/opt/ufs-weather-model/.git                /opt/ufs-weather-model/FV3                 /opt/ufs-weather-model/doc
/opt/ufs-weather-model/.github             /opt/ufs-weather-model/GOCART              /opt/ufs-weather-model/driver
/opt/ufs-weather-model/AQM                 /opt/ufs-weather-model/HYCOM-interface     /opt/ufs-weather-model/modulefiles
/opt/ufs-weather-model/CDEPS-interface     /opt/ufs-weather-model/MOM6-interface      /opt/ufs-weather-model/stochastic_physics
/opt/ufs-weather-model/CICE-interface      /opt/ufs-weather-model/NOAHMP-interface    /opt/ufs-weather-model/tests
/opt/ufs-weather-model/CMEPS-interface     /opt/ufs-weather-model/WW3                 
/opt/ufs-weather-model/CMakeModules        /opt/ufs-weather-model/cmake               
root@8a7cc2fd8a3a:/# module use /opt/ufs-weather-model/modulefiles/
root@8a7cc2fd8a3a:/# module avail

--------------------------------------------------------- /opt/ufs-weather-model/modulefiles ---------------------------------------------------------
   ufs_acorn.intel      ufs_expanse.intel    ufs_hercules.gnu      ufs_linux.intel     ufs_noaacloud.intel    ufs_stampede.intel
   ufs_common           ufs_gaea-c5.intel    ufs_hercules.intel    ufs_local.gnu       ufs_odin               ufs_wcoss2.intel
   ufs_derecho.gnu      ufs_hera.gnu         ufs_jet.intel         ufs_macosx.gnu      ufs_orion.intel
   ufs_derecho.intel    ufs_hera.intel       ufs_linux.gnu         ufs_macosx.intel    ufs_s4.intel

---------------------------------------------------------- /usr/share/lmod/lmod/modulefiles ----------------------------------------------------------
   Core/lmod/6.6    Core/settarg/6.6

Use "module spider" to find all possible modules.
Use "module keyword key1 key2 ..." to search for all possible modules matching any of the "keys".

So the module system is able to recognize the module files supplied by the ufs-weather-model. Then trying module load ufs_local.gnu yields:

Lmod has detected the following error:  Unable to load module: python/3.10.8
     /opt/ufs.local/modulefiles/gcc/11.4.0/python/3.10.8.lua : [string "-- -*- lua -*-..."]:20: attempt to call global 'depends_on' (a nil value)

While processing the following module(s):
    Module fullname      Module Filename
    ---------------      ---------------
    python/3.10.8        /opt/ufs.local/modulefiles/gcc/11.4.0/python/3.10.8.lua
    stack-python/3.10.8  /opt/ufs.local/modulefiles/gcc/11.4.0/stack-python/3.10.8.lua
    ufs_local.gnu        /opt/ufs-weather-model/modulefiles/ufs_local.gnu.lua

I suspect the issue is that I need to adapt the module file a little more for my machine, but I'm not really sure what needs to be modified in modulefiles/ufs_local.gnu.lua just from looking at it (besides the paths to the compilers).

@rjdave

rjdave commented Jan 11, 2024

@rjdave I was looking at your instructions and I realized that you are not using the system-provided MPI. Am I wrong? It seems that you are building OpenMPI with the Intel compiler (system-provided module). If so, these instructions will also need an additional step to add the system-provided MPI to the spack externals YAML file. Anyway, I am very close to finalizing the script for dependency installation. Once I have tested it on the existing platforms, I'll let you know.

Since I have only been building spack-stack on single-node machines, I have been allowing spack-stack to compile Open MPI. I am currently working on getting spack-stack set up on our university cluster, so I will experiment with the necessary steps to use the system-provided MPI.
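
Presumably that will involve registering the cluster's MPI as a Spack external before concretizing; something along these lines, though I haven't verified it on our cluster yet:

module load openmpi                 # or however the cluster exposes its MPI
spack external find openmpi         # records it as an external in packages.yaml
spack config add packages:openmpi:buildable:false
spack concretize --force            # re-resolve the environment against the external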

@uturuncoglu

@pvelissariou1 @janahaddad @saeed-moghimi-noaa I am planning to move this to the ROMS repo since it is related to ROMS. I'll also open a new ticket for ROMS-CICE coupling in general.

@pvelissariou1

@uturuncoglu Yes, it is a good idea to separate all these. If we keep "ROMS-CICE coupling for Bering Sea", then this issue/discussion should apply to this specific project only. I suggest opening an "Implementation of ROMS+CICE coupling" issue instead. We have the same problem with some of our GitHub issues, where they get polluted with unrelated discussions. @janahaddad and I have already talked about this; let's see what the best strategy will be to address it.

@uturuncoglu transferred this issue from oceanmodeling/ufs-weather-model on Mar 20, 2024
@uturuncoglu

Okay. I transferred this to ROMS repo.

@janahaddad moved this from In Progress to Todo in the ufs-coastal project on Apr 1, 2024
@janahaddad moved this from Todo to Backlog in the ufs-coastal project on May 20, 2024
@SmithJos13

Hi @uturuncoglu

I was wondering what the progress is on developing a configuration for the ufs-coastal model. I have been able to download and build ufs-coastal on Orion, but I don't see an option to build a CICE+ROMS+CDEPS-only coupled configuration. I was wondering if there is a hidden option or command that I need to use?

Thanks,
Joey

@pvelissariou1

@SmithJos13 Unfortunately, there is no CICE+ROMS+CDEPS test case in ufs-coastal yet.

@SmithJos13

@pvelissariou1 I think there might be a misunderstanding here. I was under the impression that CICE+ROMS+CDEPS had been built/implemented but hadn't been tested. I'm not really looking for a test case; instead, I'd like to build the model in this configuration and start testing it in the Bering Sea. When I have attempted this in the past, there was no preconfigured app that would allow me to build the model in this configuration. If the model is not ready to be built like this, I'd be interested to know when I would be able to do so.

@pvelissariou1

@SmithJos13 Hi Joseph, I don't have an exact answer to your question. My understanding is that at this point there is no implementation of CICE+ROMS+CDEPS in ufs-coastal at the application level, and there is no timeline set to do this. I know Hernan was working to configure a test case for ROMS+CICE (internal) on the Great Lakes, but I don't have any more info than that. @uturuncoglu will be back from his vacation at the end of this month to chime in. @saeed-moghimi-noaa, @janahaddad, it might be necessary to have a discussion on the ROMS+CICE+CDEPS implementation in ufs-coastal.

@SmithJos13

Thanks @pvelissariou1. Okay, that is unfortunate to hear about the CICE+ROMS+CDEPS coupling. I guess I will follow up when @uturuncoglu returns from vacation.

I have another question, @pvelissariou1: is there an application where I can build a CICE+ocn+CDEPS(DATM) configuration?

@pvelissariou1

@SmithJos13 Currently, CICE is not configured in ufs-coastal, although the model itself is included in ufs-coastal. We had some urgent priorities with the ocean components, and the CICE implementation was pushed back. I don't know how difficult it will be to configure and use ROMS(+internal CICE)+CDEPS; it has been quite a few years since I used ROMS. I believe that this configuration can be done in ufs-coastal. For example, we do have configurations of SCHISM(+internal waves) forced by CDEPS atmospheric data in ufs-coastal.

@SmithJos13

@pvelissariou1 Thanks for adding some clarification about what is going on on your end. The idea of ROMS(+internal CICE)+CDEPS sounds like the next best option for coupling in the way that I want to (depending on the current development status). I thought that ROMS+CDEPS was already configured in ufs-coastal? Are there some tweaks that would need to be made to the NUOPC cap to get ROMS(+internal CICE)+CDEPS working? (I'm not an expert in NUOPC coupling, so maybe there is a major thing I am overlooking here.)

@hga007

hga007 commented Jun 19, 2024

@SmithJos13: ROMS is already configured for CDEPS and CMEPS within ufs-coastal. The issue is that CICE requires importing data from atmosphere and ocean components and, as far as I know, there is a lack of infrastructure to provide such fields other than through coupling. To make things easier, you need congruent grids between ROMS and CICE. Otherwise, you will have to introduce data from another ocean model to provide CICE data at the points not covered by ROMS. Even if we have NUOPC modules for all the coupled components, we must have a coupling strategy between the different grids and use the DATA component (CDEPS) if required.

You could start simple for your desired application:

  • DATA-CICE coupling with CMEPS or CDEPS. The atmosphere and ocean forcing fields are from the DATA component.
  • DATA-ROMS coupling with CMEPS or CDEPS. The atmospheric forcing comes from the DATA component. Although ROMS has its input-forcing infrastructure, using the DATA component is a good idea because an actual atmospheric model can substitute it.
  • DATA-CICE-ROMS coupling with CMEPS or CDEPS. CICE and ROMS get the atmospheric forcing from the DATA component. CICE gets oceanic forcing from ROMS.

You need the CICE grid for your application for all these. I have never built one. Coupling is not trivial because it requires some knowledge of all components. Thus, we recommend starting from simple to complex.

@uturuncoglu

@hga007 @SmithJos13 JFYI, we have some capability to fill unmapped regions with the CMEPS mediator. Please check the following presentation: https://docs.google.com/presentation/d/1Tk76zlsRiT7_KMJiZsEJHNvlHciZJJBBhsJurRMxVy4/edit#slide=id.g2613ed2f8f8_0_0 This was initially used by the HAFS application, but it could easily be extended to ice coupling. Anyway, we need to discuss it with the Coastal team, since it seems that workflow integration has priority at this point. @janahaddad, maybe this could be a discussion item once I return next week.

@SmithJos13

I would like some clarification about the development status of the CICE implementation so that we are all on the same page here:

  1. Which configurations with CICE have been implemented and need testing?
  2. Is there CICE+CDEPS/CMEPS capability?
  3. Is there ROMS+CDEPS/CMEPS capability?
  4. Is there a project where this progress is tracked?

I understand things are extremely busy and other aspects of the project are more pressing, and I appreciate any answers people are able to provide on this.

@uturuncoglu

@SmithJos13 I'll try to answer as much as I can, in order:

  1. At this point, there is no CICE configuration related to the coastal application, and all the cases implemented in the UFS Weather Model are global configurations.
  2. As far as I know, there is no CICE configuration (or RT) forced with a CDEPS data atmosphere. There are global configurations under the UFS Weather Model that use CMEPS, and as far as I know there are also some cases under CESM that use CDEPS. So, I am not expecting any issue in using CDEPS and CMEPS along with the CICE found under the UFS Weather Model. There is also an ongoing PR that brings in CICE+WW3 coupling (but still on a global domain); you can see it here: Add regression testing for wave-sea ice coupling ufs-community/ufs-weather-model#2072. BTW, it is worth keeping in mind that ROMS also has its internal CICE, which has a completely different cap (not tested under the UFS Weather Model). So, our aim is to use the CICE found under the UFS Weather Model (also shared with NCAR's CESM). If some capability (i.e., regional coupling, additional exchange fields, etc.) is missing there but exists in ROMS/CICE, it needs to be ported to the CICE version found in the UFS Weather Model.
  3. As I said, ROMS has a CICE cap in it, but I am not sure which CICE version is compatible with it. Again, even if it works outside of the UFS Weather Model (or the UFS Coastal App), it has not been tested in the UFS framework.
  4. I am not aware of any project but maybe others could give more information about it.

I hope it helps.

@janahaddad

As discussed today, we'll close this issue and continue ROMS+CICE progress in #4.

The github-project-automation bot moved this from Backlog to Done in the ufs-coastal project on Sep 23, 2024