ROMS-CICE coupling for Bering Sea #5
Hi @uturuncoglu, please add the steps that you suggested in the meeting for @SmithJos13 to look at. Thanks,
@saeed-moghimi-noaa @SmithJos13 Thanks for creating this issue. Here are the items that we need to check:
@saeed-moghimi-noaa @uturuncoglu, Yes, I agree with Ufuk. It is much better to use the CICE component available under the UFS. I am unsure what version of CICE it is, but I assume it will be easier to update to CICE6. We want to add a test case for the CICE-ROMS coupling and the ROMS native sea-ice model. I already built the grid for Lake Erie.
@hga007 JFYI, it's CICE6 and compatible with CMEPS.
@uturuncoglu @pvelissariou1 @SmithJos13 Hi Ufuk, do you have any recommendations for Joe on where he can start compiling CICE plus the other components needed for the initial test? Thanks,
@saeed-moghimi-noaa At this point, we have no application under UFS-Coastal that builds CICE, CMEPS, CDEPS, and ROMS together. Let me create that. At least that will let us create an executable that can be used for different setups. I suggest the following steps: (1) start with DATM+CICE first (I also need to update CMEPS to allow this coupling), (2) then include ROMS, like DATM+CICE+ROMS but without any connection between CICE and ROMS, and (3) create the connection between CICE and ROMS. This allows us to build the configuration from simple to more advanced, see the missing pieces, and fill them in.
Hi @uturuncoglu, I have started trying to get the most recent version of the ESMF library compiled on a local machine that we have at OSU, and I have been running into some issues. The system I'm trying to compile it on is an Intel machine running Linux. The Fortran/C++/C compilers I am working with are Intel v19.1.0 (they are all from the same distribution), and I'm trying to compile ESMF using OpenMPI 3.1.3. I have set the ESMF_DIR variable to the directory where I downloaded the git repository, and I have set ESMF_OS=linux, ESMF_COMPILER=intel, and ESMF_COMM=openmpi. I have set the paths to all my compilers under the openmpi heading in '~/build_config/Linux.intel.default/build_rules.mk'. The issue I have been having is that the makefile does not seem able to locate an object file called 'binary.o'. The error is: 'g++: error: minimal: No such file or directory'. I have checked in the folder and there is no file by that name, which I suppose makes sense given the error I am getting. I'm wondering if there is another variable I need to set or enable when trying to compile with OpenMPI? Maybe something isn't getting copied that should be? I'd appreciate any suggestions you have on how I can fix this. Since I was having some issues with building the OpenMPI version of the library, I tried to compile the mpiuni version as a test case to make sure all my compilers were in order, and I was able to build that successfully. However, I ran into some errors during the verification step of the installation. When I run 'make ESMF_TESTEXHAUSTIVE=ON unit_tests_uni' I get the following error: /home/server/pi/homes/smithj28/esmf/src/Infrastructure/FieldBundle/src/ESMF_FieldBundle.F90:92.46:
I'll need to look into this one a bit further to determine what is going on here. Maybe you have some ideas as well? I look forward to hearing what you think might be going on. Hopefully I didn't bombard you with too many questions... Thanks,
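For reference, a from-source ESMF build on a setup like the one described above usually comes down to a handful of environment variables plus the standard make targets. The sketch below is illustrative only; the paths, job count, and install prefix are placeholders rather than values taken from this thread:

```sh
# Hedged sketch of a typical from-source ESMF build; adjust paths and versions for your machine.
export ESMF_DIR=$HOME/esmf                   # placeholder: top of the cloned ESMF source tree
export ESMF_COMPILER=intel                   # compiler family (Fortran/C/C++)
export ESMF_COMM=openmpi                     # MPI flavor; use mpiuni for the serial build
export ESMF_INSTALL_PREFIX=$HOME/opt/esmf    # placeholder install location

make -j 8                                    # build the library
make ESMF_TESTEXHAUSTIVE=ON unit_tests_uni   # serial unit tests, as run in the comment above
make install                                 # installs the library and generates esmf.mk
```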
@SmithJos13, @uturuncoglu: In the past, we had a lot of issues when ifort and g++ (or gcc) were combined. We needed to get the appropriate versions. Do you have access to icc? Also, OpenMPI complicates matters. Nowadays, we prefer spack-stack, but it takes experience to install it.
@SmithJos13 If possible, use Intel > 20. As @hga007 mentioned, we had similar problems using Intel 19.x.x.x versions on certain HPC clusters, both during compilation and at run time.
@SmithJos13 You might also want to check the following web page: https://oceanmodeling.github.io/ufs-coastal-app/versions/main/html/porting.html It mainly explains how to install spack-stack and use it to run UFS Coastal. It would be a good starting point for porting the model to your platform. If you try it and run into issues, let me know.
Hi @SmithJos13, would you please update us here on where you are with the ROMS-CICE coupling? Thanks,
Hi Saeed, Joey is away this week, so I will try to answer. We have not had success building the NUOPC coupling layer on our local compute server. The rather dated OS on this server precludes installation of proper versions of some prerequisites for NUOPC. We are currently splitting the server in two so that the OS can be upgraded (this week) on one of its boxes. Then current versions of all the dependencies can be installed. We expect to be able to use the suggested spack-stack approach to get everything built. Our computer support staff have already looked into that portion of the process and built components successfully. We've asked about access to NOAA resources where NUOPC is already built to speed this process, but have not heard back yet. Running the coupled ROMS-CICE for the Bering Sea application will very likely require more resources than the single box of our compute server can handle, so I think getting that access will be important either way. We have a standalone version of CICE 'fully optimized' for the Bering Sea application, and it is producing quite good results (when forced from above with ERA5 reanalysis and from below with ROMS Bering Sea model output). I suspect that running a coupled ROMS-CICE using the MCT coupler in COAWST would be straightforward, as we have COAWST, with that old style of coupling, already up and running on a NASA machine that we have access to. But I understand that NUOPC is a central piece of what we're trying to test here. I hope this helps. Please let me know if you have any other questions. Scott
Hi Scott, Thanks for the update. I have requested HPC access for you and followed up on that several times. I will try again today. Thanks,
Hi @uturuncoglu, Our system has finally been upgraded here at OSU, which means I have been able to try installing the UFS model dependencies with Docker + spack-stack. I've followed the directions as they are laid out in the link that you sent (https://oceanmodeling.github.io/ufs-coastal-app/versions/main/html/porting.html); however, these seem to build the UFS model with GCC rather than the Intel compilers. I was under the impression that we wanted to build the ufs-model with Intel compilers (maybe I misunderstood). Will I be able to adapt this method to Intel compilers if we want to, or are there already instructions for that? Anyway, I have followed the directions as they are laid out and was unsuccessful in building the model. I've run into an issue with building the dependencies; see the following two screenshots for the exact error: Do you have any idea what the issue might be? Also, I think the following line in the directions might have a typo. In section 3, the note about the proj error and how to fix it has a line that reads 'git cherry-pick upsream 149d194 (and resolve conflicts)'. I think it should be 'git cherry-pick 149d194 (and resolve conflicts)'; the terminal didn't seem too happy about 'upsream' being there. Thanks,
Hi @SmithJos13 Joseph, for the argument mismatch in gcc/gfortran version >= 10 you need to pass the flags "-fallow-argument-mismatch -fallow-invalid-boz" to the compiler, until the offending Fortran code is fixed. I don't know why you need to build mapl anyway.
Thanks for the quick reply! Yeah, I'm not sure about that either; I think it might be a dependency of a dependency or something like that. So is there a particular place in spack where I need to feed this argument?
@SmithJos13 I went quickly through the spack-stack code tree and found that other packages had the same issue. You might need to create a patch for mapl. If you look at spack/var/spack/repos/builtin/packages/parallelio/gfortran.patch you might get a clue on how to incorporate the patch. I cannot help more; I haven't worked with spack/spack-stack yet and I don't know many details. @uturuncoglu for sure can help more.
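As one possible alternative to a per-package patch, Spack can also attach extra compiler flags through compilers.yaml. This is only a rough sketch of that mechanism; the compiler spec is a placeholder, and whether it resolves the mapl failure seen here is untested:

```sh
# Hedged sketch: feeding the gfortran >= 10 workaround flags to Spack via compilers.yaml.
# The compiler spec below is a placeholder; check `spack compiler list` for the real one.
spack config edit compilers
#   compilers:
#   - compiler:
#       spec: gcc@12.2.0
#       ...
#       flags:
#         fflags: -fallow-argument-mismatch -fallow-invalid-boz
```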
I always work with ifort. For ROMS-CICE coupling, we need to enhance the export/import states to include sea-ice field exchange in the ROMS standalone NUOPC cap module for the UFS (
@hga007 Hi Hernan, I was trying to build ROMS and the other models in UFS-Coastal and I encountered the following error (a name conflict in libraries):
I don't know if you have seen this before, but it might be a good idea to rename "SUBROUTINE abort" in ./Utility/abort.F to something like roms_abort to avoid conflicts? |
@pvelissariou1 So I was looking at the file you pointed me towards, and it seems like some of the flags you listed are already incorporated in that file. I've tried adding "-fallow-invalid-boz" to the other flags already, and I'm still running into the same issue. Is there some particular modification I need to make to this file to get it to recognize these flags? (I know you just said that you haven't worked with spack-stack before, so maybe @uturuncoglu can shed some light on this.)
@SmithJos13 First of all, I have not tested a custom spack-stack installation with the Intel compiler yet. I have a version of a Dockerfile that creates the Docker image and is able to run RTs, but this is again with GNU. I could push that file to the ufs-coastal-app repo so you could see the commands I am using, but installing on an existing system could require some changes. If I remember correctly, I also have a custom spack-stack installation that works on Orion. But again, it is hard to automate everything and make it work on every custom platform; there will always be system issues. I could also try to extract the information from the Dockerfile and create a new dependency installation script as far as possible, but I need to work on it.
@SmithJos13 BTW, spack-stack 1.5.1 is working for me (this will have ESMF 8.5.0). They are also planning to release 1.6.0 to fix some issues, but I am not sure of its timeline.
For us, spack-stack 1.5.1 with JEDI and UFS support works well on my Linux box using Intel 2021.8.0 and ESMF 8.5.0. In our JEDI meeting today, they started talking about spack-stack 1.6.0. Dave Robertson is working to install 1.5.1 on our other computers, which is not trivial, and he has lots of experience. I hope that spack-stack will be more stable in the future and require less frequent updates.
@hga007 Maybe I could try to create a script to automate the installation. Then you could try it on your side with the help of Dave Robertson. If we need some change, we could incorporate it and update the script. Do you think this can be done? I know the script might always need manual intervention for a specific GNU or Intel version, but we could do our best.
Thanks @uturuncoglu @hga007. I suppose I'll try to work on installing the 1.5.1 version of spack-stack. I'm a little new to using GitHub... so if I wanted to do that, do I just have to check out the 1.5.1 version of spack-stack, run "git submodule update --init --recursive", and then proceed with the install instructions?
@SmithJos13 Yes. Let me know if you need help.
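For reference, the checkout sequence being discussed would look roughly like the following. The exact name of the 1.5.1 release tag is an assumption here, so it is worth listing the tags first:

```sh
# Hedged sketch of checking out spack-stack 1.5.1 with its submodules.
git clone https://github.com/JCSDA/spack-stack.git
cd spack-stack
git tag --list                       # confirm the exact name of the 1.5.1 release tag
git checkout spack-stack-1.5.1       # assumed tag name; use whatever `git tag` reports
git submodule update --init --recursive
source setup.sh                      # spack-stack's environment setup script
```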
@SmithJos13 The installation process described at https://oceanmodeling.github.io/ufs-coastal-app/versions/main/html/porting.html is not up to date and has a couple of minor typos. So please stick to the Dockerfile approach. In that case I am using Lmod.
@uturuncoglu Okay, I won't look at the instructions at that link. Setting aside the tcl vs. Lmod issue, I think I have installed the dependencies using spack-stack properly. Running the following command
which appears to be a complete list of all the dependencies that I need in order to start porting the UFS model. The following command
which is successful. Then I'm trying to load the module files using environment modules [version:
which hasn't changed since I installed Environment Modules in Docker. I'm struggling to figure out how to load the modulefiles that I have created using spack so that I can use Environment Modules inside Docker. Are there further steps I need to take to load the modules? Do I need to use spack to load the modules? Further, are there any special instructions I need to follow in order to port the UFS model to my local Linux box, or should I be able to follow the steps as listed on the following site: https://ufs-weather-model.readthedocs.io/en/latest/BuildingAndRunning.html Thanks for all your help!
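On the module-loading question, a typical sequence with a spack-stack environment looks roughly like the sketch below. The environment name, install path, and module versions are placeholders, not values from this thread:

```sh
# Hedged sketch: exposing spack-stack's generated metamodules to Environment Modules.
module use /path/to/spack-stack/envs/ufs-env/install/modulefiles/Core   # placeholder path to the generated metamodules
module avail                                                            # the stack-* metamodules should now be listed
module load stack-gcc/12.2.0                                            # placeholder versions; match what `module avail` shows
module load stack-openmpi/4.1.5
module load stack-python/3.10.8
```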
@SmithJos13 That is great. It is really good progress. Since this is a custom machine, you need to create a set of module files. Are you planning to run the regression tests too? If so, you also need to make a couple of changes to run them. There are some commits I did for the AMS short course, related to the land component, for running RTs under a Docker container, at the following link: https://github.com/uturuncoglu/ufs-weather-model/commits/feature/ams_course/ Please look at the commits starting with Dec. 5. They could give you some guidance. If you cannot make it work, then we could have a call and try together.
@rjdave I was looking at your instructions and I realized that you are not using the system-provided MPI. Am I wrong? It seems that you are building OpenMPI with the Intel compiler (system-provided module). If so, these instructions will also need to include an additional step to add the system-provided MPI to the spack externals YAML file. Anyway, I am very close to finalizing the script for dependency installation. Once I have tested it with the existing platforms, I'll let you know.
Dave lost electricity at his house because of yesterday's storm, so he doesn't have internet. Yes, we use the OpenMPI versions on our computers to be consistent with all the other libraries that depend on them. When updating spack-stack, I believe he first tries to update the needed modules. We did that recently with atlas 0.35.0. I assume that he does the same with OpenMPI. Please remember that MPI libraries are usually tuned to the particular architecture; therefore, we shouldn't build them from scratch.
So does everything look like it was successful from the information that I provided? Also, do you have any examples of how these module files were created for other environments that I could work from? If there are any resources you recommend I look at for doing this, I'd appreciate it. I think once these module files are built, I can proceed with trying to install the UFS model! I'm a fairly novice Unix user, so I've been learning a lot going through this process (hence all the questions). Thanks!
@SmithJos13 Here is an example that I am using under a Docker container: https://github.com/uturuncoglu/ufs-weather-model/blob/feature/ams_course/modulefiles/ufs_local.gnu.lua You could create a new one for your system. In my case, I am running regression tests under Docker (it has Slurm and also a spack-stack installation) like the following,
Of course, this is for a different configuration, not related to the coastal app, and it also uses ... You might be able to compile one of the coastal-specific configurations like the following (if you name your module file like ...): ./compile.sh "ufs_local " "-DAPP=CSTLF -DCOORDINATE_TYPE=SPHERICAL -DWET_DRY=ON" coastal gnu NO NO @pvelissariou1 @saeed-moghimi-noaa I am not sure, but we might want to raise an issue with the ufs-weather-model developers to have a more flexible RT system that allows bringing in new machines. Anyway, we could discuss it more in our next meeting.
@uturuncoglu Thanks for the information. Then I sourced the following
So Environment Modules is able to recognize the module files supplied by ufs-weather-model. Then, trying
I suspect the issue is that I need to adapt the module file a little more for my machine, but I'm not really sure what needs to be modified in
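When adapting a machine module file such as ufs_local.gnu.lua, a few quick checks can reveal what it expects versus what the local stack actually provides. The paths below are placeholders:

```sh
# Hedged sketch: inspecting what the UFS module file expects on a new machine.
module use /path/to/ufs-weather-model/modulefiles   # placeholder: where ufs_local.gnu.lua lives
module show ufs_local.gnu                           # shows the module-use paths and stack modules it tries to load
module avail stack                                  # compare against what the local spack-stack install provides
```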
Since I have only been building spack-stack on single-node machines, I have been allowing spack-stack to compile OpenMPI. I am currently working on getting spack-stack set up on our university cluster, so I will experiment with the necessary steps to use system-provided MPI.
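For the system-provided MPI case mentioned above, the usual route is to register it as a Spack external so it is not rebuilt. This is a rough sketch only; the prefix, versions, and compiler spec are placeholders:

```sh
# Hedged sketch: declaring a system OpenMPI as an external package in Spack.
spack external find openmpi          # may auto-detect it if mpicc and friends are on PATH
spack config edit packages           # or add the entry by hand, roughly:
#   packages:
#     openmpi:
#       externals:
#       - spec: openmpi@4.1.4%intel@2021.8.0
#         prefix: /opt/openmpi-4.1.4
#       buildable: false
```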
@pvelissariou1 @janahaddad @saeed-moghimi-noaa I am planning to move this to the ROMS repo since it is related to ROMS. I'll also open a new ticket for ROMS-CICE coupling in general.
@uturuncoglu Yes, it is a good idea to separate all of these. If we keep "ROMS-CICE coupling for Bering Sea", then this issue/discussion should apply to this specific project only. I suggest populating "Implementation of ROMS+CICE coupling" instead. We have the same issue with some of our GitHub issues, where they are polluted with unrelated discussions. @janahaddad and I have already talked about this; let's see what the best strategy will be to address it.
Okay. I transferred this to the ROMS repo.
Hi @uturuncoglu, I was wondering where the progress is on developing a configuration for the ufs-coastal model. I have been able to download and build ufs-coastal on Orion, but I don't see an option to build a coupled CICE+ROMS+CDEPS configuration only. I was wondering if there is a hidden option or command that I need to use? Thanks,
@SmithJos13 Unfortunately, there is not a
@pvelissariou1 I think that there might be a misunderstanding here. I was under the impression that CICE+ROMS+CDEPS had been built/implemented but hadn't been tested. I'm not really looking for a test case; instead, I'd like to build the model in this configuration and start testing it in the Bering Sea. When I have attempted this in the past, there was no preconfigured app that would allow me to build the model in this configuration. If the model is not ready to be built like this, I'd be interested to know when I would be able to do so.
@SmithJos13 Hi Joseph, I don't have an exact answer to your question. My understanding is that at this point there is no implementation of CICE+ROMS+CDEPS in ufs-coastal at the application level, and there is no timeline set to do this. I know Hernan was working to configure a test case for ROMS+CICE (internal) on the Great Lakes, but I don't have any more info than that. @uturuncoglu will be back from his vacation at the end of this month to chime in. @saeed-moghimi-noaa, @janahaddad, it might be necessary to have a discussion on the ROMS+CICE+CDEPS implementation in ufs-coastal.
Thanks @pvelissariou1. Okay, that is unfortunate to hear about the CICE+ROMS+CDEPS coupling. I guess I will follow up when @uturuncoglu returns from vacation. I have another question, @pvelissariou1: are there applications where I can build a CICE+ocn+CDEPS(DATM) configuration?
@SmithJos13 Currently CICE is not configured in ufs-coastal, although the model itself is included in ufs-coastal. Somehow we had some urgent priorities with the ocean components, and the CICE implementation was pushed back. I don't know how difficult it will be to configure and use ROMS(+internal CICE)+CDEPS; it has been quite a few years since I used ROMS. I believe that this configuration can be done in ufs-coastal. For example, we do have configurations of SCHISM(+internal WAVES) forced by CDEPS atmospheric data in ufs-coastal.
@pvelissariou1 Thanks for adding some clarification about what is going on on your end. The idea of ROMS(+internal CICE)+CDEPS sounds like the next best option for coupling in the way that I want to (depending on the current development status). I thought that ROMS+CDEPS was already configured in ufs-coastal? Are there some tweaks that would need to be made to the NUOPC cap to get ROMS(+internal CICE)+CDEPS working? (I'm not an expert in NUOPC coupling, so maybe there is a major thing that I am overlooking here.)
@SmithJos13: ROMS is already configured for CDEPS and CMEPS within ufs-coastal. The issue is that CICE requires importing data from the atmosphere and ocean components and, as far as I know, there is a lack of infrastructure to provide such fields other than through coupling. To make things easier, you need congruent grids between ROMS and CICE. Otherwise, you will have to introduce data from another ocean model to provide CICE data at the points not covered by ROMS. Even if we have NUOPC modules for all the coupled components, we must have a coupling strategy between the different grids and use the DATA component (CDEPS) if required. You could start simple for your desired application:
You need the CICE grid for your application for all of these. I have never built one. Coupling is not trivial because it requires some knowledge of all the components. Thus, we recommend starting from simple and moving to complex.
@hga007 @SmithJos13 JFYI, we have some capability to fill unmapped regions with the CMEPS mediator. Please check the following presentation: https://docs.google.com/presentation/d/1Tk76zlsRiT7_KMJiZsEJHNvlHciZJJBBhsJurRMxVy4/edit#slide=id.g2613ed2f8f8_0_0 This was initially used by the HAFS application, but it could easily be extended to ice coupling. Anyway, we need to discuss it with the Coastal team, since it seems that workflow integration has priority at this point. @janahaddad maybe this could be a discussion item once I return next week.
I would like some clarification about the progress on CICE development/implementation so that we are all on the same page here:
I understand things are extremely busy and other aspects of the project are more pressing, and I appreciate any answers people are able to provide on this.
@SmithJos13 I'll try to answer as much as I can, in order,
I hope it helps.
As discussed today, we'll close this issue and continue the ROMS+CICE progress in #4.
@SmithJos13, @sdurski, @kurapov-noaa, @uturuncoglu, @pvelissariou1, @hga007
Description
OSU and OCS are working together towards coupling CICE and ROMS using the UFS-Coastal code base.
Required configurations