CRTM versions used in the global-workflow #1453
Comments
Yes, @emilyhcliu, you are right. When we clone and build, we need to update the modulefiles accordingly.
@emilyhcliu The workflow level defines/forces the module/library versions (via the version files). It would be good to get these aligned. The global-workflow issue #1356 will be documenting the updates for the upcoming ops GSI and obsproc upgrade and will be looking for a new tag from the GSI for that. Other than that, this seems like it should be a GSI issue and not a global-workflow issue. If you agree, please close this and open an issue in the GSI repo to resolve it. Thanks!
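For illustration, here is a minimal sketch of how that workflow-level forcing can look in an Lmod Lua modulefile (in the style of module_base.hera.lua). The stack path, meta-module versions, and the `crtm_ver` variable below are assumptions for the example, not the actual global-workflow settings:

```lua
-- Sketch only: a workflow-level modulefile that pins library versions so
-- every component build inherits them. Path and versions are placeholders.
help([[Build environment for global-workflow components (illustrative)]])

-- Put the chosen hpc-stack's modulefiles first on MODULEPATH.
prepend_path("MODULEPATH", "/path/to/hpc-stack/modulefiles/stack")

-- hpc-stack meta-modules select the compiler/MPI pairing (versions assumed).
load("hpc/1.1.0")
load("hpc-intel/18.0.5.274")
load("hpc-impi/2018.0.4")

-- If a workflow version file exports crtm_ver, pick it up here so the same
-- CRTM is forced for all components; otherwise fall back to a default.
local crtm_ver = os.getenv("crtm_ver") or "2.4.0"
load(pathJoin("crtm", crtm_ver))
```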
@RussTreadon-NOAA I am going to change the modulefiles in release/gfsda.v16 from hpc-stack to hpc-stack-gfsv16, since hpc-stack-gfsv16 has the most up-to-date CRTM-2.4.0_emc (from the official CRTM site).
@Hang-Lei-NOAA @KateFriedman-NOAA @RussTreadon-NOAA Q: @Hang-Lei-NOAA updated the CRTM-2.4.0_emc (from the JCSDA official site) on April 10 for hpc-stack-gfsv16, so I am going to update the release/gfsda.v16 modulefiles to use hpc-stack-gfsv16. However, I am not sure whether the other packages installed in hpc-stack-gfsv16 are good. Any comments?
@emilyhcliu As I mentioned previously in the email, /scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/ was passed to EPIC for continued installation to support the community. It is frozen and no longer used for service, so any old usage of it should transfer to the EPIC installations. All library information is posted on the wiki: https://github.com/NOAA-EMC/hpc-stack/wiki. /scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack-gfsv16/ was originally installed for GFSv16 only. It used the same installation procedure as the one above, but only with the versions required by GFSv16, and it is continuously maintained by EMC. It should be fine.
I just had a discussion with @RussTreadon-NOAA about the hpc stacks on the HPC machines. The GSI develop version points to hpc-stack, while global-workflow/modulefiles points to hpc-stack-gfsv16.
I am not familiar with the stack management. Is your question related to running the …?
@emilyhcliu The current hpc-stack installs you're referring to are being moved away from. We will be using the following stacks moving forward:
1. The hpc-stack-gfsv16 install for the ops GFSv16 system (dev/gfs.v16 branch). If an update is needed to this hpc-stack install to support GFSv16, it should be done in these installs. I believe requests for that still go to @Hang-Lei-NOAA and the https://github.com/NOAA-EMC/hpc-stack repo.
2. EPIC-maintained hpc-stacks for the GFSv17 development system (develop branch), and then eventually spack-stack in the near future. See more info on the move to the EPIC hpc-stacks in #1311 and related component issues (e.g. ufs-community/ufs-weather-model#1465). Requests for changes to these EPIC hpc-stack installs also go through https://github.com/NOAA-EMC/hpc-stack.

The current hpc-stack installs will no longer be used soon and we shouldn't spend time updating them. The GSI needs to update to use the EPIC hpc-stack intel 2022 installs on supported platforms for now. When spack-stack is ready (it is being tested now) we will all move to those stacks. Note: the above applies only to the R&D platforms; WCOSS2 still has only the one production hpc-stack installation and is not yet on EPIC or spack-stack. Hope the above info helps!
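As a rough sketch of what "update to use the EPIC hpc-stack intel 2022 installs" could mean in a component modulefile (the install path and version numbers are placeholders; the real locations are listed in ufs-community/ufs-weather-model#1465):

```lua
-- Sketch only: point MODULEPATH at an EPIC-maintained hpc-stack install and
-- load the intel 2022 toolchain. All paths/versions here are placeholders.
prepend_path("MODULEPATH", "/path/to/epic/hpc-stack/modulefiles/stack")

load("hpc/1.2.0")
load("hpc-intel/2022.1.2")
load("hpc-impi/2022.1.2")

-- Library modules now resolve from the EPIC install, since its modulefiles
-- directory is first on MODULEPATH.
load("crtm/2.4.0")
```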
@emilyhcliu The EMC-maintained hpc-stack installation is frozen and has been passed to EPIC for further maintenance. The hpc-stack-gfsv16 install is for the ops GFSv16 system and is still actively maintained by EMC. All library information can be found on the wiki:
https://github.com/NOAA-EMC/hpc-stack/wiki
@KateFriedman-NOAA Thanks. This is helpful.
@KateFriedman-NOAA @Hang-Lei-NOAA @aerorahul Your explanation is helpful, clear, and thorough. I know how to modify GSI develop and the current release/gfsda.v16 to be consistent with the hpc-stack maintenance. @Hang-Lei-NOAA Thanks for helping me update the module files on Hera.
@RussTreadon-NOAA and @CatherineThomas-NOAA What do you think?
Kate states that the GSI develop branch should move to the EPIC-maintained hpc-stack installs. This aligns with EIB & EPIC stack management, right?
@KateFriedman-NOAA One question: Are the EPIC-maintained hpc-stacks still under development or ready for users?
@emilyhcliu The EPIC-maintained hpc-stacks are ready for use on the R&D platforms. If you find any issues in them or need to request changes/additions to the stacks, open an issue in the https://github.com/NOAA-EMC/hpc-stack repo and the EPIC folks will take care of fulfilling the request. Thus far they have been very helpful and responsive! The spack-stack installs are still under development for use by the GFS/global-workflow.
@KateFriedman-NOAA Got it! Thanks for your confirmation.
@RussTreadon-NOAA @CatherineThomas-NOAA
I thought we already ran some preliminary GFSv17 DA tests. Did these tests use the EPIC stacks? I don't know.
@RussTreadon-NOAA This seems correct to me! :) @RussTreadon-NOAA @emilyhcliu Related to all of this, one snag you are likely to hit is that there aren't any intel 2018 versions of the EPIC hpc-stacks on Hera or Orion. I believe there is an intel 2018 install on Jet that @DavidHuber-NOAA used for the Jet port we just wrapped up (see his GSI Jet port work for how to move to the EPIC hpc-stack). I'm not sure where the GSI is with moving to intel 2022 or whether it's possible to get EPIC-maintained intel 2018 hpc-stacks on Hera/Orion; you could ask them. I'm currently testing the full GFS with the EPIC hpc-stacks but have left the GSI (gsi_enkf.fd) as-is while loading intel 2022 at runtime. It's working thus far, and it looks like we could move the global-workflow to intel 2022 regardless of where the GSI stands on the intel version. I will be putting an update in issue #1311 soon; I am hoping to wrap up this work before I go on leave. I wanted to let you know of this parallel effort. FYI, the ufs-weather-model issue ufs-community/ufs-weather-model#1465 has one of the best lists of available EPIC hpc-stacks and how to load them on your system. I wanted to point that out again.
@RussTreadon-NOAA I am not aware of tests done with the EPIC stacks on your side. @CatherineThomas-NOAA Do you know of any?
@emilyhcliu @RussTreadon-NOAA I don't know of any DA tests intentionally testing the EPIC-maintained stacks.
Thanks @CatherineThomas-NOAA and @emilyhcliu for sharing where we are and where we are headed.
@RussTreadon-NOAA and @KateFriedman-NOAA Is it still the case that the GSI compiles with intel 2022 but has issues at runtime?
@emilyhcliu That's my understanding. The GSI will compile successfully with intel 2022, but there are issues at runtime after it has been built with intel 2022. Other GSI folks should reconfirm that issue; I only understand it from the workflow side of things. This issue is a current blocker for at least one major global-workflow task: adding version files into the v17 system, as was done for the v16 system, and aligning all components to the same library versions via the version files from the workflow level.
@emilyhcliu Yes, this is still the case. Regression tests all fail with Intel 2021+ when compiled with …
@KateFriedman-NOAA and @DavidHuber-NOAA Thanks for your explanations!
@emilyhcliu can this be closed or is there still an issue to resolve? Thanks!
OBE (overtaken by events).
Description
I spotted the following in the gfs.v16.3.5 global-workflow related to CRTM.
When building the global-workflow, I checked the GSI build log (build_gsi.log):
crtm/2.4.0 is loaded
crtm is used from the following hpc-stack:
the compiler is using the module files under the following directory:
However, in the global-workflow modulefiles directory, module_base.hera.lua points to the following hpc stack:

The GSI log file (gdasanal.log) from the parallel experiment output has the following information about CRTM:
It seems that we are using two different stacks in the same global-workflow: one is hpc-stack and the other is hpc-stack-gfsv16.

Checking the CRTM time stamps of the two stacks:
hpc-stack-gfsv16:
hpc-stack:
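To make the two-stack situation concrete, here is a hedged sketch of the kind of modulefile stanza involved. The stack root paths are the ones quoted elsewhere in this thread, but the modulefiles/stack subdirectory is an assumption:

```lua
-- Sketch only: the two candidate stack roots discussed in this issue.
-- Whichever of these prepend_path lines is active decides which crtm/2.4.0
-- the build and the runtime jobs actually use. The "modulefiles/stack"
-- suffix is an assumption, not a verified path.
-- prepend_path("MODULEPATH", "/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack/modulefiles/stack")
prepend_path("MODULEPATH", "/scratch2/NCEPDEV/nwprod/hpc-stack/libs/hpc-stack-gfsv16/modulefiles/stack")

load("crtm/2.4.0")  -- resolves from whichever stack directory is first on MODULEPATH
```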
I checked with EIB and confirmed that the CRTM source code for hpc-stack-gfsv16 is the following:
The CRTM_LifeCycle.f90 from /scratch2/NCEPDEV/nwprod/hpc-stack/src/develop/pkg/crtm-v2.4.0 has the following print statements commented out:
Please notice that the repeated print statements we saw in the GSI output log file are from lines 606-608.
These print statements were commented out for hpc-stack-gfsv16.
However, we still see the repeated print statements from lines 606-608 in our GSI log file after we recompiled the global-workflow with the updated hpc-stack-gfsv16:
This is the GSI log file after we recompiled the global-workflow with the updated hpc-stack-gfsv16 on April 10, 2023.
We should not see these print statements in the log file, since they were commented out in the source code.
So, I am wondering if this is because the GSI in the global-workflow points to hpc-stack (updated in June 2022), not hpc-stack-gfsv16 (updated on April 10, 2023).
Question: In the global-workflow, when the modules defined in each component (e.g. GSI, fv3gfs, post, etc.) differ from the ones defined in the global-workflow modulefiles, the latter are used to build the components. Am I correct?
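A sketch of the Lmod behavior behind this question may help; the paths are illustrative, and this only shows the module lookup order, not the exact global-workflow build wiring:

```lua
-- Sketch only: when two stacks both provide crtm/2.4.0, Lmod uses the first
-- match in MODULEPATH order. A later prepend_path puts its directory ahead
-- of earlier ones, so the modulefile sourced last by the build scripts
-- effectively forces which stack's libraries are used.
prepend_path("MODULEPATH", "/path/to/component-default/hpc-stack/modulefiles/stack")
prepend_path("MODULEPATH", "/path/to/workflow-chosen/hpc-stack-gfsv16/modulefiles/stack")  -- now first

load("crtm/2.4.0")  -- resolved from the workflow-chosen stack above
```

This is consistent with the earlier comment that the workflow level defines/forces the module/library versions via the version files, but the exact wiring in the global-workflow build scripts should be confirmed there.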
Requirements
Acceptance Criteria (Definition of Done)
Dependencies