GFDL_atmos_cubed_sphere

Build dev/emc branch with no physics

Open climbfuji opened this issue 2 years ago • 7 comments

Description

JEDI needs a version of the dycore that matches what the UFS uses, but without the nonlinear physics and ideally with no dependencies other than FMS (and its own dependencies).

This draft PR - currently work in progress - introduces a NO_PHYS CMake option for the dev/emc branch to achieve that. It builds on the existing preprocessor macros (ok, it makes them a little more complicated, but not a lot) and reuses some logic that already exists in dev/gfdl (e.g. the local definition of cappa, dtdt_m and dp1 in fv_dynamics.F90).
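
For illustration, here is a minimal sketch of the kind of guard this adds. The macro name NO_PHYS and the array names cappa, dtdt_m and dp1 are the ones discussed above; the subroutine itself is a simplified placeholder, not the actual fv_dynamics.F90 code.

```fortran
! Sketch only: a NO_PHYS build declares the physics-related work arrays
! locally (as dev/gfdl does) instead of receiving them from a physics driver.
subroutine fv_dynamics_sketch(is, ie, js, je, npz)
  implicit none
  integer, intent(in) :: is, ie, js, je, npz
#ifdef NO_PHYS
  ! No physics compiled in: these arrays have no external owner,
  ! so they live only inside the dynamics routine.
  real, allocatable :: cappa(:,:,:), dtdt_m(:,:,:), dp1(:,:,:)
  allocate(cappa(is:ie, js:je, npz))
  allocate(dtdt_m(is:ie, js:je, npz))
  allocate(dp1(is:ie, js:je, npz))
#endif
  ! ... dynamics time stepping elided ...
end subroutine fv_dynamics_sketch
```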

Fixes # (no issue has been created yet)

How Has This Been Tested?

  • I built GFDL_atmos_cubed_sphere standalone, using cmake -DCMAKE_BUILD_TYPE=Debug -DDEBUG=ON -DNO_PHYS=ON -DGFS_PHYS=OFF -DGFS_TYPES=OFF -DOPENMP=ON -DUSE_GFSL63=ON with just [email protected] (and its dependencies, including netcdf-fortran/netcdf-c) being loaded. Not sure which tests I can/should run.
  • I ran the ufs-weather-model regression tests on Hera with Intel for the control test (GFS v16, using GFDL microphysics through CCPP), and it passed against the existing baseline.
  • I also ran the ufs-weather-model regression tests on Cheyenne with Intel 19 for the control_c48 test (GFS v16, using GFDL microphysics through CCPP), and it passed against the existing baseline. Note that many of these complicated #ifdef statements were needed because of a bug in older Intel compilers related to how pointers are handled in OpenMP directives (Intel fixed this in later versions, 2021.2.0+ of the oneAPI compilers, which is what is used on Hera); a sketch of the pattern follows this list.
  • @danholdaway is going to test this branch without physics in JEDI.
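
For reference, a sketch of the Intel/OpenMP pattern mentioned in the testing notes above. The macro name OLD_INTEL_OMP_BUG and the variables are purely illustrative; the real code in fv_dynamics.F90 guards its actual !$OMP directives with the existing physics macros.

```fortran
! Sketch only: older Intel compilers mishandled Fortran pointers in OpenMP
! data-sharing clauses, so the directive is duplicated under a preprocessor
! guard and the pointer is listed differently in each variant.
subroutine omp_pointer_sketch(n, work)
  implicit none
  integer, intent(in) :: n
  real, target, intent(inout) :: work(n)
  real, pointer :: p(:)
  integer :: k
  p => work
#ifdef OLD_INTEL_OMP_BUG
!$OMP parallel do default(none) shared(n, p)
#else
!$OMP parallel do default(none) shared(n) firstprivate(p)
#endif
  do k = 1, n
    p(k) = 0.0
  end do
end subroutine omp_pointer_sketch
```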

Thoughts:

  • Seeing that there are now three different options for physics (none, GFS, GFDL), the old idea of creating a generic API for physics in the dycore with different backends is coming back to me. @bensonr What are your thoughts?
  • At some point, we should give up on supporting older versions of the Intel compiler. Then we can remove a lot of the complicated #ifdef statements. But when can we reasonably request that everyone uses [email protected] or later?

Checklist:

Please check all items, whether they apply or not

  • [x] My code follows the style guidelines of this project
  • [x] I have performed a self-review of my own code
  • [ ] I have commented my code, particularly in hard-to-understand areas
  • [ ] I have made corresponding changes to the documentation
  • [x] My changes generate no new warnings
  • [x] Any dependent changes have been merged and published in downstream modules (there are none)

climbfuji avatar Dec 06 '22 16:12 climbfuji

Hi, Dom. Thank you for filing this request.

Could we have some more information about why this change is necessary? While idealized test cases have their uses (especially for development and regression testing), they tend to be a distraction. Often the idealized tests become an end unto themselves, and a lot of attention and effort goes into configurations that have little benefit for real weather or climate simulation.

lharris4 avatar Dec 08 '22 14:12 lharris4

Thanks for considering this @lharris4. Within the fv3-jedi system we have a need to create the cubed sphere grid and use the remapping tools from FV3 for training the background error model. This introduces a dependency on FV3. We would like to have a 'skinny' build so that we can easily run things in CI and provide a simple environment for users running data assimilation applications that do not depend on the entire forecast model. Once the dependency on CCPP came into FV3, the build system became much heavier for these use cases, so we wound up forking and living with an old version of FV3 without CCPP as a dependency, which has led to its own problems.

danholdaway avatar Dec 08 '22 14:12 danholdaway

@danholdaway - as the term FV3 has become significantly blurred, can you clarify whether you are referring to the dycore or the atmospheric system? Additionally, what are the grid generation and remapping tools from FV3?

bensonr avatar Dec 08 '22 16:12 bensonr

Hi, @danholdaway. Thank you for your explanation. You are doing really neat work with the FV3 JEDI system.

Given that FV3 (a dynamical core and not a model) supports a number of models that do not use CCPP, perhaps a simpler way would be to extend use of the existing -DGFS_PHYS flag, so that only when it is defined do we compile in any of the CCPP-related code. We could even replace the -DGFS_PHYS flag with a -DCCPP flag.
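
As a hedged illustration of that suggestion (all names below are placeholders, not existing code), the CCPP-specific pieces would simply sit behind a single macro:

```fortran
! Sketch only: code previously guarded by GFS_PHYS is compiled solely
! when a (proposed) -DCCPP flag defines the CCPP macro.
module ccpp_guard_sketch
  implicit none
contains
  subroutine apply_physics_tendency(nx, dt, t, dtdt)
    integer, intent(in)    :: nx
    real,    intent(in)    :: dt
    real,    intent(inout) :: t(nx)
    real,    intent(in)    :: dtdt(nx)
#ifdef CCPP
    ! Only present in CCPP builds; a non-CCPP dycore skips this entirely.
    t = t + dt * dtdt
#endif
  end subroutine apply_physics_tendency
end module ccpp_guard_sketch
```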

Also, if the grid generation and vertical remapping are the only things that fv3-jedi needs, it may be preferable to pull out those source files and create wrappers for them. These codes are stable and have changed little in the last five years; the biggest change was the introduction of the multiple and telescoping grids.

Thanks, Lucas

lharris4 avatar Dec 08 '22 16:12 lharris4

Also: we already have a solo_core functionality in FV3, allowing FV3 to be run without coupling to a comprehensive physics package. Since this uses the same driver interface as do SHiELD and UFSatm (and AM4), it should be easily slottable into the existing NUOPC cap, without needing to add new code or re-write this functionality. The solo_core includes simple physics, but these can easily be disabled at runtime so the dynamics runs entirely adiabatically.

lharris4 avatar Dec 08 '22 16:12 lharris4

@lharris4 @bensonr I am really interested in this discussion. I mentioned earlier that a while ago @bensonr and I spoke about the idea of a generic backend for physics in the FV3 dycore (by FV3 dycore I mean the GFDL_atmos_cubed_sphere code), with GFDL physics, CCPP physics, or no/stub physics as the choices behind it. I think that this would be a lot cleaner and clearer. Do you think that would make the dev/gfdl and dev/emc branches similar enough to unify them? I'd be willing to work with you on that if it's of interest to you.

climbfuji avatar Dec 08 '22 16:12 climbfuji

@danholdaway - as the term FV3 has become significantly blurred, can you clarify whether you are referring to the dycore or the atmospheric system? Additionally, what are the grid generation and remapping tools from FV3?

@bensonr I was referring to the dycore (i.e. GFDL_atmos_cubed_sphere). From fv3-jedi we call fv_init to initialize the cubed sphere grid (along with the associated derived types). We also call a host of things from fv_grid_utils_mod, fv_arrays_mod and external_ic_mod in order to convert cold starts to warm starts using the remapping.
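
For context, a rough sketch of that call pattern. The module hosting fv_init and its argument list differ between branches, so everything below beyond the names mentioned above is an assumption, not the actual interface:

```fortran
program fv3jedi_grid_sketch
  ! Sketch only: mirrors the usage described above. The location and
  ! signature of fv_init are assumptions and vary between branches.
  use fv_arrays_mod,  only: fv_atmos_type
  use fv_control_mod, only: fv_init      ! assumed host module
  implicit none
  type(fv_atmos_type), allocatable :: Atm(:)
  logical, allocatable :: grids_on_this_pe(:)
  real    :: dt_atmos
  integer :: p_split
  dt_atmos = 900.0
  p_split  = 1
  call fv_init(Atm, dt_atmos, grids_on_this_pe, p_split)  ! assumed signature
  ! ... hand the initialized grid and derived types to the DA/remapping code ...
end program fv3jedi_grid_sketch
```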

danholdaway avatar Dec 09 '22 02:12 danholdaway

I pushed a couple of small additional changes that we need in order to get fv3-jedi running with this version of FV3. First, we need to make the fv3 source flag in external_ic public. This is because fv3-jedi calls the remapping schemes directly, and the flag needs to be set before calling them; in our case we cannot go through the parent routines for those tools.

Second, we have an issue when we run regionally and read the grid mosaic files. We can't have files hard-wired as INPUT/file.nc because, when running coupled, we can't distinguish the INPUT directory associated with the atmosphere from the one associated with the ocean. Instead we want to allow longer, more explicit paths, e.g. /path/that/i/choose/INPUT/grid_spec.nc. This shouldn't have any impact on folks running in the regular EMC/GFDL modes.
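
A minimal sketch of the first change (the module, variable name and layout below are simplified placeholders, not the actual external_ic.F90 source):

```fortran
! Sketch only: expose the data-source flag so an external caller such as
! fv3-jedi can set it before invoking the remapping routines directly.
module external_ic_sketch_mod
  implicit none
  private
  character(len=80), public :: source = ''   ! previously module-private
  public :: remap_columns_sketch
contains
  subroutine remap_columns_sketch()
    if (len_trim(source) == 0) then
      print *, 'set the source flag before calling the remapping'
      return
    end if
    ! ... vertical remapping elided ...
  end subroutine remap_columns_sketch
end module external_ic_sketch_mod
```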

Thanks again for considering these changes that we need to eliminate our dependency on forks of legacy gfdl code.

danholdaway avatar Jan 06 '23 22:01 danholdaway

@danholdaway I pulled the latest dev/emc into this branch and resolved the merge conflicts (the file moving_nest/bounding_box.F90 was deleted in dev/emc).

climbfuji avatar Jan 07 '23 00:01 climbfuji

@bensonr @lharris4 Based on yesterday's conversation, I am closing this PR. We agreed that we would use my branch as a temporary solution and add it to the JCSDA GFDL_atmos_cubed_sphere fork, so that we don't end up using code in a personal fork.

In the meantime, we'll be working on incorporating the generic physics interface that your team has been developing for GFDL_atmos_cubed_sphere main into the dev/emc branch, which should allow us to achieve the same thing as this PR (that is, compiling without CCPP physics and its dependencies).

Please correct me if this is not a correct summary of our conversation.

climbfuji avatar Feb 23 '23 00:02 climbfuji

Hi, Dom. This is correct. I think this is the cleanest solution moving forward. We also discussed cleaning up the #ifdefs, especially GFS_PHYS, and replacing them with more generic ones (AM4, CCPP, SOLO).

I will talk to Linjiong about the driver, and then report back to you.

Thanks, Lucas

lharris4 avatar Feb 23 '23 14:02 lharris4

Hi, all. Linjiong reminded me that a driver for the "intermediate physics", specialized for FV3 Integrated Physics, was part of the latest FV3 public release:

https://github.com/NOAA-GFDL/GFDL_atmos_cubed_sphere/blob/main/model/intermediate_phys.F90

A similar one can be built for CCPP, using appropriate ifdefs to ensure that a non-CCPP model doesn't try to compile it in. Please let us know if you need any help or advice building this.
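
As a rough sketch of what such a driver could look like (module and interface names below are placeholders, not the actual intermediate_phys.F90 API):

```fortran
#ifdef CCPP
! Sketch only: a CCPP counterpart to model/intermediate_phys.F90, wrapped
! in an #ifdef so that non-CCPP builds never try to compile it.
module ccpp_intermediate_phys_sketch_mod
  implicit none
  private
  public :: ccpp_intermediate_phys
contains
  subroutine ccpp_intermediate_phys(is, ie, js, je, npz, dt, t)
    integer, intent(in)    :: is, ie, js, je, npz
    real,    intent(in)    :: dt
    real,    intent(inout) :: t(is:ie, js:je, npz)
    ! A real implementation would call into the CCPP physics driver here;
    ! the body is intentionally left as a stub.
  end subroutine ccpp_intermediate_phys
end module ccpp_intermediate_phys_sketch_mod
#endif
```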

Thanks, Lucas

lharris4 avatar Feb 23 '23 14:02 lharris4