
Ultra low resolution configuration for testing #2508

Open
danholdaway opened this issue Nov 21, 2024 · 22 comments
Labels
enhancement New feature or request

Comments

@danholdaway

Description

In order to test changes to the global-workflow system we have to run the cycled system. Currently this is done with a C48, 127-level atmosphere and a 5-degree ocean model configuration. However, this is not fast enough for testing and limits the number of pull requests that can be merged into global-workflow. Additionally, the ensemble and deterministic forecasts are both run at C48, meaning we are not replicating the dual-resolution setup used in production.

Note that the configuration does not have to produce anything scientifically sound; it just needs to run reliably in order to test the connections between tasks of the workflow.

Solution

  • Develop a C12, 32-level atmosphere with a 10-degree ocean configuration for the ensemble.
  • Develop a C18, 32-level atmosphere with a 5-degree ocean configuration for the deterministic forecast. If C18 is not a permissible configuration in FV3, then C24.

Note that the ocean can already be run at 5 degrees.

The data assimilation group at EMC will work on enabling these configurations with GSI and JEDI.

@danholdaway danholdaway added the enhancement New feature or request label Nov 21, 2024
@JessicaMeixner-NOAA
Collaborator

@danholdaway - Please let me know if ultra-low wave configurations are also required; I'm happy to help provide those parts. (It's also fine to turn off waves for various tests.)

@danholdaway
Author

Thank you @JessicaMeixner-NOAA. I think it would be useful to include waves in this effort. We aren't currently testing with waves turned on, but we should start to include that.

@junwang-noaa
Collaborator

@danholdaway Our team will look into atm/ocn/ice components for the test you requested. May I ask some background questions:

  1. For the atmosphere, is the choice of C12/C18/C24 (800km/500km/400km resolution) driven by running speed only, or are there other reasons? Do you have test cases with 32 vertical levels that we can use as a configuration reference? I am thinking that we may have to use a different atmospheric physics package at this coarse resolution from those used for GFS/GEFS/SFS, and the dycore needs to be hydrostatic, which is different from GFS/GEFS.
  2. For the ocean, do you know if there is a 10-degree supergrid available? I assume you are OK with a 5-degree ocean/ice if we can't find a 10-degree supergrid.
  3. How long do you expect these tests to run? I assume these tests only need to run 15 hrs with IAU. There might be some coupling configuration changes if a longer run time is expected. Thanks

@danholdaway
Author

Thanks for looking into this @junwang-noaa, we very much appreciate the help.

  1. I had a quick play around with JEDI and found that I couldn't create a C18 geometry with FV3, so that might be out of the question. The choice of C12/C24 is driven by running speed. For most of the PRs we are creating, we don't want to test what the model does but rather the connections between the tasks of the workflow. We need the model to run from one cycle to the next and output files for the DA to pick up. If we change the model configuration, e.g. the physics or non-hydrostatic -> hydrostatic, it should be fine, though we would need to be aware that we might be reading less from the files, e.g. no delz to pick up. We don't have any configuration for 32 levels yet. We thought about starting with the 127-level file and taking roughly every 4th level. Do you think that would work?

  2. @guillaumevernieres, do you know if there exist any 10-degree grids for MOM6? I think you mentioned you'd tested something at that resolution in SOCA.

  3. We don't need long forecasts with the coupled model, just long enough to cycle. I think that's just 12 hours with IAU.

@guillaumevernieres

@danholdaway, we do have a 10-deg grid for MOM6. Something would need to be done for CICE6 as well.

@DeniseWorthen
Collaborator

We create the CICE6 fix files from the MOM6 supergrid and mask. We'd need a MOM_input consistent w/ 10-deg MOM6 physics.

@junwang-noaa
Collaborator

@danholdaway Thanks for the information.
@guillaumevernieres Would you please share the 10deg mom6 super grid and run configuration with us?

@guillaumevernieres

> @danholdaway Thanks for the information. @guillaumevernieres Would you please share the 10deg mom6 super grid and run configuration with us?

Let me see what we have. We did that work ages ago and I don't remember if we went beyond being able to use it within soca/jedi.

@guillaumevernieres

@DeniseWorthen, @junwang-noaa, here's a link to the JCSDA soca repo with hopefully all the needed bits and pieces:
https://github.com/JCSDA/soca/tree/develop/test/Data/36x17x25

@DeniseWorthen
Collaborator

To utilize the cpld_gridgen utility, we need an ocean_hgrid.nc file, a topography file, and a mask file for MOM6. The grid_spec file you point to seems to already be at the target resolution (17x36x35).

I see that MOM_input is asking for the supergrid file GRID_FILE = "ocean_hgrid_small.nc" and the topography file ocean_topog_small.nc. Those are the files we'd need. Do you have those somewhere?

@guillaumevernieres

> To utilize the cpld_gridgen utility, we need an ocean_hgrid.nc file, a topography file, and a mask file for MOM6. The grid_spec file you point to seems to already be at the target resolution (17x36x35).
>
> I see that MOM_input is asking for the supergrid file GRID_FILE = "ocean_hgrid_small.nc" and the topography file ocean_topog_small.nc. Those are the files we'd need. Do you have those somewhere?

Yes, sorry, I thought these were in one of the sub-directories, but apparently not. I'll get back to you when I find these files.

@guillaumevernieres

@DeniseWorthen , the files are on MSU:

/work/noaa/da/gvernier/mom6-10geg/36x17x25

do you need the MOM.res.nc restart as well?

@junwang-noaa
Collaborator

@yangfanglin do you have a physics package for the ultra-low resolutions (~800km/400km) that we can use to set up the C12/C24 tests with 32 vertical levels? Thanks

@DeniseWorthen
Collaborator

> @DeniseWorthen , the files are on MSU:
>
> /work/noaa/da/gvernier/mom6-10geg/36x17x25
>
> do you need the MOM.res.nc restart as well?

I don't need restarts to generate the fix files. Are you setting the land mask = 0 where depth = 0 (unless you have a mask file somewhere)?

@GeorgeGayno-NOAA
Contributor

Note: UFS_UTILS requires some slight updates to create low-resolution grids. This is being worked on here:
ufs-community/UFS_UTILS#1000

@DeniseWorthen
Collaborator

@guillaumevernieres I can't make sense of the ocean_hgrid_small file. It should contain 2x the desired resolution, so for a final resolution of 36x17 it should be 72x34. I see 72x70 in the file.

@DeniseWorthen
Collaborator

Establishing a 10-deg ocean resolution is going to play havoc w/ the naming convention and have knock-on impacts from fix file generation down through the regression test scripts and inputs, etc. This is because we currently use a 3-character string to denote the ocean/ice resolution. For example, mx050 is 1/2 deg, mx100 is one deg, etc. Creating a 10-deg resolution will require a 4-character string, so the 1/2 deg would need to be mx0050, the 5-deg mx0500, and the 10-deg mx1000.

I wonder if a 9-deg ocean/ice resolution would be a better idea. That would be a 40x20 grid (vs. 36x17 for 10 deg) but would avoid the issues with file naming, file regeneration, RT modifications, etc.

@danholdaway
Author

Thanks for reporting this @DeniseWorthen. It's probably not worth changing the entire naming convention for this, and I think 9 degrees would suffice.

@DeniseWorthen
Collaborator

DeniseWorthen commented Dec 4, 2024

I've been able to create a 9-deg configuration for the ocean and produce the associated CICE grid files using the ufs-utils/cpld_gridgen utility. I'll also document this in the ufs-utils PR I'll open. I have not tried to run this yet, but I'll try w/ a DATM-OCN-ICE configuration next.

The ufs-utils repo has some of the required fre-nctools available to build, but not the make_topog tool, which is also required. I found the tools installed system-wide on Gaea-C5, so I was able to use those to do the following:

  1. Make the supergrid for 9 deg:
/ncrc/home2/fms/local/opt/fre-nctools/2024.04/ncrc5/bin/make_hgrid --grid_type tripolar_grid --nxbnd 2 --nybnd 2 --xbnd -280,80 --ybnd -85,90 --nlon 80 --nlat 40 --grid_name ocean_hgrid --center c_cell
  2. Make the ocean mosaic:
/ncrc/home2/fms/local/opt/fre-nctools/2024.04/ncrc5/bin/make_solo_mosaic --num_tiles 1 --dir ./ --mosaic_name ocean_mosaic --tile_file ocean_hgrid.nc --periodx 360
  3. Make the ocean topog:
/ncrc/home2/fms/local/opt/fre-nctools/2024.04/ncrc5/bin/make_topog --mosaic ocean_mosaic.nc --topog_type realistic --topog_file /ncrc/home2/fms/archive/mom4/input_data/OCCAM_p5degree.nc --topog_field TOPO --scale_factor=-1
  4. Make the ocean mask:
ncap2 -s 'maskdpt=0.0*depth;where(depth>=10.)maskdpt=1.0;tmask=float(maskdpt);tmask(0,:)=0.0' topog.nc ocean_mask.nc
ncks -O -x -v depth,maskdpt ocean_mask.nc ocean_mask.nc
ncrename -v tmask,mask ocean_mask.nc
ncatted -a standard_name,,d,, -a units,,d,, ocean_mask.nc

@guillaumevernieres

@danholdaway
Author

Great progress, thank you @DeniseWorthen

@DeniseWorthen
Collaborator

DeniseWorthen commented Dec 4, 2024

Well, by golly, it ran! I completed 24 hours using the DATM coupled to the 9deg MOM6/CICE6.

/work2/noaa/stmp/dworthen/CPLD_GRIDGEN/rt_349784/datm.mx900

@guillaumevernieres

> Well, by golly, it ran! I completed 24 hours using the DATM coupled to the 9deg MOM6/CICE6.

You rock @DeniseWorthen 🎉 🎉 Thanks for doing this!
