From 0b0fbffd4a0023146ec0e7b1b6037d8b4ab5fb45 Mon Sep 17 00:00:00 2001 From: jwolff-ncar Date: Tue, 2 Mar 2021 10:03:18 -0700 Subject: [PATCH] Feature/doc updates into release/public-v1 (#127) * Update path to fix files and sample model data on AWS * Updated links to several model archives * Update links for final versions and make a few small edits * Update link to Chpt 10 * Final tweaks * Updated link for UFS_UTILS doc * Update DOI publish date in README * Update wording of citation in Introduction * First pass at addressing Dom's comments. More to come, especially related to platform environments. * More small edits. More to come. * A few more updates. Still more to come. * Getting closer to addressing all of Dom's concerns. * Another small edit * A few more edits to the graphics chapter to add an example command line for the diff plot * Updated some remaining concens raised by Dom. This will now be tested and the final issues will be added prior to a PR. * More updates to the graphics chapter * Added subsections for easier reading --- docs/UsersGuide/source/CodeReposAndDirs.rst | 2 +- docs/UsersGuide/source/ConfigNewPlatform.rst | 32 +++--- docs/UsersGuide/source/ConfigParameters.inc | 18 ++-- docs/UsersGuide/source/ConfigWorkflow.rst | 12 +-- docs/UsersGuide/source/Graphics.rst | 61 +++++++---- docs/UsersGuide/source/InputOutputFiles.rst | 10 +- docs/UsersGuide/source/Introduction.rst | 2 +- docs/UsersGuide/source/LAMGrids.rst | 14 +-- docs/UsersGuide/source/Quickstart.rst | 100 ++++++++++++++++--- docs/UsersGuide/source/SRWAppOverview.rst | 93 ++++++++--------- 10 files changed, 220 insertions(+), 124 deletions(-) diff --git a/docs/UsersGuide/source/CodeReposAndDirs.rst b/docs/UsersGuide/source/CodeReposAndDirs.rst index edfdab16e7..c5f429b0c5 100644 --- a/docs/UsersGuide/source/CodeReposAndDirs.rst +++ b/docs/UsersGuide/source/CodeReposAndDirs.rst @@ -157,7 +157,7 @@ workflow is run, is shown in :numref:`Table %s `. 
+---------------------------+-------------------------------------------------------------------------------------------------------+ | data_table | Cycle-independent input file (empty) | +---------------------------+-------------------------------------------------------------------------------------------------------+ - | field_table | Scalar fields in the `forecast model | + | field_table | Tracers in the `forecast model | | | `_ | +---------------------------+-------------------------------------------------------------------------------------------------------+ | FV3LAM_wflow.xml | Rocoto XML file to run the workflow | diff --git a/docs/UsersGuide/source/ConfigNewPlatform.rst b/docs/UsersGuide/source/ConfigNewPlatform.rst index 5c39d86d1f..a7722c6d94 100644 --- a/docs/UsersGuide/source/ConfigNewPlatform.rst +++ b/docs/UsersGuide/source/ConfigNewPlatform.rst @@ -14,11 +14,11 @@ The first step to installing on a new machine is to install :term:`NCEPLIBS` (ht * C and C++ compilers compatible with the Fortran compiler - * gcc v9+, ifort v18+, and clang (MacOS) have been tested + * gcc v9+, ifort v18+, and clang v9+ (macOS, native Apple clang or LLVM clang) have been tested * Python v3.6+ - * Prerequisite packages must be downloaded: jinja2, yaml and f90nml, as well as pygrib if the user would like to use the provided graphics scripts + * Prerequisite packages must be downloaded: jinja2, yaml and f90nml, as well as a number of additional Python modules (see :numref:`Section %s `) if the user would like to use the provided graphics scripts * Perl 5 @@ -28,16 +28,20 @@ The first step to installing on a new machine is to install :term:`NCEPLIBS` (ht * CMake v3.15+ is needed for building NCEPLIBS, but versions as old as 3.12 can be used to build NCEPLIBS-external, which contains a newer CMake that can be used for the rest of the build. -For Linux systems, as long as the above software is available, you can move on to the next step: installing the :term:`NCEPLIBS-external` package. - -For MacOS systems, you will also need to set the stack size to “unlimited”. +For both Linux and macOS, you will need to set the stack size to "unlimited" (if allowed) or the largest possible value. .. code-block:: console + # Linux, if allowed + ulimit -s unlimited + + # macOS, this corresponds to 65MB ulimit -S -s unlimited -Additionally, some extra software is needed: ``wget``, ``coreutils``, ``pkg-config``, and ``gnu-sed``. -It is recommended that you install this software using the Homebrew package manager for MacOS (https://brew.sh/): +For Linux systems, as long as the above software is available, you can move on to the next step: installing the :term:`NCEPLIBS-external` package. + +For macOS systems, some extra software is needed: ``wget``, ``coreutils``, ``pkg-config``, and ``gnu-sed``. +It is recommended that you install this software using the Homebrew package manager for macOS (https://brew.sh/): * brew install wget @@ -160,7 +164,7 @@ At this point there are just a few more variables that need to be set prior to b export CMAKE_CXX_COMPILER=mpicxx export CMAKE_Fortran_COMPILER=mpifort -If you are using your machine’s built-in MPI compilers, it is recommended you set the ``CMAKE_*_COMPILER`` flags to full paths to ensure that the correct MPI aliases are used. Finally, one last environment variable, ``CMAKE_Platform``, must be set. 
This will depend on your machine; for example, on a MacOS operating system with GNU compilers: +If you are using your machine’s built-in MPI compilers, it is recommended you set the ``CMAKE_*_COMPILER`` flags to full paths to ensure that the correct MPI aliases are used. Finally, one last environment variable, ``CMAKE_Platform``, must be set. This will depend on your machine; for example, on a macOS operating system with GNU compilers: .. code-block:: console @@ -198,7 +202,7 @@ Running the graphics scripts in ``${WORKDIR}/ufs-srweather-app/regional_workflow For the final step of creating and running an experiment, the exact methods will depend on if you are running with or without a workflow manager (Rocoto). -Running Without a Workflow Manager: Generic Linux and MacOS Platforms +Running Without a Workflow Manager: Generic Linux and macOS Platforms ===================================================================== Now that the code has been built, you can stage your data as described in :numref:`Section %s `. @@ -244,7 +248,7 @@ From here, you can run each individual task of the UFS SRW App using the provide cp ${WORKDIR}/ufs-srweather-app/regional_workflow/ush/wrappers/*sh . cp ${WORKDIR}/ufs-srweather-app/regional_workflow/ush/wrappers/README.md . -The ``README.md`` file will contain instructions on the order that each script should be run in. An example of wallclock times for each task for an example run (2017 Macbook Pro, MacOS Catalina, 25km CONUS domain, 48hr forecast) is listed in :numref:`Table %s `. +The ``README.md`` file will contain instructions on the order that each script should be run in. An example of wallclock times for each task for an example run (2017 Macbook Pro, macOS Catalina, 25km CONUS domain, 48hr forecast) is listed in :numref:`Table %s `. .. _WallClockTimes: @@ -341,7 +345,7 @@ Those requirements highlighted in **bold** are included in the NCEPLIBS-external * 4GB memory (CONUS 25km domain) -* Fortran compiler with full Fortran 2003 standard support +* Fortran compiler with full Fortran 2008 standard support * C and C++ compiler @@ -361,13 +365,13 @@ Those requirements highlighted in **bold** are included in the NCEPLIBS-external * **netCDF (C and Fortran libraries)** * **HDF5** - * **ESMF** + * **ESMF** 8.0.0 * **Jasper** * **libJPG** * **libPNG** * **zlib** -MacOS-specific prerequisites: +macOS-specific prerequisites: * brew install wget * brew install cmake @@ -381,4 +385,4 @@ Optional but recommended prerequisites: * Bash v4+ * Rocoto Workflow Management System (1.3.1) * **CMake v3.15+** -* Python package pygrib for graphics +* Python packages scipy, matplotlib, pygrib, cartopy, and pillow for graphics diff --git a/docs/UsersGuide/source/ConfigParameters.inc b/docs/UsersGuide/source/ConfigParameters.inc index c10e916cf5..b67ed0c0cc 100644 --- a/docs/UsersGuide/source/ConfigParameters.inc +++ b/docs/UsersGuide/source/ConfigParameters.inc @@ -5,7 +5,9 @@ Grid Generation Parameters ========================== ``GRID_GEN_METHOD``: (Default: “”) - This variable specifies the method to use to generate a regional grid in the horizontal. The only supported value of this parameter is “ESGgrid”, in which case the Extended Schmidt Gnomonic grid generation method developed by Jim Purser of EMC will be used. + This variable specifies the method to use to generate a regional grid in the horizontal. The only supported value of this parameter is “ESGgrid”, in which case the Extended Schmidt Gnomonic grid generation method developed by Jim Purser(1) of EMC will be used. 
+ +(1)Purser, R. J., D. Jovic, G. Ketefian, T. Black, J. Beck, J. Dong, and J. Carley, 2020: The Extended Schmidt Gnomonic Grid for Regional Applications. Unified Forecast System (UFS) Users’ Workshop. July 27-29, 2020. .. note:: @@ -49,11 +51,11 @@ Computational Forecast Parameters ``BLOCKSIZE``: (Default: “”) The amount of data that is passed into the cache at a time. -Here, we set these parameters to null strings. This is so that, for any one of these parameters: +Here, we set these parameters to null strings. This is so that, for any one of these parameters: -#. If the experiment is using a predefined grid, then if the user sets the parameter in the user-specified experiment configuration file (``EXPT_CONFIG_FN``), that value will be used in the forecast(s). Otherwise, the default value of the parameter for that predefined grid will be used. +#. If the experiment is using a predefined grid and the user sets the parameter in the user-specified experiment configuration file (``EXPT_CONFIG_FN``), that value will be used in the forecast(s). Otherwise, the default value of the parameter for that predefined grid will be used. -#. If the experiment is not using a predefined grid (i.e. it is using a custom grid whose parameters are specified in the experiment configuration file), then the user must specify a value for the parameter in that configuration file. Otherwise, the parameter will remain set to a null string, and the experiment generation will fail because the generation scripts check to ensure that all the parameters defined in this section are set to non-empty strings before creating the experiment directory. +#. If the experiment is not using a predefined grid (i.e. it is using a custom grid whose parameters are specified in the experiment configuration file), then the user must specify a value for the parameter in that configuration file. Otherwise, the parameter will remain set to a null string, and the experiment generation will fail, because the generation scripts check to ensure that all the parameters defined in this section are set to non-empty strings before creating the experiment directory. Write-Component (Quilting) Parameters ===================================== @@ -91,7 +93,7 @@ Currently supported ``PREDEF_GRID_NAME`` options are "RRFS_CONUS_25km," "RRFS_CO Pre-existing Directory Parameter ================================ ``PREEXISTING_DIR_METHOD``: (Default: “delete”) - This variable determines the method to use to deal with pre-existing directories [e.g ones generated by previous calls to the experiment generation script using the same experiment name (``EXPT_SUBDIR``) as the current experiment]. This variable must be set to one of "delete", "rename", and "quit". The resulting behavior for each of these values is as follows: + This variable determines the method to deal with pre-existing directories [e.g ones generated by previous calls to the experiment generation script using the same experiment name (``EXPT_SUBDIR``) as the current experiment]. This variable must be set to one of "delete", "rename", and "quit". The resulting behavior for each of these values is as follows: * "delete": The preexisting directory is deleted and a new directory (having the same name as the original preexisting directory) is created. 
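To make the behavior described above concrete, the following is a minimal, hypothetical ``config.sh`` excerpt for an experiment on a predefined grid. The variable names are taken from this section; the values are illustrative only and are not recommendations.

.. code-block:: console

   # Hypothetical excerpt of config.sh (illustrative values only)
   PREDEF_GRID_NAME="RRFS_CONUS_25km"
   # LAYOUT_X, LAYOUT_Y, and BLOCKSIZE are deliberately left unset so that the
   # default values associated with the predefined grid are used.
   PREEXISTING_DIR_METHOD="rename"   # keep a pre-existing experiment directory by renaming it

With settings like these, no custom grid is being defined, so the grid generation and computational parameters described above may remain null strings in ``config.sh``.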
@@ -128,12 +130,12 @@ These parameters set flags (and related directories) that determine whether the Surface Climatology Parameter ============================= -``SFC_CLIMO_FIELDS``: (Default: “("facsf" "maximum_snow_albedo" "slope_type" "snowfree_albedo" "soil_type" "substrate_temperature" "vegetation_greenness" "vegetation_type")” +``SFC_CLIMO_FIELDS``: (Default: “("facsf" "maximum_snow_albedo" "slope_type" "snowfree_albedo" "soil_type" "substrate_temperature" "vegetation_greenness" "vegetation_type")”) Array containing the names of all the fields for which the ``MAKE_SFC_CLIMO_TN`` task generates files on the native FV3-LAM grid. Fixed File Parameters ===================== -Set parameters associated with the fixed (i.e. static) files. For the main NOAA HPC platforms, as well as Cheyenne, Odin, and Stampede, fixed files are prestaged in locations on each machine with paths defined in the ``setup.sh`` script. +Set parameters associated with the fixed (i.e. static) files. For the main NOAA HPC platforms, as well as Cheyenne, Odin, and Stampede, fixed files are prestaged with paths defined in the ``setup.sh`` script. ``FIXgsm``: (Default: “”) System directory in which the majority of fixed (i.e. time-independent) files that are needed to run the FV3-LAM model are located. @@ -221,7 +223,7 @@ Set parameters associated with the fixed (i.e. static) files. For the main NOAA ``CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING``: (Default: see below) .. code-block:: console - ("aerosol.dat | global_climaeropac_global.txt" \ + ("aerosol.dat | global_climaeropac_global.txt" \ "co2historicaldata_2010.txt | fix_co2_proj/global_co2historicaldata_2010.txt" \ "co2historicaldata_2011.txt | fix_co2_proj/global_co2historicaldata_2011.txt" \ "co2historicaldata_2012.txt | fix_co2_proj/global_co2historicaldata_2012.txt" \ diff --git a/docs/UsersGuide/source/ConfigWorkflow.rst b/docs/UsersGuide/source/ConfigWorkflow.rst index a8bfe8e0a7..9c85d7be1d 100644 --- a/docs/UsersGuide/source/ConfigWorkflow.rst +++ b/docs/UsersGuide/source/ConfigWorkflow.rst @@ -5,7 +5,7 @@ Configuring the Workflow: ``config.sh`` and ``config_defaults.sh`` ================================================================== To create the experiment directory and workflow when running the SRW App, the user must create an experiment configuration file named ``config.sh``. This file contains experiment-specific information, such as dates, external model data, directories, and other relevant settings. To help the user, two sample configuration files have been included in the ``regional_workflow`` repository’s ``ush`` directory: ``config.community.sh`` and ``config.nco.sh``. The first is for running experiments in community mode (``RUN_ENVIR`` set to “community”; see below), and the second is for running experiments in “nco” mode (``RUN_ENVIR`` set to “nco”). Note that for this release, only “community” mode is supported. These files can be used as the starting point from which to generate a variety of experiment configurations in which to run the SRW App. -There is an extensive list of experiment parameters that a user can set when configuring the experiment. All of these do not need to be explicitly set by the user in ``config.sh``. In the case that a user does not define an entry in the ```config.sh`` script, either its value in ``config_defaults.sh`` will be used, or it will be reset depending on other parameters, e.g. 
the platform on which the experiment will be run (specified by ``MACHINE``) Note that ``config_defaults.sh`` contains the full list of experiment parameters that a user may set in ``config.sh`` (i.e. the user cannot set parameters in config.sh that are not initialized in ``config_defaults.sh``). +There is an extensive list of experiment parameters that a user can set when configuring the experiment. Not all of these need to be explicitly set by the user in ``config.sh``. In the case that a user does not define an entry in the ``config.sh`` script, either its value in ``config_defaults.sh`` will be used, or it will be reset depending on other parameters, e.g. the platform on which the experiment will be run (specified by ``MACHINE``). Note that ``config_defaults.sh`` contains the full list of experiment parameters that a user may set in ``config.sh`` (i.e. the user cannot set parameters in config.sh that are not initialized in ``config_defaults.sh``). The following is a list of the parameters in the ``config_defaults.sh`` file. For each parameter, the default value and a brief description is given. In addition, any relevant information on features and settings supported or unsupported in this release is specified. @@ -68,7 +68,7 @@ These settings control run commands for platforms without a workflow manager. V The run command for pre-processing utilities (shave, orog, sfc_climo_gen, etc.). This can be left blank for smaller domains, in which case the executables will run without MPI. ``RUN_CMD_FCST``: (Default: "mpirun -np \${PE_MEMBER01}") - The run command for the model forecast step. This will be appended to the end of the variable definitions file (var_defns.sh).. + The run command for the model forecast step. This will be appended to the end of the variable definitions file ("var_defns.sh"). ``RUN_CMD_POST``: (Default: "mpirun -np 1") The run command for post-processing (UPP). Can be left blank for smaller domains, in which case UPP will run without MPI. @@ -84,7 +84,7 @@ Cron-Associated Parameters Directory Parameters ==================== ``EXPT_BASEDIR``: (Default: “”) - The base directory in which the experiment directory will be created. If this is not specified or if it is set to an empty string, it will default to ``${HOMErrfs}/../expt_dirs``, where ``${HOMErrfs}`` contains the full path to the ``regional_workflow`` directory. + The base directory in which the experiment directory will be created. If this is not specified or if it is set to an empty string, it will default to ``${HOMErrfs}/../../expt_dirs``, where ``${HOMErrfs}`` contains the full path to the ``regional_workflow`` directory. ``EXPT_SUBDIR``: (Default: “”) The name that the experiment directory (without the full path) will have. The full path to the experiment directory, which will be contained in the variable ``EXPTDIR``, will be: @@ -211,7 +211,7 @@ Initial and Lateral Boundary Condition Generation Parameters The name of the external model that will provide fields from which lateral boundary condition (LBC) files (except for the 0-th hour LBC file) will be generated for input into the forecast model. ``LBC_SPEC_INTVL_HRS``: (Default: “6”) - The interval (in integer hours) with which LBC files will be generated, referred to as the boundary specification interval. Note that the model specified in ``EXTRN_MDL_NAME_LBCS`` must have data available at a frequency greater than or equal to that implied by ``LBC_SPEC_INTVL_HRS``. 
For example, if ``LBC_SPEC_INTVL_HRS`` is set to 6, then the model must have data available at least every 6 hours. It is up to the user to ensure that this is the case. + The interval (in integer hours) at which LBC files will be generated, referred to as the boundary specification interval. Note that the model specified in ``EXTRN_MDL_NAME_LBCS`` must have data available at a frequency greater than or equal to that implied by ``LBC_SPEC_INTVL_HRS``. For example, if ``LBC_SPEC_INTVL_HRS`` is set to 6, then the model must have data available at least every 6 hours. It is up to the user to ensure that this is the case. ``FV3GFS_FILE_FMT_ICS``: (Default: “nemsio”) If using the FV3GFS model as the source of the ICs (i.e. if ``EXTRN_MDL_NAME_ICS`` is set to "FV3GFS"), this variable specifies the format of the model files to use when generating the ICs. @@ -225,7 +225,7 @@ User-Staged External Model Directory and File Parameters Flag that determines whether or not the workflow will look for the external model files needed for generating ICs and LBCs in user-specified directories (as opposed to fetching them from mass storage like NOAA HPSS). ``EXTRN_MDL_SOURCE_BASEDIR_ICS``: (Default: “/base/dir/containing/user/staged/extrn/mdl/files/for/ICs") - Directory in which to look for external model files for generating ICs. If ``USE_USER_STAGED_EXTRN_FILES`` is set to "TRUE", the workflow looks in this directory (specifically, in a subdirectory under this directory named "YYYYMMDDHH" consisting of the starting date and cycle hour of the forecast, where YYYY is the 4-digit year, MM the 2-digit month, DD the 2-digit day of the month, and HH the 2-digit hour of the day) for the external model files specified by the array ``EXTRN_MDL_FILES_ICS``` (these files will be used to generate the ICs on the native FV3-LAM grid). This variable is not used if ``USE_USER_STAGED_EXTRN_FILES`` is set to "FALSE". + Directory in which to look for external model files for generating ICs. If ``USE_USER_STAGED_EXTRN_FILES`` is set to "TRUE", the workflow looks in this directory (specifically, in a subdirectory under this directory named "YYYYMMDDHH" consisting of the starting date and cycle hour of the forecast, where YYYY is the 4-digit year, MM the 2-digit month, DD the 2-digit day of the month, and HH the 2-digit hour of the day) for the external model files specified by the array ``EXTRN_MDL_FILES_ICS`` (these files will be used to generate the ICs on the native FV3-LAM grid). This variable is not used if ``USE_USER_STAGED_EXTRN_FILES`` is set to "FALSE". ``EXTRN_MDL_FILES_ICS``: (Default: "ICS_file1” “ICS_file2” “...”) Array containing the names of the files to search for in the directory specified by ``EXTRN_MDL_SOURCE_BASEDIR_ICS``. This variable is not used if ``USE_USER_STAGED_EXTRN_FILES`` is set to "FALSE". @@ -239,6 +239,6 @@ User-Staged External Model Directory and File Parameters CCPP Parameter ============== ``CCPP_PHYS_SUITE``: (Default: "FV3_GFS_v15p2") - The CCPP (Common Community Physics Package) physics suite to use for the forecast(s). The choice of physics suite determines the forecast model's namelist file, the diagnostics table file, the field table file, and the XML physics suite definition file that are staged in the experiment directory or the cycle directories under it. Current supported settings for this parameter are “FV3_GFS_v15p2” and “RRFS_v1alpha.” + The CCPP (Common Community Physics Package) physics suite to use for the forecast(s). 
The choice of physics suite determines the forecast model's namelist file, the diagnostics table file, the field table file, and the XML physics suite definition file that are staged in the experiment directory or the cycle directories under it. Current supported settings for this parameter are “FV3_GFS_v15p2” and “FV3_RRFS_v1alpha”. .. include:: ConfigParameters.inc diff --git a/docs/UsersGuide/source/Graphics.rst b/docs/UsersGuide/source/Graphics.rst index ac46a495cf..ccff303413 100644 --- a/docs/UsersGuide/source/Graphics.rst +++ b/docs/UsersGuide/source/Graphics.rst @@ -44,7 +44,7 @@ On Hera: .. code-block:: console - /scratch2/NCEPDEV/fv3-cam/Chan-hoo.Jeon/tools/NaturalEarth + /scratch2/BMC/det/UFS_SRW_app/v1p0/fix_files/NaturalEarth On Jet: @@ -56,7 +56,7 @@ On Orion: .. code-block:: console - /home/chjeon/tools/NaturalEarth + /work/noaa/gsd-fv3-dev/UFS_SRW_App/v1p0/fix_files/NaturalEarth On Gaea: @@ -78,6 +78,7 @@ On Cheyenne: .. code-block:: console + module load ncarenv ncar_pylib /glade/p/ral/jntp/UFS_SRW_app/ncar_pylib/python_graphics On Hera and Jet: @@ -100,15 +101,17 @@ On Gaea: .. code-block:: console - module use -a /apps/contrib/miniconda3-noaa-gsl/modulefiles - module load miniconda3 - conda activate pygraf + module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles + module load miniconda3/4.8.3-regional-workflow .. note:: If using one of the batch submission scripts described below, the user does not need to manually load an environment because the scripts perform this task. +Plotting output from one experiment +=================================== + Before generating plots, it is convenient to change location to the directory containing the plotting scripts: @@ -138,6 +141,9 @@ The output files (in .png format) will be located in the directory ``EXPTDIR/CDA where in this case ``EXPTDIR`` is ``/path-to/expt_dirs/test_CONUS_25km_GFSv15p2`` and ``CDATE`` is ``2019061500``. +Plotting differences from two experiments +========================================= + To generate difference plots, the ``plot_allvars_diff.py`` script must be called with the following seven command line arguments: @@ -149,27 +155,34 @@ seven command line arguments: #. The top level of the first experiment directory ``EXPTDIR2`` containing the second set of post-processed data. The script will look for the data files in the directory ``EXPTDIR2/CDATE/postprd``. #. The base directory ``CARTOPY_DIR`` of the cartopy shapefiles. The script will look for the shape files (``*.shp``) in the directory ``CARTOPY_DIR/shapefiles/natural_earth/cultural``. -In this case, the output png files will be located in the directory ``EXPTDIR1/CDATE/postprd``. +An example of plotting differences from two experiments for the same date and predefined domain where one uses +the "FV3_GFS_v15p2" suite definition file (SDF) and one using the "FV3_RRFS_v1alpha" SDF is as follows: +.. code-block:: console -If the Python scripts are being used to create plots of multiple forecast lead times and forecast -variables, then they should be submitted to the batch system using either the ``sq_job.sh`` -or ``sq_job_diff.sh`` script (for platforms such as Hera, Jet, Orion, and Gaea that use slurm as -the job scheduler) or the ``qsub_job.sh`` or ``qsub_job_diff.sh`` script (for platforms such as -Cheyenne that use PBS or PBS Pro as the job scheduler). These scripts are located under -``ufs-srweather-app/regional_workflow/ush/Python`` and must be submitted using the command appropriate -for the job scheduler used on the current platform. 
For example, on Hera, Jet, Orion, and Gaea, -``sq_job.sh`` can be submitted as follows: + python plot_allvars_diff.py 2019061518 6 18 3 /path-to/expt_dirs1/test_CONUS_3km_GFSv15p2 /path-to/expt_dirs2/test_CONUS_3km_RRFSv1alpha /path-to/NaturalEarth -.. code-block:: console +In this case, the output png files will be located in the directory ``EXPTDIR1/CDATE/postprd``. - sbatch sq_job.sh +Submitting plotting scripts through a batch system +================================================== -On Cheyenne, ``qsub_job.sh`` can be submitted as follows: +If the Python scripts are being used to create plots of multiple forecast lead times and forecast +variables, then you may need to submit them to the batch system. Example scripts are provided called +``sq_job.sh`` and ``sq_job_diff.sh`` for use on a platform such as Hera that uses the Slurm +job scheduler or ``qsub_job.sh`` and ``qsub_job_diff.sh`` for use on a platform such as +Cheyenne that uses PBS as the job scheduler. Examples of these scripts are located under +``ufs-srweather-app/regional_workflow/ush/Python`` and can be used as a starting point to create a batch script +for your platform/job scheduler of use. + +At a minimum, the account should be set appropriately prior to job submission: .. code-block:: console - qsub qsub_job.sh + #SBATCH --account=an_account + +Depending on the platform you are running on, you may also need to adjust the settings to use +the correct Python environment and path to the shape files. When using these batch scripts, several environment variables must be set prior to submission. If plotting output from a single cycle, the variables to set are ``HOMErrfs`` and ``EXPTDIR``. @@ -227,4 +240,16 @@ and ending with the last forecast hour, use export FCST_END=${FCST_LEN_HRS} export FCST_INC=6 +The scripts must be submitted using the command appropriate +for the job scheduler used on your platform. For example, on Hera, +``sq_job.sh`` can be submitted as follows: +.. code-block:: console + + sbatch sq_job.sh + +On Cheyenne, ``qsub_job.sh`` can be submitted as follows: + +.. code-block:: console + + qsub qsub_job.sh diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index 3fc839a0cd..3e1e14effc 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -75,12 +75,12 @@ and are shown in :numref:`Table %s `. | | the start of each forecast. It is an empty file. No need to | | | change. | +-----------------------------+-------------------------------------------------------------+ - | data_table_[CCPP] | File specifying the output fields of the forecast model. | + | diag_table_[CCPP] | File specifying the output fields of the forecast model. | | | A different diag_table may be configured for different | | | CCPP suites. | +-----------------------------+-------------------------------------------------------------+ | field_table_[CCPP] | Cycle-independent file that the forecast model reads in at | - | | the start of each forecast. It specifies the scalars that | + | | the start of each forecast. It specifies the tracers that | | | the forecast model will advect. A different field_table | | | may be needed for different CCPP suites. | +-----------------------------+-------------------------------------------------------------+ @@ -237,8 +237,8 @@ Static Files A set of fix files are necessary to run the SRW Application. 
Environment variables describe the location of the static files: ``FIXgsm``, ``TOPO_DIR``, and ``SFC_CLIMO_INPUT_DIR`` are the directories where the static files are located. If you are on a pre-configured or configurable platform, there is no -need to stage the fixed files manually because they are already available on those platforms and the paths -are set in ``regional_workflow/ush/setup.sh`` for the static files. If the users platform is not defined +need to stage the fixed files manually because they have been prestaged and the paths +are set in ``regional_workflow/ush/setup.sh``. If the user's platform is not defined in that file, the static files can be pulled individually or as a full tar file from the `FTP data repository `_ or from `Amazon Web Services (AWS) cloud storage `_ @@ -406,7 +406,7 @@ are initializing from NEMSIO format FV3GFS files. Best Practices for Conserving Disk Space and Keeping Files Safe --------------------------------------------------------------- Initial and lateral boundary condition files are large and can occupy a significant amount of -disk space. If various users will employ a common files system to conduct runs, it is recommended +disk space. If various users will employ a common file system to conduct runs, it is recommended that the users share the same ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` directories. That way, if raw model input files are already on disk for a given date they do not need to be replicated. diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index 4e81dd2629..7da355e7f6 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -145,7 +145,7 @@ have been defined for the SRW Application, including pre-configured (level 1), c (level 2), limited test platforms (level 3), and build only platforms (level 4). Each level is further described below. -For the select computational platforms that have been pre-configured (level 1), all the +For the selected computational platforms that have been pre-configured (level 1), all the required libraries for building the SRW Application are available in a central place. That means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both been built. The SRW Application is expected to build and run out of the box on these diff --git a/docs/UsersGuide/source/LAMGrids.rst b/docs/UsersGuide/source/LAMGrids.rst index 650ac91bc4..f76f3565d8 100644 --- a/docs/UsersGuide/source/LAMGrids.rst +++ b/docs/UsersGuide/source/LAMGrids.rst @@ -31,19 +31,19 @@ of the following three options: The predefined grids are named after the prototype 3-km continental United States (CONUS) grid being tested for the Rapid Refresh Forecast System (RRFS), which will be a convection-allowing, hourly-cycled, FV3-LAM-based ensemble planned for operational implementation in 2024. To allow -for use of HRRR data to initialize the SRW App, all three supported grids were created to fit completely within -the HRRR domain to allow external model data from the HRRR to be used as initial conditions for -the FV3-LAM. Three resolution options were provided for flexibility related to compute resources +for use of High Resolution Rapid Refresh (`HRRR `_) data to +initialize the SRW App, all three supported grids were created to fit completely within the HRRR domain. +Three resolution options were provided for flexibility related to compute resources and physics options. 
For example, a user may wish to use the 13-km or 25-km domain when running -with the ``FV3_GFS_v15p2`` suite definition file (SDF), since that SDF uses cumulus physics which is -not currently configured to run at 3-km. In addition, users will have much fewer computational +with the ``FV3_GFS_v15p2`` suite definition file (SDF), since that SDF uses cumulus physics that are +not configured to run at 3-km. In addition, users will have much fewer computational constraints when running with the 13-km and 25-km domains. The boundary of the ``RRFS_CONUS_3km`` domain is shown in :numref:`Figure %s ` (in red). Note that while it is possible to initialize the FV3-LAM with coarser external model data when using the ``RRFS_CONUS_3km`` domain, it is generally advised to use external model data that has a resolution similar to that of the native FV3-LAM (predefined) grid. In addition, this grid is ideal for running the -``FV3_RRFS_v1alpha`` SDF, since it was specifically created for convection-allowing scales, and is the +``FV3_RRFS_v1alpha`` suite definition file (SDF), since this SDF was specifically created for convection-allowing scales, and is the precursor to the operational physics suite that will be used in the RRFS. As can be seen in :numref:`Figure %s `, the boundary of the write-component grid (in blue) sits @@ -80,7 +80,7 @@ While the three predefined grids available in this release are ideal for users j out with the SRW App, more advanced users may wish to create their own grid for testing over a different region and/or with a different resolution. Creating a user-defined grid requires knowledge of how the SRW App workflow functions, in particular, understanding the set of -scripts that handle the workflow and experiment generation. It’s also important to note that +scripts that handle the workflow and experiment generation. It is also important to note that user-defined grids are not a supported feature of the current release, however information is being provided for the benefit of the FV3-LAM community. diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 46ae160871..6d87dcd15a 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -28,7 +28,7 @@ The necessary source code is publicly available on GitHub. To clone the release .. code-block:: console - git clone -b release/public-v1 https://github.com/ufs-community/ufs-srweather-app.git + git clone -b ufs-v1.0.0 https://github.com/ufs-community/ufs-srweather-app.git cd ufs-srweather-app Then, check out the submodules for the SRW application: @@ -76,7 +76,7 @@ Run ``cmake`` to set up the ``Makefile``, then run ``make``: Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, you should see the forecast model executable ``NEMS.exe`` and eleven pre- and post-processing executables in the ``ufs-srweather-app/bin`` directory which are -described in :numref:`Table %s `. +described in :numref:`Table %s `. Generate the Workflow Experiment ================================ @@ -102,11 +102,11 @@ fully supported for this release while the operational mode will be more exclusi Central Operations (NCO) and those in the NOAA/NCEP/Environmental Modeling Center (EMC) working with NCO on pre-implementation testing. Sample config.sh files are discussed in this section for Level 1 platforms. 
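The machine-specific examples below set only ``MACHINE``, ``ACCOUNT``, and ``EXPT_SUBDIR``. If the grid or physics suite is also changed in ``config.sh``, a hypothetical pairing that follows the guidance given for the limited area model grids (coarser grids with ``FV3_GFS_v15p2``, the 3-km grid with ``FV3_RRFS_v1alpha``) could look like the following; the values are illustrative, not requirements.

.. code-block:: console

   # Illustrative only: pair a predefined grid with a compatible CCPP physics suite
   PREDEF_GRID_NAME="RRFS_CONUS_25km"
   CCPP_PHYS_SUITE="FV3_GFS_v15p2"
   # or, for convection-allowing runs:
   # PREDEF_GRID_NAME="RRFS_CONUS_3km"
   # CCPP_PHYS_SUITE="FV3_RRFS_v1alpha"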
-Make a copy of ``config.community.sh`` to get started:
+Make a copy of ``config.community.sh`` to get started (under /path-to-ufs-srweather-app/regional_workflow/ush):
 
 .. code-block:: console
 
-   cd ufs-srweather-app/regional_workflow/ush
+   cd ../regional_workflow/ush
    cp config.community.sh config.sh
 
 Edit the ``config.sh`` file to set the machine you are running on to ``MACHINE``, use an account you can charge for
@@ -153,6 +153,14 @@ For Orion:
    ACCOUNT="my_account"
    EXPT_SUBDIR="my_expt_name"
 
+For Gaea:
+
+.. code-block:: console
+
+   MACHINE="gaea"
+   ACCOUNT="my_account"
+   EXPT_SUBDIR="my_expt_name"
+
 For WCOSS, edit ``config.sh`` with these WCOSS-specific parameters, and use a valid WCOSS
 project code for the account parameter:
 
@@ -169,11 +177,11 @@ Set up the Python and other Environment Parameters
 Next, it is necessary to load the appropriate Python environment for the workflow. The workflow
 requires Python 3, with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. This Python
 environment has already been set up on Level 1 platforms, and can be activated in
-the following way:
+the following way (when in /path-to-ufs-srweather-app/regional_workflow/ush):
 
 .. code-block:: console
 
-   source ufs-srweather-app/env/wflow_<platform>.env
+   source ../../env/wflow_<platform>.env
 
 Run the ``generate_FV3LAM_wflow.sh`` script
 -------------------------------------------
@@ -183,8 +191,10 @@ For all platforms, the workflow can then be generated with the command:
 
    ./generate_FV3LAM_wflow.sh
 
-The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. The
-settings for these paths can be found in the output from the ``./generate_FV3LAM_wflow.sh`` script.
+The generated workflow will be in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. A
+log file called ``log.generate_FV3LAM_wflow`` is generated by this step and can also be found in
+``$EXPTDIR``. The settings for these paths can be found in the output from the
+``./generate_FV3LAM_wflow.sh`` script.
 
 Run the Workflow Using Rocoto
 =============================
@@ -193,6 +203,19 @@ If Rocoto is not available, it is still possible to run the workflow using stand
 described in :numref:`Section %s `. There are two ways you can run the workflow with Rocoto
 using either the ``./launch_FV3LAM_wflow.sh`` or by hand.
 
+An environment variable may be set to navigate to the ``$EXPTDIR`` more easily. If the login
+shell is bash, it can be set as follows:
+
+.. code-block:: console
+
+   export EXPTDIR=/path-to-experiment/directory
+
+Or if the login shell is csh/tcsh, it can be set using:
+
+.. code-block:: console
+
+   setenv EXPTDIR /path-to-experiment/directory
+
 To run Rocoto using the script:
 
 .. code-block:: console
@@ -200,22 +223,73 @@ To run Rocoto using the script:
 
    cd $EXPTDIR
    ./launch_FV3LAM_wflow.sh
 
+Once the workflow is launched with the ``launch_FV3LAM_wflow.sh`` script, a log file named
+``log.launch_FV3LAM_wflow`` will be created (or appended to if it already exists) in ``EXPTDIR``.
+
 Or to manually call Rocoto:
 
+First load the Rocoto module, depending on the platform used.
+
+For Cheyenne:
+
+.. code-block:: console
+
+   module use -a /glade/p/ral/jntp/UFS_SRW_app/modules/
+   module load rocoto
+
+For Hera or Jet:
+
+.. code-block:: console
+
+   module purge
+   module load rocoto
+
+For Orion:
+
+.. code-block:: console
+
+   module purge
+   module load contrib rocoto
+
+For Gaea:
+
+.. code-block:: console
+
+   module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles
+   module load rocoto/1.3.3
+
+For WCOSS_DELL_P3:
+
+.. code-block:: console
+
+   module purge
+   module load lsf/10.1
+   module use /gpfs/dell3/usrx/local/dev/emc_rocoto/modulefiles/
+   module load ruby/2.5.1 rocoto/1.2.4
+
+For WCOSS_CRAY:
+
+.. code-block:: console
+
+   module purge
+   module load xt-lsfhpc/9.1.3
+   module use -a /usrx/local/emc_rocoto/modulefiles
+   module load rocoto/1.2.4
+
+Then manually call ``rocotorun`` to launch the tasks that have all dependencies satisfied
+and ``rocotostat`` to monitor the progress:
+
 .. code-block:: console
 
    cd $EXPTDIR
   rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
   rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
 
-For automatic resubmission of the workflow (every 3 minutes), one of the following lines can be added
-to the user's crontab (use ``crontab -e`` to edit the cron table) depending on your preference of how
-you call Rocoto:
+For automatic resubmission of the workflow (e.g., every 3 minutes), the following line can be added
+to the user's crontab (use ``crontab -e`` to edit the cron table).
 
 .. code-block:: console
 
-   */3 * * * * cd /glade/p/ral/jntp/$USER/expt_dirs/test_CONUS_25km_GFSv15p2 && /glade/p/ral/jntp/tools/rocoto/rocoto-1.3.1/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
-    -- OR --
    */3 * * * * cd /glade/p/ral/jntp/$USER/expt_dirs/test_CONUS_25km_GFSv15p2 && ./launch_FV3LAM_wflow.sh
 
 .. note::
diff --git a/docs/UsersGuide/source/SRWAppOverview.rst b/docs/UsersGuide/source/SRWAppOverview.rst
index 137d040c0c..ef71c80805 100644
--- a/docs/UsersGuide/source/SRWAppOverview.rst
+++ b/docs/UsersGuide/source/SRWAppOverview.rst
@@ -7,7 +7,7 @@ The UFS Short-Range Weather Application (SRW App) is an umbrella repository that
 ``manage_externals`` to check out all of the components required for the application. Once the build
 process is complete, all the files and executables necessary for a regional experiment are located in
 the ``regional_workflow`` and ``bin`` directories, respectively, under the ``ufs-srweather-app`` directory.
-Users can utilize the pre-defined domains or build their own domain (details provided in `Chapter %s `).
+Users can utilize the pre-defined domains or build their own domain (details provided in :numref:`Chapter %s `).
 In either case, users must create/modify the case-specific (``config.sh``) and/or grid-specific
 configuration files (``set_predef_grid_params.sh``). The overall procedure is shown in :numref:`Figure %s `,
 with the scripts to generate and run the workflow shown in red. The steps are as follows:
@@ -33,11 +33,11 @@ Each step will be described in detail in the following sections.
 
 Download from GitHub
 ====================
-Retrieve the UFS Short Range Weather Application (SRW App) repository from GitHub and checkout the ``release/public-v1`` branch:
+Retrieve the UFS Short Range Weather Application (SRW App) repository from GitHub and checkout the ``ufs-v1.0.0`` tag:
 
 .. 
code-block:: console - git clone -b release/public-v1 https://github.com/ufs-community/ufs-srweather-app.git + git clone -b ufs-v1.0.0 https://github.com/ufs-community/ufs-srweather-app.git cd ufs-srweather-app The cloned repository contains the configuration files and sub-directories shown in @@ -52,7 +52,7 @@ The cloned repository contains the configuration files and sub-directories shown +================================+========================================================+ | CMakeLists.txt | Main cmake file for SRW App | +--------------------------------+--------------------------------------------------------+ - | Externals.cfg | Hashes of the GitHub repositories/branches for the | + | Externals.cfg | Tags of the GitHub repositories/branches for the | | | external repositories | +--------------------------------+--------------------------------------------------------+ | LICENSE.md | CC0 license information | @@ -66,7 +66,7 @@ The cloned repository contains the configuration files and sub-directories shown +--------------------------------+--------------------------------------------------------+ | env | Contains build and workflow environment files | +--------------------------------+--------------------------------------------------------+ - | docs | Contains Release notes, documentation, and Users' Guide| + | docs | Contains release notes, documentation, and Users' Guide| +--------------------------------+--------------------------------------------------------+ | manage_externals | Utility for checking out external repositories | +--------------------------------+--------------------------------------------------------+ @@ -85,7 +85,7 @@ Check out the external repositories, including regional_workflow, ufs-weather-mo ./manage_externals/checkout_externals This step will use the configuration ``Externals.cfg`` file in the ``ufs-srweather-app`` directory to -clone the specific hashes (version of codes) of the external repositories as listed in +clone the specific tags (version of codes) of the external repositories as listed in :numref:`Section %s `. .. _BuildExecutables: @@ -114,16 +114,16 @@ The following steps will build the pre-processing utilities, forecast model, and make dir cd build cmake .. -DCMAKE_INSTALL_PREFIX=.. - make -j 8 >& build.out & + make -j 4 >& build.out & where ``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``bin``, ``include``, ``lib``, and ``share`` directories containing various components of the SRW App will be created, and its recommended value ``..`` denotes one directory up from the build directory. In the next line for -the ``make`` call, ``-j 8`` indicates the build will run in parallel with 8 threads. If this step is successful, the -executables listed in :numref:`Table %s ` will be located in the +the ``make`` call, ``-j 4`` indicates the build will run in parallel with 4 threads. If this step is successful, the +executables listed in :numref:`Table %s ` will be located in the ``ufs-srweather-app/bin`` directory. -.. _exec_description: +.. _ExecDescription: .. table:: Names and descriptions of the executables produced by the build step and used by the SRW App. 
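Because the ``make`` command above runs in the background with its output redirected to ``build.out``, it can be helpful to follow the log and then confirm the executables were produced. One way to do this (a suggestion, not part of the original build instructions) is, from the ``build`` directory:

.. code-block:: console

   tail -f build.out    # follow the build log; press Ctrl-C to stop watching
   ls ../bin            # after the build completes, NEMS.exe and the pre- and
                        # post-processing executables should be listed here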
@@ -146,7 +146,7 @@ executables listed in :numref:`Table %s ` will be located in t +------------------------+---------------------------------------------------------------------------------+ | orog | Generates orography, land mask, and gravity wave drag files from fixed files | +------------------------+---------------------------------------------------------------------------------+ - | regional_esg_grid | Generates an ESG regional grid based on a user-defined namelist | + | regional_esg_grid | Generates an ESG regional grid based on a user-defined namelist | +------------------------+---------------------------------------------------------------------------------+ | sfc_climo_gen | Creates surface climatology fields from fixed files for use in ``chgres_cube`` | +------------------------+---------------------------------------------------------------------------------+ @@ -168,7 +168,8 @@ grids as shown in :numref:`Table %s `. Their names can be found ``valid_vals_PREDEF_GRID_NAME`` in the ``valid_param_vals`` script, and their grid-specific configuration variables are specified in the ``set_predef_grid_params`` script. If users want to create a new domain, they should put its name in the ``valid_param_vals`` script and the corresponding grid-specific -parameters in the ``set_predef_grid_params`` script. +parameters in the ``set_predef_grid_params`` script. More information on the predefined and user-generated options +can be found in :numref:`Chapter %s `. .. _PredefinedGrids: @@ -193,12 +194,13 @@ Default configuration: ``config_defaults.sh`` -------------------------------------------- When generating a new experiment (described in detail in :numref:`Section %s `), the ``config_defaults.sh`` file is read first and assigns default values to the experiment -parameters. Important configuration variables in the ``config_defaults.sh`` file are shown in +parameters. Important configuration variables in the ``config_defaults.sh`` file are shown in :numref:`Table %s `, with more documentation found in the file itself, and -in `Chapter %s `. Some of these default values are intentionally invalid in order +in :numref:`Chapter %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified configuration ``config.sh`` file. Therefore, any settings provided in ``config.sh`` will override the default ``config_defaults.sh`` settings. Note that there is usually no need for a user to modify the default configuration file. + .. _ConfigVarsDefault: .. table:: Configuration variables specified in the config_defaults.sh script. @@ -389,41 +391,16 @@ values in ``config_default.sh`` and the values defined in ``config.community.sh` Python Environment for Workflow =============================== -It is necessary to load the appropriate Python environment for the workflow. The workflow -requires Python 3, with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. This Python -environment has already been set up on Level 1 platforms, and can be activated in the following way: - -On Cheyenne: - -.. code-block:: console - - module load ncarenv - ncar_pylib /glade/p/ral/jntp/UFS_SRW_app/ncar_pylib/regional_workflow - -Load the Rocoto module: +It is necessary to load the appropriate Python environment for the workflow. +The workflow requires Python 3, with the packages 'PyYAML', 'Jinja2', and 'f90nml' available. +This Python environment has already been set up on Level 1 platforms, and can be activated in +the following way: .. 
code-block:: console
 
-   module use -a /glade/p/ral/jntp/UFS_SRW_app/modules
-   module load rocoto
-
-On Hera and Jet:
-
-.. code-block:: console
-
-   module use -a /contrib/miniconda3/modulefiles
-   module load miniconda3
-   conda activate regional_workflow
-   module load rocoto
-
-On Orion:
-
-.. code-block:: console
-
-   module use -a /apps/contrib/miniconda3-noaa-gsl/modulefiles
-   module load miniconda3
-   conda activate regional_workflow
+   source ../../env/wflow_<platform>.env
 
+when in the ``ufs-srweather-app/regional_workflow/ush`` directory.
 
 .. _GeneratingWflowExpt:
 
@@ -492,10 +469,10 @@ delete these two *.db files and then call the launch script repeatedly for each
 
    +----------------------+------------------------------------------------------------+
    | **Workflow Task**    | **Task Description**                                       |
    +======================+============================================================+
-   | make_grid            | Pre-processing task to generate regional grid files. Can   |
+   | make_grid            | Pre-processing task to generate regional grid files. Can   |
    |                      | be run, at most, once per experiment.                      |
    +----------------------+------------------------------------------------------------+
-   | make_orog            | Pre-processing task to generate orography files. Can be    |
+   | make_orog            | Pre-processing task to generate orography files. Can be    |
    |                      | run, at most, once per experiment.                         |
    +----------------------+------------------------------------------------------------+
    | make_sfc_climo       | Pre-processing task to generate surface climatology files. |
@@ -522,6 +499,19 @@ There are two ways to launch the workflow using Rocoto: (1) with the ``launch_FV
 script, and (2) manually calling the ``rocotorun`` command. Moreover, you can run the workflow
 separately using stand-alone scripts.
 
+An environment variable may be set to navigate to the ``$EXPTDIR`` more easily. If the login
+shell is bash, it can be set as follows:
+
+.. code-block:: console
+
+   export EXPTDIR=/path-to-experiment/directory
+
+Or if the login shell is csh/tcsh, it can be set using:
+
+.. code-block:: console
+
+   setenv EXPTDIR /path-to-experiment/directory
+
 Launch with the ``launch_FV3LAM_wflow.sh`` script
 -------------------------------------------------
 To launch the ``launch_FV3LAM_wflow.sh`` script, simply call it without any arguments as follows:
@@ -638,9 +628,9 @@ Rocoto software is not available on a given platform. These scripts are located
 a wrapper script to set environment variables and run the job script.
 
 Example batch-submit scripts for Hera (Slurm) and Cheyenne (PBS) are included: ``sq_job.sh``
-and ``qsub_job.sh``. These examples set the build and run environment for Hera or Cheyenne
+and ``qsub_job.sh``, respectively. These examples set the build and run environment for Hera or Cheyenne
 so that run-time libraries match the compiled libraries (i.e. netcdf, mpi). Users may either
-modify the one submit batch script as each task is submitted, or duplicate this batch wrapper
+modify the submit batch script as each task is submitted, or duplicate this batch wrapper
 for their system settings for each task. Alternatively, some batch systems allow users to
 specify most of the settings on the command line (with the ``sbatch`` or ``qsub`` command, for
 example). This piece will be unique to your platform. The tasks run by the regional workflow
@@ -651,7 +641,8 @@ be run concurrently (no dependency).
 
 .. table:: List of tasks in the regional workflow in the order that they are executed. Scripts
            with the same stage number may be run simultaneously. 
The number of - processors is typical for Cheyenne or Hera. + processors and wall clock time is a good starting point for Cheyenne or Hera + when running a 48-h forecast on the 25-km CONUS domain. +------------+------------------------+----------------+----------------------------+ | **Stage/** | **Task Run Script** | **Number of** | **Wall clock time (H:MM)** | @@ -673,7 +664,7 @@ be run concurrently (no dependency). +------------+------------------------+----------------+----------------------------+ | 4 | run_make_lbcs.sh | 48 | 0:30 | +------------+------------------------+----------------+----------------------------+ - | 5 | run_fcst.sh | 48 | 2:30 | + | 5 | run_fcst.sh | 48 | 0:30 | +------------+------------------------+----------------+----------------------------+ | 6 | run_post.sh | 48 | 0:25 (2 min per output | | | | | forecast hour) |
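As noted above, some batch systems allow most settings to be supplied on the command line rather than by editing the wrapper script for every task. A hypothetical Slurm submission for the forecast stage, assuming ``sq_job.sh`` has already been edited to call ``run_fcst.sh`` and using the processor count and wall clock time listed for that task, might look like:

.. code-block:: console

   # Hypothetical example; the account and resource values must be adjusted for your system
   sbatch --account=an_account --ntasks=48 --time=00:30:00 sq_job.sh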