
3. Methods


In our improved process, any researcher with Python coding skills will be able to efficiently generate updated hydrometeorology forcings from local datasets and locally validated assumptions, anywhere in the world, for any time period of interest. We propose a 4-step research plan:

Observed and modeled data

The first step was to download, store, and format available observed and modeled hydrometeorology time series. Long-term climate stations and WRF gridded hydrometeorology (Salathe et al., 2014) are used to assess the lapse rates between the low-elevation observations and the long-term, physics-based monthly averages at high elevations.
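For illustration, a monthly lapse rate can be estimated as the least-squares slope of temperature against elevation across the paired station and WRF points. The sketch below uses invented values, not the Skagit data:

```python
import numpy as np

# Illustrative elevations (m) for low-elevation stations plus WRF cells,
# and their long-term January mean temperatures (deg C); values invented.
elev = np.array([120.0, 450.0, 980.0, 1600.0, 2100.0])
temp = np.array([11.2, 9.8, 7.1, 4.0, 1.5])

# Slope of the temperature-elevation regression = lapse rate (deg C/m)
lapse_rate, intercept = np.polyfit(elev, temp, 1)
print(f"January lapse rate: {lapse_rate:.4f} deg C/m")
```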

Explain the use of the Livneh code to handle spatial interpolation and wind.

Explain the use of the existing DHSVM model for the Skagit.

Build physics-based lapse rate component

The second step was to develop a Landlab component that can flexibly reproduce a grid of interpolated hydrometeorology variables. This includes using empirical and theoretical relationships to derive high-elevation correlations between precipitation, temperature, and radiation from observations at lower elevations.
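A minimal sketch of the idea on a Landlab raster grid, assuming a constant lapse rate and illustrative field names; this is not the component's final interface:

```python
import numpy as np
from landlab import RasterModelGrid

# Toy grid with random elevations standing in for watershed topography
grid = RasterModelGrid((50, 50), xy_spacing=100.0)
z = grid.add_field(
    "topographic__elevation",
    np.random.uniform(0.0, 2000.0, grid.number_of_nodes),
    at="node",
)

station_elev = 150.0   # m; the low-elevation observing station
station_temp = 12.0    # deg C; observed at the station
lapse_rate = -0.0048   # deg C per m (the Table 1 benchmark)

# Extrapolate the station observation to every grid node
temp = grid.add_zeros("air__temperature", at="node")
temp[:] = station_temp + lapse_rate * (z - station_elev)
```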

Optimize model parameters

We used DREAM multiobjective optimization in Dakota to calibrate the empirical parameters. This generates a gridded time series of hydrometeorology variables that matches the WRF long-term averages, is updated to near real time, and has correlations that can be used to generate long time series (1915-2099) based on corrections to existing gridded datasets (Livneh et al., 2013; MACA: Abatzoglou and Brown, 2012).
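The calibration itself runs through Dakota; as a Python-side illustration of the same DREAM loop, the sketch below uses spotpy (which the Hyak setup later on this page clones). The toy setup class, the quadratic stand-in model, and the parameter names are placeholders, not our DHSVM wrapper:

```python
import numpy as np
import spotpy

class ToySetup:
    """Placeholder spotpy setup; a real setup would launch a DHSVM run."""
    def __init__(self):
        # Two of the Table 1 priors, as an example
        self.params = [
            spotpy.parameter.Uniform("temp_lapse_rate", -0.008, -0.0025),
            spotpy.parameter.Uniform("precip_lapse_rate", 0.0001, 0.001),
        ]
        # Stand-in for an observed streamflow record
        self.obs = np.sin(np.linspace(0.0, 6.0, 50)) + 2.0

    def parameters(self):
        return spotpy.parameter.generate(self.params)

    def simulation(self, vector):
        # Toy model: bias grows as parameters move away from a "true" value
        bias = 1000.0 * (vector[0] + 0.005) ** 2 + (vector[1] - 0.0005) ** 2
        return list(self.obs * (1.0 + bias))

    def evaluation(self):
        return list(self.obs)

    def objectivefunction(self, simulation, evaluation):
        # DREAM expects a log-likelihood rather than a raw error metric
        return spotpy.likelihoods.gaussianLikelihoodMeasErrorOut(evaluation, simulation)

# Sample the posterior; results land in dream_demo.csv
sampler = spotpy.algorithms.dream(ToySetup(), dbname="dream_demo", dbformat="csv")
sampler.sample(repetitions=1000)
```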

Step 1. Run FAST with 10 sensitive parameters (based on Kelleher et al., 2017) to determine which parameters are sensitive in this watershed, for this storm, using a sample climate forcing (LivWRFbc_livlow); a sketch of such a run follows Table 1.

Table 1. FAST Test 1 parameter inputs with wide a priori uniform distributions

| ParamID | Parameter name | Benchmark | Min | Max |
|---|---|---|---|---|
| 1 | Temperature Lapse Rate | -0.0048 | -0.008 | -0.0025 |
| 2 | Precipitation Lapse Rate | 0.0006 | 0.0001 | 0.001 |
| 3 | Precipitation Multiplier | 0.00001 | 0.000005 | 0.00003 |
| 4 | Lateral Conductivity 61 | 0.0017 | 0.00001 | 0.01 |
| 5 | Exponential Decrease 61 | 0.5 | 0.25 | 2.5 |
| 6 | Lateral Conductivity 62 | 0.00017 | 0.00001 | 0.01 |
| 7 | Exponential Decrease 62 | 0.5 | 0.25 | 2.5 |
| 8 | Maximum Resistance 9 | 3000 | 500 | 3000 |
| 9 | Minimum Resistance 9 | 250 | 150 | 300 |
| 10 | Overstory Monthly LAI 9 | 0.5 | 0.5 | 6 |
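As referenced in Step 1, here is a hedged sketch of a FAST run over two of the Table 1 ranges, using the FAST sampler in spotpy (cloned in the Hyak setup below); the toy setup stands in for a wrapper that would actually launch DHSVM:

```python
import numpy as np
import spotpy

class FastSetup:
    """Placeholder setup over two Table 1 priors; a real one wraps DHSVM."""
    def __init__(self):
        self.params = [
            spotpy.parameter.Uniform("temp_lapse_rate", -0.008, -0.0025),
            spotpy.parameter.Uniform("precip_lapse_rate", 0.0001, 0.001),
        ]
        self.obs = np.sin(np.linspace(0.0, 6.0, 50)) + 2.0  # stand-in hydrograph

    def parameters(self):
        return spotpy.parameter.generate(self.params)

    def simulation(self, vector):
        # Toy response: only the first parameter really matters here
        return list(self.obs * (1.0 + 1000.0 * (vector[0] + 0.005) ** 2))

    def evaluation(self):
        return list(self.obs)

    def objectivefunction(self, simulation, evaluation):
        return -spotpy.objectivefunctions.rmse(evaluation, simulation)

sampler = spotpy.algorithms.fast(FastSetup(), dbname="fast_test1", dbformat="csv")
sampler.sample(1000)

# First-order and total sensitivity indices per parameter
results = spotpy.analyser.load_csv_results("fast_test1")
print(spotpy.analyser.get_sensitivity_of_fast(results))
```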

Step 2. Run FAST with the 10 sensitive parameters (based on Kelleher et al., 2017), updated with the results from Step 1, to determine which parameters are sensitive in this watershed, for this storm, using four experimental gridded climate inputs.

Table 2. FAST Test 2 parameter inputs with wide a priori uniform distributions

| ParamID | Parameter name | Benchmark | Min | Max |
|---|---|---|---|---|
| 1 | Temperature Lapse Rate | -0.0048 | -0.008 | -0.0025 |
| 2 | Precipitation Lapse Rate | 0.0006 | 0.0001 | 0.001 |
| 3 | Precipitation Multiplier | 0.00001 | 0.000005 | 0.00003 |
| 4 | Rain Threshold | -2.0 | -3.0 | -1.0 |
| 5 | Snow Threshold | 0.0 | -1.0 | 1.0 |
| 6 | Lateral Conductivity 62 | 0.00017 | 0.00001 | 0.01 |
| 7 | Exponential Decrease 62 | 0.5 | 0.25 | 2.5 |
| 8 | Maximum Resistance 9 | 3000 | 500 | 3000 |
| 9 | Minimum Resistance 9 | 250 | 150 | 300 |
| 10 | Overstory Monthly LAI 9 | 0.5 | 0.5 | 6 |

Step 3. Run a decision tree on the FAST results from Step 2 to narrow the a priori search space; a sketch of this narrowing follows Table 3.

Table 3. FAST Test 3 parameter inputs with a priori uniform distributions narrowed based on results from the four climate datasets.

| ParamID | Parameter name | Benchmark | Wide Min | Wide Max | Narrow Min | Narrow Max |
|---|---|---|---|---|---|---|
| 1 | Temperature Lapse Rate | -0.0048 | -0.008 | -0.0025 | -0.007 (raw_liv) | -0.005 (raw_wrf) |
| 2 | Precipitation Lapse Rate | 0.0006 | 0.0001 | 0.001 | | |
| 3 | Precipitation Multiplier | 0.00001 | 0.000005 | 0.00003 | | |
| 4 | Rain Threshold | -2.0 | -3.0 | -1.0 | | |
| 5 | Snow Threshold | 0.0 | -1.0 | 1.0 | | |
| 6 | Lateral Conductivity 62 | 0.00017 | 0.00001 | 0.01 | | 0.001 (raw_liv, raw_wrf) |
| 7 | Exponential Decrease 62 | 0.5 | 0.25 | 2.5 | | 1.25 (raw_liv, raw_wrf) |
| 8 | Maximum Resistance 9 | 3000 | 500 | 3000 | | |
| 9 | Minimum Resistance 9 | 250 | 150 | 300 | | |
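As referenced in Step 3, a sketch of the narrowing step, assuming scikit-learn and FAST samples arranged as a parameter matrix `X` with objective values `y`; the synthetic data and feature names are illustrative:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

# Synthetic stand-ins for FAST output: rows of X are sampled parameter
# sets (temperature lapse rate, lateral conductivity 62), y the
# objective value each sample produced.
rng = np.random.default_rng(0)
X = rng.uniform([-0.008, 0.00001], [-0.0025, 0.01], size=(500, 2))
y = (X[:, 0] + 0.006) ** 2 + (np.log10(X[:, 1]) + 3.0) ** 2

# A shallow tree's splits separate behavioural from poor runs; the
# thresholds bounding the low-error leaves suggest narrowed priors,
# e.g. the Table 3 temperature lapse rate range of -0.007 to -0.005.
tree = DecisionTreeRegressor(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=["temp_lapse_rate", "lat_conductivity_62"]))
```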

Cloud environment

Steps 1-3 were executed in a cloud environment. The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) provides a Docker image of the software environment needed to use the proposed tools. We propose to work with UW Cloud Computing to launch the process from HydroShare, using a Dockerized image on a commercial cloud platform, in order to compare the costs and benefits of CyberGIS (currently used), Azure, and Amazon Web Services.

Setting up for batch runs on Hyak:

```bash
# Clone the calibration repository and spotpy
git clone https://github.com/ChristinaB/Incubating-a-DREAM.git
git clone https://github.com/thouska/spotpy.git

# Make the cloned spotpy importable from Python
export PYTHONPATH="${PYTHONPATH}:/civil/shared/ecohydrology/SkagitSauk/DHSVM-Glacier/DHSVM/spotpy"

# Verify the install from inside an interactive Python session
python
import spotpy
```

Verification of consistency between cloud environments

We used the Sauk 2006 Flood test dataset to verify that model outputs are consistent across machines. The results from the UW Hyak cluster and the HydroShare server are available in the Incubating-a-DREAM repository (folder: Sauk_DHSVM_modeldata; files: Mass.Final.Balance.Hyak and Mass.Final.Balance.HydroShare).

The model results agree to two decimal places. For example, the Total Inflow difference is 0.002 mm: 713.325 mm on Hyak versus 713.323 mm on HydroShare. The mass error is 0.425 mm on both machines.
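A sketch of this cross-machine check, assuming the Mass.Final.Balance files hold `label: value` lines; the actual DHSVM output layout may differ, so the parsing is illustrative:

```python
# Compare two DHSVM mass balance summaries line by line (assumed
# "label: value" layout; adjust the parsing to the real file format).
def read_balance(path):
    values = {}
    with open(path) as f:
        for line in f:
            key, sep, rest = line.partition(":")
            if sep:
                try:
                    values[key.strip()] = float(rest.split()[0])
                except (ValueError, IndexError):
                    pass
    return values

hyak = read_balance("Sauk_DHSVM_modeldata/Mass.Final.Balance.Hyak")
hydroshare = read_balance("Sauk_DHSVM_modeldata/Mass.Final.Balance.HydroShare")
for key in sorted(hyak.keys() & hydroshare.keys()):
    diff = abs(hyak[key] - hydroshare[key])
    flag = "" if diff < 0.01 else "  <-- check"
    print(f"{key}: {hyak[key]:.3f} vs {hydroshare[key]:.3f} (diff {diff:.3f} mm){flag}")
```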

The HydroShare runtime was 0.13 hours for the 1443-hour (60.1-day) simulation period; the Hyak runtime was 0.07 hours for the same period.

The Sauk 2006 Flood dataset used for this server comparison is available as the Sauk River Basin DHSVM model instance 2017.

Experimental Design

- Reduce streamflow (Q) residuals using DREAM
- Test lapse rate (T, P) sensitivity and optimization
- Target audience: flood and sediment modelers working in mountain watersheds for disaster prediction
- Focused study: peak events, 3-hourly model timestep

Test 1: Use DREAM to optimize the model with constant lapse rates within the range [3-7 C] plus 7 other parameters.

Test 2: Use the 7 DREAM-optimized parameters from Test 1 with daily, spatially variable lapse rates.

Test 3: Use DREAM to optimize the model with the 7 parameters and daily, spatially variable lapse rates.