Merge pull request #43 from IMMM-SFA/antonia-had-patch-1-1
Update 1_introduction.rst
crvernon authored May 26, 2022
2 parents 2bc7ba2 + 1a03fa0 commit 5584140
19 changes: 9 additions & 10 deletions docs/source/1_introduction.rst
@@ -16,17 +16,16 @@ This guidance text has been developed in support of the Integrated Multisector M

Addressing the objectives above poses a strong transdisciplinary challenge that depends on a diversity of models and, more specifically, a consistent framing for making model-based science inferences. The term transdisciplinary science as used here formally implies a deep integration of disciplines to aid our hypothesis-driven understanding of coupled human-natural systems, bridging differences in theory, hypothesis generation, modeling, and modes of inference :cite:p:`national2014convergence`. The IM3 MSD research foci and questions require a deep integration across disciplines, where new modes of analysis can emerge that rapidly synthesize and exploit advances to generate decision-relevant insights that at minimum acknowledge uncertainty and, more ideally, rigorously and quantitatively map its effects on the generality of claimed scientific insights. More broadly, the diverse scientific disciplines engaged in the science of coupled human-natural systems, ranging from the natural sciences to engineering and economics, employ a wide variety of numerical computer models to study and understand their underlying systems of focus. The utility of these computer models hinges on their ability to represent the underlying real systems with sufficient fidelity and to enable the inference of novel insights. This is particularly challenging for coupled human-natural systems, where a multitude of interdependent human and natural processes take place that could potentially be represented. These processes usually translate into modeled representations that are highly complex, non-linear, and exhibit strong interactions and threshold behaviors :cite:p:`elsawah2020eight, haimes2018risk,helbing2013globally`. Model complexity and detail have also been increasing as a result of our improving understanding of these processes, the availability of data, and the rapid growth in computing power :cite:p:`saltelli2019so`. As model complexity grows, modelers need to specify substantially more information than before: additional model inputs and relationships as more processes are represented, higher resolution data as more observations are collected, and new coupling relationships and interactions as diverse models are used in combination to answer multisector questions (e.g., the land-water-energy nexus). Typically, not all of this information is well known, nor is the impact of these many uncertainties on model outputs well understood. It is especially difficult to distinguish the effects of individual and interacting sources of uncertainty when modeling coupled systems with multisector and multiscale dynamics :cite:p:`wirtz2017rocky`.

Given the challenge and opportunity posed by the disciplinary diversity of IM3, we utilized a team-wide survey to allow the project’s membership to provide their views on how their areas typically address uncertainty, emphasizing key literature examples and domain-specific reviews. Our synthesis of this survey information in :numref:`Figure_1_1` summarizes the team’s perspectives, highlighting the commonalities and differences in how different disciplinary areas typically address uncertainty. :numref:`Figure_1_1` highlights the non-trivial challenge posed by seeking to carefully consider uncertainty across an MSD-focused transdisciplinary team. There are significant differences across the team’s contributing disciplines in terms of the methodological approaches and tools used in the treatment of uncertainty. The horizontal axis of the figure represents a conceptual continuum of methodological approaches, ranging from deterministic (no uncertainty) modeling to the theoretical case of fully engaging in modeling all sources of uncertainty. The vertical axis of the plot maps the analysis tools used in the disciplines’ literature, spanning error-driven historical analyses to full uncertainty quantification. Given that :numref:`Figure_1_1` is a conceptual illustration, the mapping of each discipline’s boundaries is not meant to imply exactness; the boundaries encompass the scope of feedback attained in the team-wide survey responses. The colored circles designate specific sources of uncertainty that could be considered. Within the mapped disciplinary approaches, the colored circles distinguish those sources of uncertainty that are addressed in the bodies of literature reported by respondents. Note the absence of grey circles, indicating that, at present, few if any studies report results for understanding how model coupling relationships shape uncertainty.
Given the challenge and opportunity posed by the disciplinary diversity of IM3, we utilized an informal team-wide survey to understand how the various disciplines typically address uncertainty, emphasizing key literature examples and domain-specific reviews. The feedback received provided perspectives across diverse areas within the Earth sciences, different engineering fields, and economics. Although our synthesis of this survey information highlighted some commonality across areas (e.g., the frequent use of scenario-based modeling), we identified key differences in vocabulary, the frequency with which formal uncertainty analysis appears in the disciplinary literature, and technical approaches. The IM3 team’s responses captured a very broad conceptual continuum of methodological traditions, ranging from deterministic (no uncertainty) modeling to the theoretical case of fully modeling all sources of uncertainty. Overall, error-driven analyses that focus on replicating prior observed conditions were reported to be the most prevalent types of studies for all disciplines. It was generally less common for studies to strongly engage with analyzing uncertainty via more formal ensemble analyses and design of experiments, though some areas did show significantly higher levels of activity. Another notable finding from our survey was the apparent lack of focus on understanding how model coupling relationships shape uncertainty. Although these observations are limited to the scope of feedback attained in the team-wide IM3 survey responses and the bodies of literature reported by respondents, we believe they reflect challenges that are common across the MSD community.

We can briefly distinguish the key terms of uncertainty quantification (UQ) and uncertainty characterization (UC). UQ refers to the formal focus on the full specification of likelihoods as well as distributional forms necessary to infer the joint probabilistic response across all modeled factors of interest :cite:p:`cooke1991experts`. Alternatively, uncertainty characterization, as defined here, refers to exploratory modeling of alternative hypotheses for the co-evolutionary dynamics of influences, stressors, and path-dependent changes in the form and function of modeled systems :cite:p:`moallemi2020exploratory,walker2003defining`. An uncertain factor is any model component that is affected by uncertainty: inputs, resolution levels, coupling relationships, model relationships, and parameters. When a model has been established as a sufficiently accurate representation of the system, some of these factors may reflect elements of the real-world system that the model represents (for example, a population-level parameter would reflect a sufficiently accurate representation of the population level in the system under study). As discussed in later sections, the choice of UC or UQ depends on the specific goals of studies, the availability of data, the types of uncertainties (e.g., well-characterized or deep), the complexity of the underlying models, and computational limits. Deep uncertainty (as opposed to well-characterized uncertainty) refers to situations where the experts consulted on a decision do not know or cannot agree on system boundaries, the outcomes of interest and their relative importance, or the prior probability distributions for the various uncertain factors present :cite:p:`kwakkel2016coping,gass1997encyclopedia`.
In the IM3 uncertainty-related research that has occurred since this survey, we have observed that differences in terminology and interpretation of terminology across modeling teams can be confounding. One of the goals of this eBook is to provide a common language for uncertainty analysis within IM3 and, hopefully, for the broader MSD community. While individual scientific disciplines would be expected to retain their own terminology, by providing explicit definitions of terms we can facilitate the translation of concepts across transdisciplinary science teams. To begin, we use the term Uncertainty Analysis (UA) as an umbrella phrase covering all methods in this eBook. Next, we distinguish the key terms of uncertainty quantification (UQ) and uncertainty characterization (UC). UQ refers to the formal focus on the full specification of likelihoods as well as the distributional forms necessary to infer the joint probabilistic response across all modeled factors of interest :cite:p:`cooke1991experts`. UC refers to exploratory modeling of alternative hypotheses to understand the co-evolutionary dynamics of influences and stressors, as well as path-dependent changes in the form and function of modeled systems :cite:p:`moallemi2020exploratory,walker2003defining`. As discussed in later sections, the choice of UC or UQ depends on the specific goals of studies, the availability of data, the types of uncertainties (e.g., well-characterized or deep), and the complexity of the underlying models as well as computational limits. Definitions of key uncertainty analysis terms used in this eBook appear below, and our Glossary (:numref:`glossary`) contains a complete list of terms.
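
To make this distinction concrete, the short sketch below contrasts the two approaches for a simple hypothetical two-factor model. The model, factor names, distributions, ranges, and threshold are illustrative assumptions introduced only for this example; they are not drawn from any IM3 model or workflow.

.. code-block:: python

    # Minimal sketch contrasting UQ and UC for a hypothetical model y = f(a, b).
    # Everything here (model, distributions, ranges, threshold) is illustrative.
    import numpy as np

    rng = np.random.default_rng(seed=42)

    def toy_model(a, b):
        """Hypothetical response of a system to two uncertain factors."""
        return a ** 2 + np.sin(b)

    n_samples = 10_000

    # Uncertainty quantification: uncertain factors carry specified probability
    # distributions, so the propagated output is itself a probability distribution.
    a_uq = rng.normal(loc=1.0, scale=0.1, size=n_samples)
    b_uq = rng.lognormal(mean=0.0, sigma=0.25, size=n_samples)
    y_uq = toy_model(a_uq, b_uq)
    print("UQ mean:", y_uq.mean())
    print("UQ 95% interval:", np.percentile(y_uq, [2.5, 97.5]))

    # Uncertainty characterization: no likelihoods are asserted; alternative
    # hypotheses are explored by sweeping plausible ranges and asking which
    # combinations of factor values lead to consequential outcomes.
    a_uc = rng.uniform(0.5, 1.5, size=n_samples)
    b_uc = rng.uniform(0.0, 3.0, size=n_samples)
    y_uc = toy_model(a_uc, b_uc)
    threshold = 2.0  # illustrative decision-relevant threshold
    print("UC fraction of explored cases exceeding threshold:",
          np.mean(y_uc > threshold))

In the UQ half of the sketch the percentile interval is meaningful because input distributions were asserted up front; in the UC half the result is read as "which portions of the plausible factor space produce consequential outcomes," not as a probability statement.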

.. _Figure_1_1:
.. figure:: _static/figure1_1_state_of_the_science.png
   :alt: Figure 1.1
   :width: 700px
   :figclass: margin-caption
   :align: center

   State-of-the-art in different modeling communities, as reported in the survey distributed to IM3 teams. *Deterministic Historical Evaluation*: model evaluation under fully determined conditions defined using historical observations; *Local Sensitivity Analysis*: model evaluation performed by varying uncertain factors around specific reference values; *Global Sensitivity Analysis*: model evaluation performed by varying uncertain factors throughout their entire feasible value space; *Uncertainty Characterization*: model evaluation under alternative factor hypotheses to explore their implications for model output uncertainty; *Uncertainty Quantification*: representation of model output uncertainty using probability distributions; *Traditional statistical inference*: use of analysis results to describe deterministic or probabilistic outcomes resulting from the presence of uncertainty; *Narrative scenarios*: use of a limited decision-relevant number of scenarios to describe (sets of) changing system outcomes; *Exploratory modeling for scenario discovery*: use of large ensembles of uncertain conditions to discover decision-relevant combinations of uncertain factors. Categories and colors refer to the same categories of uncertain factors listed in :numref:`Figure_2_1`.

* **Exploratory modeling**: Use of large ensembles of uncertain conditions to discover decision-relevant combinations of uncertain factors
* **Factor**: Any model component that can affect model outputs: inputs, resolution levels, coupling relationships, model relationships, and parameters. In models with acceptable fidelity, these factors may represent elements of the real-world system under study.
* **Sensitivity analysis**: Model evaluation to understand the factors and processes that most (or least) control a model’s outputs
* **Local sensitivity analysis**: Varying uncertain factors around specific reference values
* **Global sensitivity analysis**: Varying uncertain factors throughout their entire feasible value space (contrasted with the local approach in the sketch following this list)
* **Uncertainty characterization**: Model evaluation under alternative factor hypotheses to explore their implications for model output uncertainty
* **Uncertainty quantification**: Representation of model output uncertainty using probability distributions
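
As a concrete companion to the local versus global distinction above, the sketch below applies a one-at-a-time perturbation around a reference point and a variance-based global analysis to the same kind of hypothetical two-factor model. The model, reference values, and bounds are assumptions made only for illustration, and the global portion assumes the SALib package is available.

.. code-block:: python

    # Minimal local vs. global sensitivity analysis sketch for a hypothetical
    # two-factor model; factor names, reference values, and bounds are
    # illustrative assumptions. Requires numpy and SALib.
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    def toy_model(a, b):
        """Hypothetical response of a system to two uncertain factors."""
        return a ** 2 + np.sin(b)

    # Local sensitivity analysis: perturb one factor at a time around a
    # reference point and approximate partial derivatives by central differences.
    a_ref, b_ref, delta = 1.0, 1.5, 1e-3
    dy_da = (toy_model(a_ref + delta, b_ref) - toy_model(a_ref - delta, b_ref)) / (2 * delta)
    dy_db = (toy_model(a_ref, b_ref + delta) - toy_model(a_ref, b_ref - delta)) / (2 * delta)
    print("Local (one-at-a-time) sensitivities at the reference point:", dy_da, dy_db)

    # Global sensitivity analysis: vary both factors across their full assumed
    # ranges and attribute output variance to each factor with Sobol indices.
    problem = {
        "num_vars": 2,
        "names": ["a", "b"],
        "bounds": [[0.5, 1.5], [0.0, 3.0]],
    }
    X = saltelli.sample(problem, 1024)  # sampling design spanning the full space
    Y = np.array([toy_model(a, b) for a, b in X])
    Si = sobol.analyze(problem, Y)
    print("Global first-order Sobol indices:", dict(zip(problem["names"], Si["S1"])))

The local estimates describe model behavior only in the neighborhood of the chosen reference point, whereas the Sobol indices summarize how much of the output variance each factor explains across the entire assumed space, which is why the two approaches can rank factors differently for nonlinear or interacting models.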

At present, there is no singular guide for confronting the computational and conceptual challenges of the multi-model, transdisciplinary workflows that characterize ambitious projects such as IM3 :cite:p:`saltelli2015climate`. The primary aim of this text is to begin to address this gap and provide guidance for facing these challenges. :numref:`2_diagnostic_modeling` provides an overview of diagnostic modeling and the different perspectives for how we should evaluate our models, :numref:`3_sensitivity_analysis_the_basics` summarizes basic methods and concepts for sensitivity analysis, and :numref:`4_sensitivity_analysis` delves into more technical applications of sensitivity analysis to support diagnostic model evaluation and exploratory modeling. Finally, :numref:`5_conclusion` provides some concluding remarks across the UC and UQ topics covered in this text. The appendices of this text include a glossary of the key concepts, an overview of UQ methods, and coding-based illustrative examples of key UC concepts discussed in earlier chapters.
