change some file names, and add more discussion. still around 1070 words including acknowledgements
rrsettgast committed May 16, 2024
1 parent a7e0518 commit 2d3ef6f
Showing 6 changed files with 36 additions and 19 deletions.
File renamed without changes
File renamed without changes.
Binary file removed src/docs/JOSS/RW_results.png
Binary file removed src/docs/JOSS/RW_results2.png
23 changes: 21 additions & 2 deletions src/docs/JOSS/paper.bib
@@ -57,9 +57,28 @@ @misc{hypre



@Misc{ petsc-web-page,
author = {Satish Balay and Shrirang Abhyankar and Mark~F. Adams and Steven Benson and Jed
Brown and Peter Brune and Kris Buschelman and Emil~M. Constantinescu and Lisandro
Dalcin and Alp Dener and Victor Eijkhout and Jacob Faibussowitsch and William~D.
Gropp and V\'{a}clav Hapla and Tobin Isaac and Pierre Jolivet and Dmitry Karpeev
and Dinesh Kaushik and Matthew~G. Knepley and Fande Kong and Scott Kruger and
Dave~A. May and Lois Curfman McInnes and Richard Tran Mills and Lawrence Mitchell
and Todd Munson and Jose~E. Roman and Karl Rupp and Patrick Sanan and Jason Sarich
and Barry~F. Smith and Stefano Zampini and Hong Zhang and Hong Zhang and Junchao
Zhang},
title = {{PETS}c {W}eb page},
url = {https://petsc.org/},
howpublished = {\url{https://petsc.org/}},
year = {2024}
}



@Manual{trilinos-website,
title = {The {T}rilinos {P}roject {W}ebsite},
author = {The {T}rilinos {P}roject {T}eam},
year = {2020 (accessed May 22, 2020)},
url = {https://trilinos.github.io}
}
@article{BUI:2020,
author = {Bui, Quan M. and Osei-Kuffuor, Daniel and Castelletto, Nicola and White, Joshua A.},
32 changes: 15 additions & 17 deletions src/docs/JOSS/paper.md
@@ -93,61 +93,59 @@ bibliography: paper.bib
# Summary

GEOS is a simulation framework focused on implementing solution methods for tightly-coupled multi-physics problems, with an initial emphasis on subsurface reservoir applications.
Specifically, GEOS provides implementations for studying carbon sequestration, geothermal energy, hydrogen storage, and similar problems.
The unique aspect of GEOS that differentiates it from existing reservoir simulators is the ability to provide tightly-coupled compositional flow, poromechanics, faults and fractures, and thermal effects.
Currently GEOS provides implementations for studying carbon sequestration, geothermal energy, hydrogen storage, and similar subsurface applications.
The unique aspect of GEOS that differentiates it from existing reservoir simulators is the ability to provide tightly-coupled compositional flow, poromechanics, fault and fracture slip, and thermal effects.
Extensive documentation for GEOS is available at https://geosx-geosx.readthedocs-hosted.com/en/latest.
Note that the version of GEOS described here should be considered a separate work from the previous version of GEOS referred to in [@Settgast:2017].

# Statement of need

The increasing threat of climate change has resulted in an increased focus on mitigating carbon emissions into the atmosphere.
Carbon Capture and Storage (CCS) of CO2 in subsurface reservoirs and saline aquifers is one of the most important technologies required to meet global climate goals.
Carbon Capture and Storage (CCS) of CO2 in subsurface reservoirs and saline aquifers is an important technology required to meet global climate goals.
Given the 2050 net-zero GHG goals, the CO2 storage capacity required to offset emissions is orders of magnitude greater than current levels (reference needed).
One factor in the evaluation of CO2 storage sites is the containment risk associated with the injection of liquefied CO2 into the subsurface.
The primary goal of GEOS is to provide the global community with an open-source tool that is capable of simulating the complex coupled physics that occurs when liquefied CO2 is injected into a subsurface reservoir.
Thus, GEOS is freely available and focused on the simulation of reservoir integrity through various failure mechanisms such as caprock failure, fault leakage, and wellbore failure.

# C++ Infrastructure Components
# GEOS Components

The core C++17 infrastructure provides common computer science capabilities typically required for solving differential equations using a spatially discrete method.
The components of the infrastructure provided by GEOS include a data hierarchy, a discrete mesh data structure, a mesh-based MPI communications interface, degree-of-freedom management, IO services, and a physics package interface.

The GEOS data repository forms an object hierarchy analogous with a classical folder/file hierarchy, where the "folder" is a `Group` object and the "file" is a `Wrapper` that is a container for any arbitrary object (e.g. scalar, array, class, etc.).
The mesh data structure is built on top of the data repository as a collection of "object managers" for each mesh object type (e.g. node, edge, face, element).
The management of distributed-memory parallelism is done through MPI, and typically requires minimal consideration from the physics package developer.
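As a minimal sketch of the folder/file analogy (the class and member names below are hypothetical simplifications for illustration, not the actual GEOS API), a `Group` owns sub-groups and `Wrapper` objects that hold arbitrary data:

```cpp
// Minimal sketch of the folder/file analogy used by the data repository.
// The names here are hypothetical simplifications, not the actual GEOS API.
#include <any>
#include <iostream>
#include <map>
#include <memory>
#include <string>
#include <vector>

struct Wrapper                     // the "file": a container for any object
{
  std::any data;
};

struct Group                       // the "folder": holds sub-groups and wrappers
{
  std::map< std::string, std::unique_ptr< Group > > subGroups;
  std::map< std::string, Wrapper > wrappers;

  Group & registerGroup( std::string const & name )
  {
    auto & child = subGroups[ name ];
    if( !child ) child = std::make_unique< Group >();
    return *child;
  }

  template< typename T >
  void registerWrapper( std::string const & name, T value )
  {
    wrappers[ name ].data = std::move( value );
  }

  template< typename T >
  T & getReference( std::string const & name )
  {
    return std::any_cast< T & >( wrappers.at( name ).data );
  }
};

int main()
{
  Group root;
  Group & nodeManager = root.registerGroup( "nodeManager" );   // a "folder"
  nodeManager.registerWrapper( "referencePosition",            // a "file"
                               std::vector< double >{ 0.0, 1.0, 2.0 } );
  std::cout << nodeManager.getReference< std::vector< double > >( "referencePosition" ).size()
            << std::endl;
  return 0;
}
```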

GEOS is intended to be a generic multi-physics simulation platform.
The physics package interface in GEOS is intended to encapsulate the application of a numerical method to the solution of a collection of governing equations.
When implementing a package for a set of coupled physics equations, each individual physics package is first developed as a stand-alone capability.
Then the single physics capabilities are utilized together in a coupled physics package.
When implementing a physics package for a set of coupled physics equations, each individual physics package is first developed as a stand-alone capability.
The single physics capabilities are then applied together in a coupled physics package.
The strategy for coupled physics can be described in terms of a monolithic linear system with an underlying block structure, where each block row corresponds to the set of constraint equations of a single physics package and each block column corresponds to its degrees-of-freedom.
Using this representation, the diagonal blocks of the matrix contain contributions for each single physics package to its own boundary value problem, while the off-diagonal blocks represent the coupling between physics packages.
The coupled physics package is often responsible for providing the specific contributions of the off-diagonal/coupling blocks.
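As a schematic example (the symbols below are introduced purely for illustration), a two-package coupling between a flow package with pressure unknowns $p$ and a mechanics package with displacement unknowns $u$ leads to a Newton update of the form

$$
\begin{bmatrix}
A_{ff} & A_{fm} \\
A_{mf} & A_{mm}
\end{bmatrix}
\begin{bmatrix}
\delta p \\
\delta u
\end{bmatrix}
=
-\begin{bmatrix}
r_{f} \\
r_{m}
\end{bmatrix},
$$

where the diagonal blocks $A_{ff}$ and $A_{mm}$ are assembled by the individual packages for their own boundary value problems, and the off-diagonal blocks $A_{fm}$ and $A_{mf}$ are provided by the coupled physics package.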

To solve these linear systems, GEOS maintains a generic linear algebra interface (LAI) capable of wrapping various linear algebra packages.
However, the primary linear algebra package used for the great majority of GEOS simulations is LLNL's hypre[@hypre].
To solve these linear systems, GEOS maintains a generic linear algebra interface (LAI) capable of wrapping various linear algebra packages such as hypre [@hypre], PETSc [@petsc-web-page], and Trilinos [@trilinos-website].
Currently only the hypre interface is actively maintained.
For multi-physics problems involving the solution of a coupled linear system, GEOS relies exclusively on hypre's implementation of a multigrid reduction (MGR) preconditioning strategy as presented in [@BUI:2020; @BUI:2021114111].
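A minimal sketch of what such an interface layer can look like is given below; the class names are hypothetical and are not the actual GEOS LAI, but they illustrate how physics packages can assemble into abstract matrix/vector types while the backend (e.g. hypre) is selected behind the interface:

```cpp
// Illustrative sketch of a generic linear algebra interface (LAI).
// The names are hypothetical simplifications, not the actual GEOS LAI.
class MatrixBase
{
public:
  virtual ~MatrixBase() = default;
  virtual void add( long row, long col, double value ) = 0;  // add an entry during assembly
  virtual void close() = 0;                                  // finalize parallel assembly
};

class VectorBase
{
public:
  virtual ~VectorBase() = default;
  virtual void set( long row, double value ) = 0;
};

class LinearSolverBase
{
public:
  virtual ~LinearSolverBase() = default;
  // Solve A x = b using the backend's solvers/preconditioners (e.g. hypre MGR).
  virtual void solve( MatrixBase const & A, VectorBase & x, VectorBase const & b ) = 0;
};

// A concrete backend (hypre, PETSc, or Trilinos) derives from these interfaces,
// so physics packages assemble and solve without depending on backend types.
```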

The performance portability strategy utilized by GEOS applies LLNL's suite of portability tools: RAJA [@Beckingsale:2019], CHAI [@CHAI:2023], and Umpire [@Beckingsale:2020].
The RAJA performance portability layer provides portable kernel launching and wrappers for reductions, atomics, and local/shared memory to achieve performance on both CPU and GPU hardware.
The combination of CHAI/Umpire provides memory motion management for platforms with heterogeneous memory spaces (i.e. host memory and device memory).
Through this strategy GEOS has been successfully run on platforms ranging from GPU-based Exa-scale systems to CPU-based laptops with minimal loss of performance due to platform changes.
Through this strategy GEOS has been successfully run on platforms ranging from GPU-based exascale systems to CPU-based laptops with near-optimal performance.
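As a hedged illustration of the kernel-launch abstraction (the loop body, array names, and policy choice below are invented for this example and are not taken from GEOS), a RAJA kernel can be written once and retargeted by changing only the execution policy:

```cpp
// Illustrative RAJA kernel launch: the execution policy selects the target
// hardware, while the kernel body is written once. Example only; not GEOS code.
#include <RAJA/RAJA.hpp>
#include <vector>

int main()
{
  constexpr int N = 1000;
  std::vector< double > x( N, 1.0 ), y( N, 2.0 );
  double * const xd = x.data();
  double * const yd = y.data();

  // On a GPU build, RAJA::seq_exec could be replaced by a device policy such
  // as RAJA::cuda_exec< 256 >, with CHAI/Umpire handling host/device data motion.
  RAJA::forall< RAJA::seq_exec >( RAJA::RangeSegment( 0, N ), [=]( int i )
  {
    yd[ i ] += 2.0 * xd[ i ];   // simple axpy-style update
  } );

  return 0;
}
```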

In addition to its C++ core, the GEOS team provides a Python3 interface that allows for the integration of the simulation capabilities into complex Python workflows involving components unrelated to GEOS.
The Python3 interface provides data exchange between GEOS simulations and the Python driver, and allows the Python layer to call specific GEOS packages outside of the standard GEOS C++ workflow.


# Applications
To date GEOS has been used to simulate problems relevant to CO2 storage, enhanced geothermal systems, hydrogen storage, and both conventional and unconventional oil and gas extraction.
Often these simulations involve coupling between compositional multiphase flow and transport, poroelasticity, thermal transport, and interactions with faults and fractures.

As an example of a field case where GEOS has been applied, we present a simulation of CO2 storage at a large real-world storage site.
Figure \ref{RW_mesh} illustrates the computational mesh and relevant problem size and physical dimensions.
Figure \ref{RW_mesh} illustrates the computational mesh and relevant problem size and physical dimensions. Results of the simulation after 25 years of CO2 injection are shown in Figure \ref{RW_results}.

![Discrete mesh of a real world CO2 storage site. Transparency is used for the overburden region to reveal the complex faulted structure of the storage reservoir.\label{RW_mesh}](RW_mesh.png){ width=80% }

![Results of a compositional flow simulation of a real world CO2 storage site.\label{RW_results}](RW_results2.png){ width=80% }

![Results of a compositional flow simulation of a real world CO2 storage site after 25 years of CO2 injection.\label{RW_results}](RW_results.pdf){ width=80% }

In this large real-world model, after 25 years of injection the CO2 plume is at the center of the model. Below the plume, colors indicate pressure changes; above the plume, colors indicate the amount of vertical displacement caused by the injection. Color scales have been intentionally omitted.

As an example of the weak scalability of GEOS on exascale systems, we present two weak scaling studies on a simple wellbore geometry using the exascale Frontier supercomputer located at Oak Ridge National Laboratory.
The results of the weak scaling study (Figure \ref{fig:Frontier_Mechanics}) show flat scaling of the GEOS processes (assembly/field synchronization) up to 16,384 MPI ranks and 81.3e9 degrees-of-freedom (1/4 of Frontier).
