Commit 6521858: Finalizing figures
castelletto1 committed May 16, 2024
1 parent 9efe862
Showing 5 changed files with 164 additions and 8 deletions.
Binary file modified src/docs/JOSS/GEOS_Frontier_compositional.pdf
Binary file not shown.
Binary file modified src/docs/JOSS/GEOS_Frontier_mechanics.pdf
Binary file not shown.
Binary file added src/docs/JOSS/GEOS_Frontier_scaling.pdf
Binary file not shown.
160 changes: 160 additions & 0 deletions src/docs/JOSS/GEOS_Frontier_scaling.svg
12 changes: 4 additions & 8 deletions src/docs/JOSS/paper.md
@@ -143,18 +143,14 @@ Simulations such as these play a critical role in predicting the performance of
![Real world CO2 storage site: (a) discrete mesh, transparency is used for the overburden region to reveal the complex faulted structure of the storage reservoir; (b) results of a compositional flow simulation after 25 years of CO2 injection. The CO2 plume is shown in white near the bottom of the well. Colors in the reservoir layer indicate changes in fluid pressure, and the colors in the overburden indicate vertical displacement resulting from the injection. Note that color scales have been removed intentionally.\label{RW_results}](RW_final.pdf){ width=100% }


- As an example of the weak scalability of GEOS on exascale systems, we present two weak scaling studies on a simple wellbore geometry using the exascale Frontier supercomputer located at Oak Ridge National Laboratory.
- The results from the weak scaling study (Figure \ref{fig:Frontier_Mechanics}) shows flat scaling of the GEOS processes (assembly/field synchronization) up to 16,384 MPI ranks and 81.3e9 degrees-of-freedom (1/4 of Frontier).
+ As an example of the weak scalability of GEOS on exascale systems, we present two weak scaling studies on a simple wellbore geometry using the exascale Frontier supercomputer located at Oak Ridge National Laboratory (ORNL).
+ The results from the weak scaling study (Figure \ref{fig:Frontier_scaling}a) show flat scaling of the GEOS processes (assembly/field synchronization) up to 16,384 MPI ranks and 81.3e9 degrees of freedom (1/4 of Frontier).
There is a moderate decrease in efficiency with the application of the hypre preconditioner setup and solve, but given the complexity of those algorithms this level of scaling efficiency is expected.
- The compositional flow study presented in Figure \ref{fig:Frontier_CMPF} shows similarly good weak scaling.
+ The compositional flow study presented in Figure \ref{fig:Frontier_scaling}b shows similarly good weak scaling.

- ![Weak scaling results for mechanics on ORNL/Frontier.\label{fig:Frontier_Mechanics}](GEOS_Frontier_mechanics.pdf){ width=50% }
-
- ![Weak scaling results for compositional flow on ORNL/Frontier.\label{fig:Frontier_CMPF}](GEOS_Frontier_compositional.pdf){ width=40% }
+ ![Weak scaling results on ORNL/Frontier: execution time per timestep versus the number of MPI ranks for a mechanics (a) and a compositional flow (b) simulation, respectively.\label{fig:Frontier_scaling}](GEOS_Frontier_scaling.pdf){ width=100% }
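For readers unfamiliar with the metric behind these plots: in a weak scaling study the problem size per rank is held fixed while ranks are added, so ideal behavior is a flat time per timestep, and efficiency at N ranks is the smallest run's time divided by the N-rank time. A minimal sketch of that bookkeeping, using made-up timings rather than the Frontier data:

```python
# Weak-scaling efficiency: E_N = t_1 / t_N, where t_1 is the time per
# timestep of the smallest run and t_N that of the N-rank run.
# The timings below are illustrative placeholders, not GEOS measurements.

def weak_scaling_efficiency(times_per_step):
    """Return efficiency of each run relative to the first (smallest) run."""
    t1 = times_per_step[0]
    return [t1 / t for t in times_per_step]

# Ranks double at each step; time per timestep stays nearly constant,
# which is the signature of good weak scaling.
times = [10.0, 10.2, 10.5, 11.0]
eff = weak_scaling_efficiency(times)
```

With these placeholder numbers the efficiency stays above 90% as ranks grow, which is the kind of behavior the flat curves in the figure indicate.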

# Acknowledgements


This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

This research was supported by the Exascale Computing Project (ECP), Project Number: 17-SC-20-SC, a collaborative effort of two DOE organizations - the Office of Science and the National Nuclear Security Administration, responsible for the planning and preparation of a capable exascale ecosystem, including software, applications, hardware, advanced system engineering and early testbed platforms, to support the nation's exascale computing imperative.
