Merge branch 'main' into sc/converters_coupling
SimonCan authored Nov 21, 2023
2 parents 8b6f7b3 + 3c1762b commit cf262a8
Showing 15 changed files with 783 additions and 73 deletions.
1 change: 1 addition & 0 deletions NEWS.md
@@ -7,6 +7,7 @@ for human readability.
## Changes when updating to v0.6 from v0.5.x

#### Added
- AMR for hyperbolic-parabolic equations on 2D `P4estMesh`

#### Changed

4 changes: 2 additions & 2 deletions Project.toml
@@ -66,7 +66,7 @@ Makie = "0.19"
MuladdMacro = "0.2.2"
Octavian = "0.3.5"
OffsetArrays = "1.3"
P4est = "0.4"
P4est = "0.4.9"
Polyester = "0.7.5"
PrecompileTools = "1.1"
Printf = "1"
@@ -84,7 +84,7 @@ StaticArrays = "1"
StrideArrays = "0.1.18"
StructArrays = "0.6"
SummationByPartsOperators = "0.5.41"
T8code = "0.4.1"
T8code = "0.4.3"
TimerOutputs = "0.5"
Triangulate = "2.0"
TriplotBase = "0.1"
67 changes: 48 additions & 19 deletions docs/src/parallelization.md
@@ -53,30 +53,43 @@ a system-provided MPI installation with Trixi.jl can be found in the following s

### [Using a system-provided MPI installation](@id parallel_system_MPI)

When using Trixi.jl with a system-provided MPI backend the underlying
[`p4est`](https://github.com/cburstedde/p4est) and [`t8code`](https://github.com/DLR-AMR/t8code)
libraries need to be compiled with the same MPI installation. Therefore, you also need to
use system-provided `p4est` and `t8code` installations (for notes on how to install `p4est`
and `t8code` see e.g. [here](https://github.com/cburstedde/p4est/blob/master/README) and
[here](https://github.com/DLR-AMR/t8code/wiki/Installation), use the configure option
`--enable-mpi`). Note that `t8code` already comes with a `p4est` installation, so it suffices
to install `t8code`. In addition, [P4est.jl](https://github.com/trixi-framework/P4est.jl) and
[T8code.jl](https://github.com/DLR-AMR/T8code.jl) need to be configured to use the custom
installations. Follow the steps described
[here](https://github.com/DLR-AMR/T8code.jl/blob/main/README.md#installation) and
[here](https://github.com/trixi-framework/P4est.jl/blob/main/README.md#installation) for the
configuration. The paths that point to `libp4est.so` (and potentially to `libsc.so`) need to be
the same for P4est.jl and T8code.jl. This could e.g. be `libp4est.so` that usually can be found
in `lib/` or `local/lib/` in the installation directory of `t8code`.
In total, in your active Julia project you should have a LocalPreferences.toml file with sections
`[MPIPreferences]`, `[T8code]` and `[P4est]` as well as an entry `MPIPreferences` in your
Project.toml to use a custom MPI installation. A `LocalPreferences.toml` file
When using Trixi.jl with a system-provided MPI backend, the underlying
[`p4est`](https://github.com/cburstedde/p4est), [`t8code`](https://github.com/DLR-AMR/t8code)
and [`HDF5`](https://github.com/HDFGroup/hdf5) libraries need to be compiled with the same MPI
installation. If you want to use `p4est` (via the `P4estMesh`) or `t8code` (via the `T8codeMesh`)
from Trixi.jl, you also need to use system-provided `p4est` or `t8code` installations
(for notes on how to install `p4est` and `t8code` see, e.g., [here](https://github.com/cburstedde/p4est/blob/master/README)
and [here](https://github.com/DLR-AMR/t8code/wiki/Installation), use the configure option
`--enable-mpi`). Otherwise, warnings will be issued that no preference is set for P4est.jl and
T8code.jl; these warnings can be ignored if you do not use these libraries from Trixi.jl. Note that
`t8code` already comes with a `p4est` installation, so it suffices to install `t8code`.
In order to use system-provided `p4est` and `t8code` installations, [P4est.jl](https://github.com/trixi-framework/P4est.jl)
and [T8code.jl](https://github.com/DLR-AMR/T8code.jl) need to be configured to use the custom
installations. Follow the steps described [here](https://github.com/DLR-AMR/T8code.jl/blob/main/README.md#installation) and
[here](https://github.com/trixi-framework/P4est.jl/blob/main/README.md#installation) for the configuration.
The paths that point to `libp4est.so` (and potentially to `libsc.so`) need to be the same for P4est.jl and
T8code.jl. This could, e.g., be the `libp4est.so` that can usually be found in `lib/` or `local/lib/` in the
installation directory of `t8code`. Note, however, that the `T8codeMesh` does not support MPI yet.
The preferences for [HDF5.jl](https://github.com/JuliaIO/HDF5.jl) always need to be set, even if you
do not want to use `HDF5` from Trixi.jl; see also https://github.com/JuliaIO/HDF5.jl/issues/1079.
To set the preferences for HDF5.jl, follow the instructions described
[here](https://trixi-framework.github.io/Trixi.jl/stable/parallelization/#Using-parallel-input-and-output).

In total, in your active Julia project you should have a `LocalPreferences.toml` file with sections
`[MPIPreferences]`, `[T8code]` (only needed if `T8codeMesh` is used), `[P4est]` (only needed if
`P4estMesh` is used), and `[HDF5]` as well as an entry `MPIPreferences` in your
`Project.toml` to use a custom MPI installation. A `LocalPreferences.toml` file
created as described above might look something like the following:
```toml
[HDF5]
libhdf5 = "/usr/lib/x86_64-linux-gnu/hdf5/openmpi/libhdf5.so"
libhdf5_hl = "/usr/lib/x86_64-linux-gnu/hdf5/openmpi/libhdf5_hl.so"

[HDF5_jll]
libhdf5_hl_path = "/usr/lib/x86_64-linux-gnu/hdf5/openmpi/libhdf5_hl.so"
libhdf5_path = "/usr/lib/x86_64-linux-gnu/hdf5/openmpi/libhdf5.so"

[MPIPreferences]
__clear__ = ["preloads_env_switch"]
_format = "1.0"
@@ -97,6 +110,22 @@ libsc = "/home/mschlott/hackathon/libtrixi/t8code/install/lib/libsc.so"
libt8 = "/home/mschlott/hackathon/libtrixi/t8code/install/lib/libt8.so"
```

This file is created with the following sequence of commands:
```julia
julia> using MPIPreferences
julia> MPIPreferences.use_system_binary()
```
Restart the Julia REPL, then run:
```julia
julia> using P4est
julia> P4est.set_library_p4est!("/home/mschlott/hackathon/libtrixi/t8code/install/lib/libp4est.so")
julia> P4est.set_library_sc!("/home/mschlott/hackathon/libtrixi/t8code/install/lib/libsc.so")
julia> using T8code
julia> T8code.set_libraries_path!("/home/mschlott/hackathon/libtrixi/t8code/install/lib/")
julia> using HDF5
julia> HDF5.API.set_libraries!("/usr/lib/x86_64-linux-gnu/hdf5/openmpi/libhdf5.so", "/usr/lib/x86_64-linux-gnu/hdf5/openmpi/libhdf5_hl.so")
```
After the preferences are set, restart the Julia REPL again.
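
To verify that the custom binaries are actually picked up (optional, and assuming MPI.jl and HDF5.jl are
installed in the active project), queries along the following lines can be used:
```julia
julia> using MPIPreferences; MPIPreferences.binary  # should be "system"
julia> using MPI; MPI.versioninfo()                 # should report your system MPI library
julia> using HDF5; HDF5.has_parallel()              # should be `true` for an MPI-enabled HDF5
```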

### [Usage](@id parallel_usage)

@@ -218,7 +247,7 @@ julia> HDF5.API.set_libraries!("/path/to/your/libhdf5.so", "/path/to/your/libhdf
```
For more information see also the
[documentation of HDF5.jl](https://juliaio.github.io/HDF5.jl/stable/mpi/). In total, you should
have a file called LocalPreferences.toml in the project directory that contains a section
have a file called `LocalPreferences.toml` in the project directory that contains a section
`[MPIPreferences]`, a section `[HDF5]` with entries `libhdf5` and `libhdf5_hl`, a section `[P4est]`
with the entry `libp4est` as well as a section `[T8code]` with the entries `libt8`, `libp4est`
and `libsc`.
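
For orientation, a minimal skeleton of such a `LocalPreferences.toml` might look as follows. The paths are
placeholders and depend on your installation, and the exact entries under `[MPIPreferences]` are written
automatically by `MPIPreferences.use_system_binary()`:
```toml
[MPIPreferences]
binary = "system"
libmpi = "/path/to/your/libmpi.so"

[HDF5]
libhdf5 = "/path/to/your/libhdf5.so"
libhdf5_hl = "/path/to/your/libhdf5_hl.so"

[P4est]
libp4est = "/path/to/your/libp4est.so"
libsc = "/path/to/your/libsc.so"

[T8code]
libp4est = "/path/to/your/libp4est.so"
libsc = "/path/to/your/libsc.so"
libt8 = "/path/to/your/libt8.so"
```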
@@ -0,0 +1,98 @@
using OrdinaryDiffEq
using Trixi

###############################################################################
# semidiscretization of the linear advection-diffusion equation

diffusivity() = 5.0e-2
advection_velocity = (1.0, 0.0)
equations = LinearScalarAdvectionEquation2D(advection_velocity)
equations_parabolic = LaplaceDiffusion2D(diffusivity(), equations)

# Create DG solver with polynomial degree = 3 and (local) Lax-Friedrichs/Rusanov flux as surface flux
solver = DGSEM(polydeg = 3, surface_flux = flux_lax_friedrichs)

coordinates_min = (-1.0, -0.5) # minimum coordinates (min(x), min(y))
coordinates_max = (0.0, 0.5) # maximum coordinates (max(x), max(y))

trees_per_dimension = (4, 4)
mesh = P4estMesh(trees_per_dimension,
                 polydeg = 3, initial_refinement_level = 2,
                 coordinates_min = coordinates_min, coordinates_max = coordinates_max,
                 periodicity = false)

# Example setup taken from
# - Truman Ellis, Jesse Chan, and Leszek Demkowicz (2016).
# Robust DPG methods for transient convection-diffusion.
# In: Building bridges: connections and challenges in modern approaches
# to numerical partial differential equations.
# [DOI](https://doi.org/10.1007/978-3-319-41640-3_6).
function initial_condition_eriksson_johnson(x, t, equations)
    l = 4
    epsilon = diffusivity() # TODO: this requires epsilon < .6 due to sqrt
    lambda_1 = (-1 + sqrt(1 - 4 * epsilon * l)) / (-2 * epsilon)
    lambda_2 = (-1 - sqrt(1 - 4 * epsilon * l)) / (-2 * epsilon)
    r1 = (1 + sqrt(1 + 4 * pi^2 * epsilon^2)) / (2 * epsilon)
    s1 = (1 - sqrt(1 + 4 * pi^2 * epsilon^2)) / (2 * epsilon)
    u = exp(-l * t) * (exp(lambda_1 * x[1]) - exp(lambda_2 * x[1])) +
        cos(pi * x[2]) * (exp(s1 * x[1]) - exp(r1 * x[1])) / (exp(-s1) - exp(-r1))
    return SVector{1}(u)
end
initial_condition = initial_condition_eriksson_johnson

boundary_conditions = Dict(:x_neg => BoundaryConditionDirichlet(initial_condition),
                           :y_neg => BoundaryConditionDirichlet(initial_condition),
                           :y_pos => BoundaryConditionDirichlet(initial_condition),
                           :x_pos => boundary_condition_do_nothing)

boundary_conditions_parabolic = Dict(:x_neg => BoundaryConditionDirichlet(initial_condition),
                                     :x_pos => BoundaryConditionDirichlet(initial_condition),
                                     :y_neg => BoundaryConditionDirichlet(initial_condition),
                                     :y_pos => BoundaryConditionDirichlet(initial_condition))

# A semidiscretization collects data structures and functions for the spatial discretization
semi = SemidiscretizationHyperbolicParabolic(mesh,
                                             (equations, equations_parabolic),
                                             initial_condition, solver;
                                             boundary_conditions = (boundary_conditions,
                                                                    boundary_conditions_parabolic))

###############################################################################
# ODE solvers, callbacks etc.

# Create ODE problem with time span `tspan`
tspan = (0.0, 0.5)
ode = semidiscretize(semi, tspan)

# At the beginning of the main loop, the SummaryCallback prints a summary of the simulation setup
# and resets the timers
summary_callback = SummaryCallback()

# The AnalysisCallback allows to analyse the solution in regular intervals and prints the results
analysis_interval = 1000
analysis_callback = AnalysisCallback(semi, interval = analysis_interval)

# The AliveCallback prints short status information in regular intervals
alive_callback = AliveCallback(analysis_interval = analysis_interval)

amr_controller = ControllerThreeLevel(semi, IndicatorMax(semi, variable = first),
                                      base_level = 1,
                                      med_level = 2, med_threshold = 0.9,
                                      max_level = 3, max_threshold = 1.0)

amr_callback = AMRCallback(semi, amr_controller,
                           interval = 50)

# Create a CallbackSet to collect all callbacks such that they can be passed to the ODE solver
callbacks = CallbackSet(summary_callback, analysis_callback, alive_callback, amr_callback)

###############################################################################
# run the simulation

# OrdinaryDiffEq's `solve` method evolves the solution in time and executes the passed callbacks
time_int_tol = 1.0e-11
sol = solve(ode, dt = 1e-7, RDPK3SpFSAL49(); abstol = time_int_tol, reltol = time_int_tol,
            ode_default_options()..., callback = callbacks)

# Print the timer summary
summary_callback()
83 changes: 83 additions & 0 deletions examples/p4est_2d_dgsem/elixir_advection_diffusion_periodic_amr.jl
@@ -0,0 +1,83 @@
using OrdinaryDiffEq
using Trixi

###############################################################################
# semidiscretization of the linear advection-diffusion equation

advection_velocity = (1.5, 1.0)
equations = LinearScalarAdvectionEquation2D(advection_velocity)
diffusivity() = 5.0e-2
equations_parabolic = LaplaceDiffusion2D(diffusivity(), equations)

# Create DG solver with polynomial degree = 3 and (local) Lax-Friedrichs/Rusanov flux as surface flux
solver = DGSEM(polydeg = 3, surface_flux = flux_lax_friedrichs)

coordinates_min = (-1.0, -1.0) # minimum coordinates (min(x), min(y))
coordinates_max = (1.0, 1.0) # maximum coordinates (max(x), max(y))

trees_per_dimension = (4, 4)
mesh = P4estMesh(trees_per_dimension,
                 polydeg = 3, initial_refinement_level = 1,
                 coordinates_min = coordinates_min, coordinates_max = coordinates_max)

# Define initial condition
function initial_condition_diffusive_convergence_test(x, t,
                                                      equation::LinearScalarAdvectionEquation2D)
    # Store translated coordinate for easy use of exact solution
    x_trans = x - equation.advection_velocity * t

    nu = diffusivity()
    c = 1.0
    A = 0.5
    L = 2
    f = 1 / L
    omega = 2 * pi * f
    scalar = c + A * sin(omega * sum(x_trans)) * exp(-2 * nu * omega^2 * t)
    return SVector(scalar)
end
initial_condition = initial_condition_diffusive_convergence_test

# A semidiscretization collects data structures and functions for the spatial discretization
semi = SemidiscretizationHyperbolicParabolic(mesh,
                                             (equations, equations_parabolic),
                                             initial_condition, solver)

###############################################################################
# ODE solvers, callbacks etc.

# Create ODE problem with time span `tspan`
tspan = (0.0, 0.5)
ode = semidiscretize(semi, tspan);

# At the beginning of the main loop, the SummaryCallback prints a summary of the simulation setup
# and resets the timers
summary_callback = SummaryCallback()

# The AnalysisCallback allows to analyse the solution in regular intervals and prints the results
analysis_interval = 100
analysis_callback = AnalysisCallback(semi, interval = analysis_interval)

# The AliveCallback prints short status information in regular intervals
alive_callback = AliveCallback(analysis_interval = analysis_interval)

amr_controller = ControllerThreeLevel(semi, IndicatorMax(semi, variable = first),
                                      base_level = 1,
                                      med_level = 2, med_threshold = 1.25,
                                      max_level = 3, max_threshold = 1.45)

amr_callback = AMRCallback(semi, amr_controller,
                           interval = 20)

# Create a CallbackSet to collect all callbacks such that they can be passed to the ODE solver
callbacks = CallbackSet(summary_callback, analysis_callback, alive_callback, amr_callback)

###############################################################################
# run the simulation

# OrdinaryDiffEq's `solve` method evolves the solution in time and executes the passed callbacks
time_int_tol = 1.0e-11
sol = solve(ode, RDPK3SpFSAL49(); abstol = time_int_tol, reltol = time_int_tol,
            ode_default_options()..., callback = callbacks)

# Print the timer summary
summary_callback()
@@ -41,7 +41,6 @@ heat_bc = Adiabatic((x, t, equations) -> 0.0)
boundary_condition_lid = BoundaryConditionNavierStokesWall(velocity_bc_lid, heat_bc)
boundary_condition_cavity = BoundaryConditionNavierStokesWall(velocity_bc_cavity, heat_bc)

# define periodic boundary conditions everywhere
boundary_conditions = Dict(:x_neg => boundary_condition_slip_wall,
                           :y_neg => boundary_condition_slip_wall,
                           :y_pos => boundary_condition_slip_wall,