trixi2vtk fails for TreeMesh results when running with MPI #1861

Closed
benegee opened this issue Mar 5, 2024 · 8 comments · Fixed by #1862
Labels: bug (Something isn't working), parallelization (Related to MPI, threading, tasks etc.), visualization


benegee (Contributor) commented Mar 5, 2024

I ran a Trixi.jl simulation using a TreeMesh with the AMR and SaveSolution callbacks. When I try to convert the output files with trixi2vtk, I get:

ERROR: KeyError: key "capacity" not found
Stacktrace:
  [1] getindex(x::HDF5.Attributes, name::String)
    @ HDF5 ~/.julia/packages/HDF5/Ws1wH/src/attributes.jl:374
  [2] #323
    @ ~/.julia/packages/Trixi/16b4d/src/meshes/mesh_io.jl:254 [inlined]
  [3] (::HDF5.var"#17#18"{HDF5.HDF5Context, @Kwargs{}, Trixi.var"#323#340", HDF5.File})()
    @ HDF5 ~/.julia/packages/HDF5/Ws1wH/src/file.jl:101
  [4] task_local_storage(body::HDF5.var"#17#18"{HDF5.HDF5Context, @Kwargs{}, Trixi.var"#323#340", HDF5.File}, key::Symbol, val::HDF5.HDF5Context)
    @ Base ./task.jl:297
  [5] #h5open#16
    @ ~/.julia/packages/HDF5/Ws1wH/src/file.jl:96 [inlined]
  [6] h5open
    @ ~/.julia/packages/HDF5/Ws1wH/src/file.jl:94 [inlined]
  [7] load_mesh_serial(mesh_file::String; n_cells_max::Int64, RealT::Type)
    @ Trixi ~/.julia/packages/Trixi/16b4d/src/meshes/mesh_io.jl:253
  [8] load_mesh_serial
    @ ~/.julia/packages/Trixi/16b4d/src/meshes/mesh_io.jl:246 [inlined]
  [9] macro expansion
    @ ~/.julia/packages/TimerOutputs/RsWnF/src/TimerOutput.jl:237 [inlined]
 [10] trixi2vtk(filename::String; format::Symbol, verbose::Bool, hide_progress::Bool, pvd::Nothing, output_directory::String, nvisnodes::Nothing, save_celldata::Bool, reinterpolate::Bool, data_is_uniform::Bool)
    @ Trixi2Vtk ~/.julia/packages/Trixi2Vtk/rtzWr/src/convert.jl:116
 [11] trixi2vtk(filename::String)
    @ Trixi2Vtk ~/.julia/packages/Trixi2Vtk/rtzWr/src/convert.jl:39
 [12] top-level scope
    @ REPL[2]:1

So, should the h5 files have the "capacity" attribute, or should Trixi not look for it?

It works for non-AMR simulations!
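
For reference, the attribute names actually stored in a mesh file can be listed directly with HDF5.jl. A minimal sketch (the file name out/mesh.h5 is only an example, adjust to the actual output):

using HDF5
h5open("out/mesh.h5", "r") do file
    # List all attributes at the file root to see whether "capacity" is among them
    @show keys(attributes(file))
    @show haskey(attributes(file), "capacity")
end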

ranocha (Member) commented Mar 6, 2024

Can you please post the full stacktrace?

benegee (Contributor, Author) commented Mar 6, 2024

Sure! Just updated it in the original post.

ranocha (Member) commented Mar 6, 2024

Did you start from a clean state, or could there be some leftover mesh/solution files?

benegee (Contributor, Author) commented Mar 6, 2024

I started the simulations from scratch and wrote to a fresh output directory each time.

ranocha added the bug label Mar 6, 2024
ranocha (Member) commented Mar 6, 2024

Please provide the full set of information when describing a bug like this. In this specific case, I guess you ran the simulation with MPI? Does the same problem occur without AMR but with MPI?

benegee (Contributor, Author) commented Mar 6, 2024

It took a while to double-check: the problem is related to MPI, not to AMR!

I did the following on rocinante (Ubuntu 22.04) using Julia 1.10.0, starting from a fresh directory:

JULIA_DEPOT_PATH=./julia-depot julia --project=.
julia> using Pkg
julia> Pkg.add(["Trixi", "Trixi2Vtk", "OrdinaryDiffEq", "MPI"])
julia> using MPI
julia> mpiexec() do cmd
         run(`$cmd -n 2 $(Base.julia_cmd()) --threads=1 --project=@. -e 'using Trixi; trixi_include(default_example())'`)
       end
julia> using Trixi2Vtk
julia> trixi2vtk("out/solution_*.h5")

benegee changed the title from "trixi2vtk fails for AMR result on TreeMesh" to "trixi2vtk fails for TreeMesh results when running with MPI" Mar 6, 2024
benegee (Contributor, Author) commented Mar 6, 2024

Maybe the change from #1748 just needs to be applied to

function save_mesh_file(mesh::TreeMesh, output_directory, timestep, mpi_parallel::True)

as well?
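
If that is the cause, the fix would presumably mirror the serial method and also write the missing attribute in the MPI-parallel variant. A rough sketch of the idea, not the actual Trixi.jl source (the field mesh.tree.capacity, the call to mpi_isroot, and the simplified filename handling are assumptions):

# Hypothetical sketch inside Trixi.jl, where TreeMesh, True, mpi_isroot and HDF5 are available
function save_mesh_file(mesh::TreeMesh, output_directory, timestep, mpi_parallel::True)
    # Simplified: the real method also encodes the timestep in the filename
    filename = joinpath(output_directory, "mesh.h5")
    # Only the root rank writes the (replicated) tree to disk
    if mpi_isroot()
        h5open(filename, "w") do file
            # ... existing attributes and datasets ...
            # Additionally store the tree capacity, which load_mesh_serial expects to read back
            attributes(file)["capacity"] = mesh.tree.capacity
        end
    end
    return filename
end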

ranocha (Member) commented Mar 6, 2024

Yes, that looks like it. Would you like to look into it and prepare a PR?

ranocha transferred this issue from trixi-framework/Trixi2Vtk.jl Mar 6, 2024
ranocha added the visualization and parallelization labels Mar 6, 2024