Benedict
Yes, this works! Also not too many lines are affected: https://github.com/trixi-framework/Trixi.jl/commit/2393f83be3cf6672375b9b8781ad9c1295c90641
@vchuravy miraculously just created https://github.com/JuliaParallel/MPI.jl/pull/871 This could be used to fix the follow-up issues that appeared in #2054.
### Intermediate status

I split this into smaller issues and PRs:
- #2075 is an all-new error, which we encountered during our tests when switching to aarch64. So far...
I am still unhappy with this fix. While `new_p4est` allows just allocating the user data, `load_p4est` requires providing the data as well. So in the end I am only saving...
The solution is to use `p4est_reset_data`! https://github.com/cburstedde/p4est/issues/308
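A minimal C sketch of that approach: after loading a forest without user data, `p4est_reset_data` reallocates the per-quadrant storage to the requested size and calls the init callback on every quadrant, so the data does not have to be supplied at load time. Here `my_data_t`, `my_init_fn`, and `attach_user_data` are hypothetical names for illustration; only `p4est_reset_data` and its signature come from the p4est API itself.

```c
#include <p4est.h>

/* Hypothetical per-quadrant payload. */
typedef struct
{
  int flag;
} my_data_t;

/* Init callback: p4est calls this for every quadrant after it has
 * allocated data_size bytes at quadrant->p.user_data. */
static void
my_init_fn (p4est_t *p4est, p4est_topidx_t which_tree,
            p4est_quadrant_t *quadrant)
{
  my_data_t *d = (my_data_t *) quadrant->p.user_data;
  d->flag = 0;
}

/* Assumes loaded_p4est came from p4est_load (...) with data_size == 0.
 * p4est_reset_data then attaches and initializes the user data. */
void
attach_user_data (p4est_t *loaded_p4est)
{
  p4est_reset_data (loaded_p4est, sizeof (my_data_t), my_init_fn, NULL);
}
```

This keeps the load path free of user data entirely, which is what makes the fix in the linked commit small.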
> So with this, is it possible to restart a simulation and refine the initial condition?

Just checked with this example here and it seems to work.
Thanks for your feedback! Following @JoshuaLampert's comment above and https://github.com/trixi-framework/Trixi.jl/pull/1384#issuecomment-1505878188, I removed `ode_default_options()` in `elixir_advection_{extended,restart,restart_amr}.jl` for `p4est_2d_dgsem` and `tree_2d_dgsem`. In the new `elixir_advection_restart_amr.jl` for `p4est_2d_dgsem` and `tree_2d_dgsem` I set...
> Thanks! However, some test tolerances are not satisfied right now: https://github.com/trixi-framework/Trixi.jl/actions/runs/9398803634/job/25884924262?pr=1915#step:7:2874

Yes! I wanted to update them, but it seems I also introduced some issue in the MPI test....
It seems to work now!
I double-checked this: `init_boundaries!` is not called on the middle rank, neither with the Julia-provided MPI nor with system MPI. For both variants there are 26 calls to `p4est_iterate` in total...