MPI p4est parabolic 2D nonconforming AMR with loadbalancing#2888
Conversation
Included suggested style improvements Co-authored-by: Daniel Doehring <doehringd2@gmail.com>
update Naming to indicator_function Co-authored-by: Daniel Doehring <doehringd2@gmail.com>
changed naming Co-authored-by: Daniel Doehring <doehringd2@gmail.com>
changed naming Co-authored-by: Daniel Doehring <doehringd2@gmail.com>
changed naming Co-authored-by: Daniel Doehring <doehringd2@gmail.com>
Co-authored-by: Daniel Doehring <doehringd2@gmail.com>
Co-authored-by: Daniel Doehring <doehringd2@gmail.com>
Co-authored-by: Daniel Doehring <doehringd2@gmail.com>
…nging IndicatorMax with equivalent IndicatorPositional
…jl into IndicatorPositional
Review checklist
This checklist is meant to assist creators of PRs (to let them know what reviewers will typically look for) and reviewers (to guide them in a structured review process). Items do not need to be checked explicitly for a PR to be eligible for merging.
Purpose and scope
Code quality
Documentation
Testing
Performance
Verification
Created with ❤️ by the Trixi.jl community.
Codecov Report
❌ Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## main #2888 +/- ##
==========================================
+ Coverage 97.12% 97.13% +0.01%
==========================================
Files 622 623 +1
Lines 48253 48385 +132
==========================================
+ Hits 46861 46996 +135
+ Misses 1392 1389 -3
Co-authored-by: Joshua Lampert <51029046+JoshuaLampert@users.noreply.github.com>
sloede
left a comment
Except for the missing test coverage (which I think actually would be good to fix, since it covers an important and ugly-to-debug case), this looks ready to merge 👍
…Karpowski/Trixi.jl into MPI_P4est_Parabolic2D_nonconforming
sloede
left a comment
Very nice - thanks! Let's hope this fixes the problem.
Can you please check how long the two additional tests run on your machine (after warmup)? We have a general "10 second rule" and while this can be stretched with good reason, it shouldn't be too long since it will be much longer even when run on the GitHub Action runners...
Thanks for the reminder. The new elixir runs in under 10 s. The test uses only half the simulation time, so it should stay well within the limit. I don't remember the exact runtime of the other test case, but I did check at some point that it was below 10 s. Alternatively, I could remove the lid-driven-cavity test again, since the new test should cover all aspects as well.
Does anyone understand what's happening with the MPI test timeouts? When I checked last, there also seemed to be an actual test failure (the new one), but GitHub is not working reliably for me at the moment, so I can't verify whether this is still the case.
…Karpowski/Trixi.jl into MPI_P4est_Parabolic2D_nonconforming
I loosened the tolerance for the new test case, since it might have been too tight for MPI-AMR cases; hopefully the test no longer fails. I don't think, however, that this failure caused the timeout issue.
If AMR is finicky, we could also consider using a static, non-conforming setup. However, based on your previous tests - where you got identical results for serial and parallel simulations - the parallelization should give identical results, shouldn't it?
For the static grids, yes. But in the other AMR test case, I also got the same value only up to, let's say, tol=1e-7 or 1e-8, not 1e-13. On my machine, the test also passed before the tolerances were adjusted. Given the potentially machine-specific domain decomposition, and thus order of summation in the MPI reduction step, I would accept 1e-8 as sufficient. Since the test uses AMR, it also exercises the parabolic cache resizing, so a single new test case covers all the changes.
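The point about the MPI reduction step can be made concrete: floating-point addition is not associative, so summing the same cell contributions in a different order (i.e., with a different rank decomposition) can change the last digits of the reduced result. A minimal sketch in Python (the decompositions and values here are illustrative, not taken from the actual simulation):

```python
# Why different MPI decompositions can change the last digits of a reduced
# sum: floating-point addition is not associative, so forming rank-local
# partial sums in a different grouping gives a different rounded result.

a = 2.0**53  # smallest double whose successor gap is 2, so a + 1.0 == a

data = [a, 1.0, -a, 1.0]  # exact sum is 2.0

# Decomposition A: rank 0 owns [a, 1.0], rank 1 owns [-a, 1.0]
partial_a = [a + 1.0, -a + 1.0]
# Decomposition B: rank 0 owns [a, -a], rank 1 owns [1.0, 1.0]
partial_b = [a + -a, 1.0 + 1.0]

print(sum(partial_a))  # 1.0  (the first 1.0 is absorbed by a before reduction)
print(sum(partial_b))  # 2.0  (the large terms cancel before the small ones add)
```

This is why comparing serial and parallel runs to 1e-13 is generally too strict, while agreement to roughly 1e-8 is a reasonable expectation.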
sloede
left a comment
LGTM! The downstream failure is an HTTP 502 ("bad gateway") error and unrelated to the changes in this PR.
Thanks again for the nice work!

MPI for 2D parabolic system on nonconforming P4est Mesh (AMR with loadbalancing tested)
This PR adds MPI support for the parabolic right-hand side on 2D P4estMesh with AMR and load balancing.
The PR needs #2886 and #2881 to be merged first.
The elixir_navierstokes_lid_driven_cavity_amr_mortarTestcase.jl should test the MPI mortar treatment and include further refinement, coarsening, and load balancing due to the rapid change in the refinement region. If you would like to see further test cases, let me know. Otherwise, I welcome your feedback and will continue with the 3D cases.
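For readers unfamiliar with how p4est handles load balancing: p4est keeps all leaf cells in space-filling-curve order, so repartitioning after AMR amounts to splitting this one-dimensional sequence into contiguous, roughly weight-balanced chunks per MPI rank. A conceptual Python sketch (not the actual p4est or Trixi.jl implementation; `partition` is a hypothetical helper):

```python
# Conceptual sketch of p4est-style load balancing: cells are stored in
# space-filling-curve (Morton) order, and rebalancing assigns contiguous
# chunks of this ordering to ranks with roughly equal total weight.

def partition(weights, n_ranks):
    """Assign contiguous chunks of SFC-ordered cells to ranks,
    keeping the cumulative weight per rank close to the average."""
    total = sum(weights)
    target = total / n_ranks
    owner, acc, rank = [], 0.0, 0
    for w in weights:
        # advance to the next rank once the current one reached its share
        if acc >= target * (rank + 1) and rank < n_ranks - 1:
            rank += 1
        owner.append(rank)
        acc += w
    return owner

# After AMR refined some cells (unit weight each), rebalance over 2 ranks:
weights = [1, 1, 1, 1, 1, 1, 1, 1]
print(partition(weights, 2))  # [0, 0, 0, 0, 1, 1, 1, 1]
```

Because each rank always owns a contiguous SFC range, rebalancing only shifts cells at chunk boundaries between neighboring ranks, which keeps the data movement after refinement and coarsening small.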
Disclaimer
LLMs were used to assist with this PR.
Funding Statement
This work has been funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) – Project Number 237267381 – TRR 150.