PETSc BPs with tetrahedral elements
Added tet element support to examples/petsc (https://github.com/CEED/libCEED/issues/454).
- Added `CreateBasisFromPlex` function in `petscutils.c` to handle non-tensor bases.
- Added `CreateDistributedDM` function in `petscutils.c` to have cleaner DM creation in `bps.c`.
- Updated `CeedLevelTransferSetup` by removing the `for` loop over levels inside and using `CeedOperatorMultigridLevelCreate` to create `op_prolong` and `op_restrict` (see the sketch after this list). Removed `qf_prolong` and `qf_restrict` inside `multigrid.c`.
- Added a condition for `PC` so the tet element uses `PC_JACOBI_DIAGONAL` for bp1 and bp2.
- Added `TESTARGS` for tet in `bps.c` and `multigrid.c`.
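For context, a minimal sketch of that per-level transfer setup with `CeedOperatorMultigridLevelCreate` (the variable names here are illustrative, not the exact ones in `multigrid.c`):

```c
// Sketch: one level's grid-transfer setup with CeedOperatorMultigridLevelCreate,
// which builds the coarse operator and the prolongation/restriction operators
// in a single call instead of hand-written qf_prolong/qf_restrict QFunctions.
// All variable names below are illustrative, not the actual ones in multigrid.c.
CeedOperator op_coarse, op_prolong, op_restrict;
CeedOperatorMultigridLevelCreate(op_fine,            // fine-level operator
                                 mult_fine_loc,      // fine-level node multiplicity (L-vector)
                                 elem_restr_coarse,  // coarse-level element restriction
                                 basis_coarse,       // coarse-level basis (tensor or non-tensor)
                                 &op_coarse, &op_prolong, &op_restrict);
```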
To run with tet elements, add `-simplex` to the sample runs.
@jedbrown, @knepley, @jeremylt the only issue left in this branch is that
ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, rp->dim, rp->simplex, rp->mesh_elem,
                           NULL, NULL, NULL, PETSC_TRUE, dm); CHKERRQ(ierr);
doesn't create Face Sets for the simplex case, and I don't know how to fix it.
./bps -problem bp1 -degree 3 -dm_view -simplex -ksp_max_it 1000
DM Object: 1 MPI processes
type: plex
DM_0x84000001_1 in 3 dimensions:
Number of 0-cells per rank: 64
Number of 1-cells per rank: 279
Number of 2-cells per rank: 378
Number of 3-cells per rank: 162
Labels:
marker: 1 strata with value/size (1 (172))
celltype: 4 strata with value/size (0 (64), 6 (162), 3 (378), 1 (279))
depth: 4 strata with value/size (0 (64), 1 (279), 2 (378), 3 (162))
-- CEED Benchmark Problem 1 -- libCEED + PETSc --
MPI:
Hostname : rezgar
Total ranks : 1
Ranks per compute node : 1
PETSc:
PETSc Vec Type : seq
libCEED:
libCEED Backend : /cpu/self/avx/blocked
libCEED Backend MemType : host
Mesh:
Number of 1D Basis Nodes (P) : 4
Number of 1D Quadrature Points (Q) : 5
Additional quadrature points (q_extra) : 1
Global nodes : 1000
Local Elements : 162
Element topology : tetrahedron
Owned nodes : 1000
DoF per node : 1
KSP:
KSP Type : cg
KSP Convergence : CONVERGED_RTOL
Total KSP Iterations : 61
Final rnorm : 1.300424e-10
Performance:
Pointwise Error (max) : 6.357310e-04
CG Solve Time : 0.00557013 (0.00557013) sec
DoFs/Sec in CG : 10.9513 (10.9513) million
I think PETSc should traverse the boundary faces, find their outward-facing normals, and label them using the same convention as is used for hex meshes. It's frankly embarrassing that it doesn't, especially when simplex meshes are the default.
I have said before that there is a function for this: https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexMarkBoundaryFaces.html
And we've discussed that
- doing different setup with tet vs hex meshes is bad for comparison studies and
- this is not sufficient because we need the 6 faces marked with different label values because we need to impose different boundary conditions on them
It should be done correctly by DMPlex, with matching labels between tet and hex meshes.
Thanks, we already use that function, here
// -----------------------------------------------------------------------------
// Create BC label
// -----------------------------------------------------------------------------
static PetscErrorCode CreateBCLabel(DM dm, const char name[]) {
int ierr;
DMLabel label;
PetscFunctionBeginUser;
ierr = DMCreateLabel(dm, name); CHKERRQ(ierr);
ierr = DMGetLabel(dm, name, &label); CHKERRQ(ierr);
ierr = DMPlexMarkBoundaryFaces(dm, 1, label); CHKERRQ(ierr);
PetscFunctionReturn(0);
};
and we call CreateBCLabel here
if (enforce_bc) {
PetscBool has_label;
DMHasLabel(dm, "marker", &has_label);
if (!has_label) {CreateBCLabel(dm, "marker");}
DMLabel label;
ierr = DMGetLabel(dm, "marker", &label); CHKERRQ(ierr);
ierr = DMAddBoundary(dm, DM_BC_ESSENTIAL, "wall", label, 1,
marker_ids, 0, 0, NULL, (void(*)(void))bc_func,
NULL, NULL, NULL); CHKERRQ(ierr);
}
I need to check why it doesn't work for simplex.
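For illustration, a user-side workaround along the lines described above could mark all boundary faces and then bin them into six "Face Sets" values by face centroid against the box bounds. This is only a sketch: the helper name and the side-to-value mapping are assumptions and would need to be checked against the convention `DMPlexCreateBoxMesh` uses for hex meshes.

```c
// Sketch of a workaround: build a "Face Sets" label for a simplex box mesh by
// classifying each boundary face by its centroid against the domain bounds.
// The side -> value mapping below is an assumption; verify it against the
// hex mesh convention before relying on it.
static PetscErrorCode MarkBoxFaceSets(DM dm) {
  DMLabel         bd_label, face_sets;
  IS              face_is;
  const PetscInt *faces;
  PetscInt        dim, num_faces;
  PetscReal       gmin[3], gmax[3];
  const PetscReal tol = 1e-10;

  PetscFunctionBeginUser;
  PetscCall(DMGetDimension(dm, &dim));
  PetscCall(DMGetBoundingBox(dm, gmin, gmax));
  // Mark every boundary face with value 1 in a scratch label
  PetscCall(DMCreateLabel(dm, "boundary"));
  PetscCall(DMGetLabel(dm, "boundary", &bd_label));
  PetscCall(DMPlexMarkBoundaryFaces(dm, 1, bd_label));
  // Sort the marked faces into six Face Sets values
  PetscCall(DMCreateLabel(dm, "Face Sets"));
  PetscCall(DMGetLabel(dm, "Face Sets", &face_sets));
  PetscCall(DMLabelGetStratumIS(bd_label, 1, &face_is));
  if (!face_is) PetscFunctionReturn(0);
  PetscCall(ISGetLocalSize(face_is, &num_faces));
  PetscCall(ISGetIndices(face_is, &faces));
  for (PetscInt f = 0; f < num_faces; f++) {
    PetscReal centroid[3], normal[3], area;

    PetscCall(DMPlexComputeCellGeometryFVM(dm, faces[f], &area, centroid, normal));
    for (PetscInt d = 0; d < dim; d++) {
      // Assumed convention: values 2*d+1 / 2*d+2 for the min / max side in direction d
      if (PetscAbsReal(centroid[d] - gmin[d]) < tol) PetscCall(DMLabelSetValue(face_sets, faces[f], 2*d + 1));
      if (PetscAbsReal(centroid[d] - gmax[d]) < tol) PetscCall(DMLabelSetValue(face_sets, faces[f], 2*d + 2));
    }
  }
  PetscCall(ISRestoreIndices(face_is, &faces));
  PetscCall(ISDestroy(&face_is));
  PetscFunctionReturn(0);
}
```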
I had held off because TetGen was broken with respect to label propagation, so it would not work. I think Brandon has fixed it, so we can try again.
Note - this branch needs the updates in main for the latest development version of PETSc: #940
I rebased on main, but this branch is failing in the same way it was before (for me).
Hmm, I'll fix it next week.
Let us know when this is ready for last round of review.
I fixed the restriction size problem, but now I get this error:
./multigrid -problem bp3
-- CEED Benchmark Problem 3 -- libCEED + PETSc + PCMG --
PETSc:
PETSc Vec Type : seq
libCEED:
libCEED Backend : /cpu/self/avx/blocked
libCEED Backend MemType : host
Mesh:
Number of 1D Basis Nodes (P) : 3
Number of 1D Quadrature Points (Q) : 4
Additional quadrature points (q_extra) : 1
Global Nodes : 125
Owned Nodes : 125
DoF per node : 1
Element topology : hexahedron
Multigrid:
Number of Levels : 2
Level 0 (coarse):
Number of 1D Basis Nodes (P) : 2
Global Nodes : 8
Owned Nodes : 8
Level 1 (fine):
Number of 1D Basis Nodes (P) : 3
Global Nodes : 125
Owned Nodes : 125
/home/rezgar/libCEED/interface/ceed-basis.c:205 in CeedBasisCreateProjectionMatrices(): Bases must have compatible quadrature spaces
Aborted (core dumped)
You need to force compatible quadrature spaces by setting the quadrature order on coarse levels to match the fine level. See RatelSetupDMByOrder_FEM()
Hmm, you mean this line of code?
PetscInt dim, q_order = ratel->multigrid_fine_order + ratel->q_extra;
Because right now I have
PetscInt q_degree = p_degree + q_extra;
PetscFE fe;
MPI_Comm comm;
PetscBool is_simplex = PETSC_TRUE;
PetscFunctionBeginUser;
// Check if simplex or tensor-product mesh
ierr = DMPlexIsSimplex(dm, &is_simplex); CHKERRQ(ierr);
// Setup FE
ierr = PetscObjectGetComm((PetscObject)dm, &comm); CHKERRQ(ierr);
ierr = PetscFECreateLagrange(comm, dim, num_comp_u, is_simplex, p_degree,
q_degree, &fe); CHKERRQ(ierr);
ierr = DMAddField(dm, NULL, (PetscObject)fe); CHKERRQ(ierr);
ierr = DMCreateDS(dm); CHKERRQ(ierr);
So in q_degree = p_degree + q_extra I need to use the fine-level degree, i.e. q_degree = p_degree_fine + q_extra.
exactly
Fixed now, thanks!
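For reference, a minimal sketch of what that fix looks like; names like p_degree_coarse, p_degree_fine, q_extra, and num_comp_u are illustrative. The point is that the quadrature order on every level is derived from the fine-level order, so the coarse and fine bases have compatible quadrature spaces.

```c
// Sketch: give every multigrid level the same quadrature order as the fine
// level so CeedBasisCreateProjectionMatrices sees compatible quadrature spaces.
// All names below are illustrative.
PetscInt  dim, q_degree = p_degree_fine + q_extra;  // fine-level order, not the coarse one
PetscBool is_simplex;
PetscFE   fe;
MPI_Comm  comm;

PetscCall(DMGetDimension(dm, &dim));
PetscCall(DMPlexIsSimplex(dm, &is_simplex));
PetscCall(PetscObjectGetComm((PetscObject)dm, &comm));
// The coarse level keeps its own polynomial degree, but quadrature matches the fine level
PetscCall(PetscFECreateLagrange(comm, dim, num_comp_u, is_simplex, p_degree_coarse,
                                q_degree, &fe));
PetscCall(DMAddField(dm, NULL, (PetscObject)fe));
PetscCall(DMCreateDS(dm));
PetscCall(PetscFEDestroy(&fe));
```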
@jedbrown, @jeremylt, this is now ready for review if we want to skip the issue with applying boundary conditions on tet meshes as discussed above; otherwise, everything mentioned in the description is implemented.
This has a couple conflicts with main and thus needs a rebase or merge. CI will run once that is done.
This is rebased and ready for merge.
Pipeline is running now.
I rebased again; why is CI not running?
Looks like CPU tests fail due to large pointwise errors. https://gitlab.com/libceed/libCEED/-/pipelines/632636153/test_report
Does CI run with simplex? I put TESTARGS with -simplex in bps.c and multigrid.c, but those cases do not work. Should I remove them? The tensor-basis cases are fine; I've tested them.
Yeah, CI tries to run those cases. Can we fix them so it passes a reasonable test? (I realize there is other good stuff in the branch, but that is the title of this PR so ideally it would work when we merge it.)
As we discussed here, DMPlex doesn't create Face Sets for applying boundary conditions when using simplex meshes, and I don't know how to fix that.
Do we have it in Ratel so I can copy it here?
Are you using PetscCall(DMPlexMarkBoundaryFaces(dm, PETSC_DETERMINE, label));?
Yeah, here is the code
// -----------------------------------------------------------------------------
// Create BC label
// -----------------------------------------------------------------------------
static PetscErrorCode CreateBCLabel(DM dm, const char name[]) {
int ierr;
DMLabel label;
PetscFunctionBeginUser;
ierr = DMCreateLabel(dm, name); CHKERRQ(ierr);
ierr = DMGetLabel(dm, name, &label); CHKERRQ(ierr);
ierr = DMPlexMarkBoundaryFaces(dm, 1, label); CHKERRQ(ierr);
PetscFunctionReturn(0);
};
Why 1 instead of PETSC_DETERMINE?
I didn't touch that; it was already like that. I tried with PETSC_DETERMINE and nothing happened.
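For reference, the variant suggested above would look roughly like the sketch below; whether PETSC_DETERMINE changes the outcome here is exactly the open question, so this is only an illustration.

```c
// Sketch of CreateBCLabel using the PetscCall style and PETSC_DETERMINE as the
// marker value, as suggested above; behavior for this mesh is the open question.
static PetscErrorCode CreateBCLabel(DM dm, const char name[]) {
  DMLabel label;

  PetscFunctionBeginUser;
  PetscCall(DMCreateLabel(dm, name));
  PetscCall(DMGetLabel(dm, name, &label));
  PetscCall(DMPlexMarkBoundaryFaces(dm, PETSC_DETERMINE, label));
  PetscFunctionReturn(0);
}
```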