Dynamic prescribed slip integration - weighting with lumped mass
@baagaard TODO
- [x] Update governing equations section of the manual for dynamic prescribed slip
- [x] Update pointwise functions for dynamic prescribed slip (remove density)
- [x] Update method for computing inverse of lumped mass matrix (separate from computing LHS and RHS residuals) [new weak form part]
@knepley TODO
- [x] Add ability to weight terms in hybrid integration -- residual
- [x] Add ability to weight terms in hybrid integration -- Jacobian
We need a separate auxiliary field for the mass weighting, plus a pre-processing step that pulls the terms out of the lumped Jacobian inverse. We also need to use PART to specify which equation gets the mass weighting.
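As a rough illustration of that pre-processing step, here is a minimal sketch -- not the actual PyLith or PETSc implementation; `PullMassWeighting`, `dmSoln`, `dmAux`, and `faultPoints` are hypothetical names -- that pulls the lumped Jacobian inverse values into an auxiliary weighting field point by point:

```c
#include <petscdmplex.h>

/* Hypothetical sketch: for each fault point, pull the values out of the
 * lumped Jacobian inverse and deposit them in the auxiliary field used
 * for mass weighting. Assumes the two closures match. */
static PetscErrorCode PullMassWeighting(DM dmSoln, Vec jacobianLumpedInv, DM dmAux, Vec auxWeight, IS faultPoints)
{
  const PetscInt *points;
  PetscInt        npoints, p;

  PetscFunctionBeginUser;
  PetscCall(ISGetLocalSize(faultPoints, &npoints));
  PetscCall(ISGetIndices(faultPoints, &points));
  for (p = 0; p < npoints; ++p) {
    PetscScalar *vals;
    PetscInt     csize;

    /* Closure of the lumped Jacobian inverse at this point ... */
    PetscCall(DMPlexVecGetClosure(dmSoln, NULL, jacobianLumpedInv, points[p], &csize, &vals));
    /* ... copied into the auxiliary weighting field. */
    PetscCall(DMPlexVecSetClosure(dmAux, NULL, auxWeight, points[p], vals, INSERT_VALUES));
    PetscCall(DMPlexVecRestoreClosure(dmSoln, NULL, jacobianLumpedInv, points[p], &csize, &vals));
  }
  PetscCall(ISRestoreIndices(faultPoints, &points));
  PetscFunctionReturn(0);
}
```

In practice the solution closure carries all solution fields, so the copy would have to select just the subfield being weighted; the sketch glosses over that.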
@knepley I am getting a weird PETSc memory error when trying to run a manual test with dynamic prescribed slip.
Error message
```
[0]PETSC ERROR: Invalid pointer
[0]PETSC ERROR: Null Pointer: Parameter # 4
--snip--
[0]PETSC ERROR: Petsc Development GIT revision: v3.16.5-1148-g6fcc6d7c86 GIT Date: 2022-03-11 00:55:47 +0000
--snip--
[0]PETSC ERROR: #1 PetscSectionSetFieldConstraintIndices() at /Users/baagaard/software/unix/petsc-dev/src/vec/is/section/interface/section.c:2567
[0]PETSC ERROR: #2 PetscSectionCreateSubsection() at /Users/baagaard/software/unix/petsc-dev/src/vec/is/section/interface/section.c:1854
[0]PETSC ERROR: #3 DMCreateSectionSubDM() at /Users/baagaard/software/unix/petsc-dev/src/dm/interface/dmi.c:195
[0]PETSC ERROR: #4 DMCreateSubDM_Plex() at /Users/baagaard/software/unix/petsc-dev/src/dm/impls/plex/plex.c:3717
[0]PETSC ERROR: #5 DMCreateSubDM() at /Users/baagaard/software/unix/petsc-dev/src/dm/interface/dm.c:2014
[0]PETSC ERROR: #6 PCFieldSplitSetDefaults() at /Users/baagaard/software/unix/petsc-dev/src/ksp/pc/impls/fieldsplit/fieldsplit.c:435
[0]PETSC ERROR: #7 PCSetUp_FieldSplit() at /Users/baagaard/software/unix/petsc-dev/src/ksp/pc/impls/fieldsplit/fieldsplit.c:618
[0]PETSC ERROR: #8 PCSetUp() at /Users/baagaard/software/unix/petsc-dev/src/ksp/pc/interface/precon.c:1017
[0]PETSC ERROR: #9 KSPSetUp() at /Users/baagaard/software/unix/petsc-dev/src/ksp/ksp/interface/itfunc.c:419
[0]PETSC ERROR: #10 KSPSolve_Private() at /Users/baagaard/software/unix/petsc-dev/src/ksp/ksp/interface/itfunc.c:865
[0]PETSC ERROR: #11 KSPSolve() at /Users/baagaard/software/unix/petsc-dev/src/ksp/ksp/interface/itfunc.c:1102
[0]PETSC ERROR: #12 SNESSolve_NEWTONLS() at /Users/baagaard/software/unix/petsc-dev/src/snes/impls/ls/ls.c:225
[0]PETSC ERROR: #13 SNESSolve() at /Users/baagaard/software/unix/petsc-dev/src/snes/interface/snes.c:4810
[0]PETSC ERROR: #14 TSStep_ARKIMEX() at /Users/baagaard/software/unix/petsc-dev/src/ts/impls/arkimex/arkimex.c:845
[0]PETSC ERROR: #15 TSStep() at /Users/baagaard/software/unix/petsc-dev/src/ts/interface/ts.c:3607
[0]PETSC ERROR: #16 TSSolve() at /Users/baagaard/software/unix/petsc-dev/src/ts/interface/ts.c:4006
[0]PETSC ERROR: #17 void pylith::problems::TimeDependent::solve()() at /Users/baagaard/src/cig/pylith/libsrc/pylith/problems/TimeDependent.cc:429
```
How to reproduce
PETSc branch: knepley/pylith
PyLith branch: baagaard/feature-prescribed-slip-dynamic (force pushed)
Build PyLith, then run:

```
cd tests/manual/2d/faults-dynamic
pylith step01_swave_prescribedslip.cfg
```
This is fixed in the latest knepley/pylith.
@knepley
PETSc branch: knepley/feature-hybrid-mass
PyLith branch: baagaard/feature-prescribed-slip-dynamic (force pushed)
Build PyLith, then run:

```
cd tests/manual/2d/faults-dynamic
pylith step01_swave_prescribedslip.cfg
```
Error message
```
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Point 13 not found in submesh
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.16.5-1148-g6fcc6d7c86 GIT Date: 2022-03-11 00:55:47 +0000
[0]PETSC ERROR: /Users/baagaard/software/unix/py39-venv/pylith-debug/bin/mpinemesis on a arch-clang-13.0_debug named IGSKCICGLTGM067 by baagaard Thu Mar 17 09:42:12 2022
[0]PETSC ERROR: Configure options --PETSC_ARCH=arch-clang-13.0_debug --with-debugging=1 --with-clanguage=c --with-mpi-compilers=1 --with-shared-libraries=1 --with-64-bit-points=1 --with-large-file-io=1 --with-lgrind=0 --download-chaco=1 --download-parmetis=1 --download-metis=1 --download-triangle --download-ml=1 --download-superlu=1 --with-fc=0 --download-f2cblaslapack --with-hdf5=1 --with-hdf5-include=/Users/baagaard/software/unix/hdf5-1.12.1/clang-13.0/include --with-hdf5-lib=/Users/baagaard/software/unix/hdf5-1.12.1/clang-13.0/lib/libhdf5.dylib --with-zlib=1
[0]PETSC ERROR: #1 DMGetEnclosurePoint() at /Users/baagaard/software/unix/petsc-dev/src/dm/impls/plex/plexsubmesh.c:3996
[0]PETSC ERROR: #2 DMPlexGetHybridScaleFields() at /Users/baagaard/software/unix/petsc-dev/src/dm/impls/plex/plexfem.c:3503
[0]PETSC ERROR: #3 DMPlexComputeResidual_Hybrid_Internal() at /Users/baagaard/software/unix/petsc-dev/src/dm/impls/plex/plexfem.c:5031
[0]PETSC ERROR: #4 static void pylith::feassemble::_IntegratorInterface::computeResidual(pylith::topology::Field *, const pylith::feassemble::IntegratorInterface *, pylith::feassemble::Integrator::ResidualPart, const pylith::problems::IntegrationData &)() at /Users/baagaard/src/cig/pylith/libsrc/pylith/feassemble/IntegratorInterface.cc:603
```
@baagaard-usgs I updated everything (branches in both PETSc and PyLith), and now I get that error in DMPlexComputeJacobian_Action_Internal(), which I think has not been updated yet.
```
0 TS dt 0.1 time -0.1
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Point 0 not found in submesh
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.16.0-473-g546643b376b GIT Date: 2021-11-18 15:56:44 -0500
[0]PETSC ERROR: /PETSc3/cig/bin/mpinemesis on a arch-pylith-debug named MacBook-Pro.fios-router.home by knepley Mon Mar 21 10:07:10 2022
[0]PETSC ERROR: Configure options --PETSC_ARCH=arch-pylith-debug --download-chaco --download-ctetgen --download-exodusii --download-metis --download-ml --download-netcdf --download-parmetis --download-pnetcdf --download-triangle --with-cmake-exec=/PETSc3/petsc/apple/bin/cmake --with-ctest-exec=/PETSc3/petsc/apple/bin/ctest --with-fc=0 --with-hdf5-dir=/PETSc3/petsc/apple --with-mpi-dir=/PETSc3/petsc/apple --with-shared-libraries --with-zlib
[0]PETSC ERROR: #1 DMGetEnclosurePoint() at /PETSc3/petsc/petsc-pylith/src/dm/impls/plex/plexsubmesh.c:3996
[0]PETSC ERROR: #2 DMPlexComputeJacobian_Action_Internal() at /PETSc3/petsc/petsc-pylith/src/dm/impls/plex/plexfem.c:5861
[0]PETSC ERROR: #3 virtual void pylith::feassemble::IntegratorDomain::computeLHSJacobianLumpedInv(pylith::topology::Field *, const pylith::problems::IntegrationData &)() at ../../../pylith-git/libsrc/pylith/feassemble/IntegratorDomain.cc:439
```
If I fix the following error in PETSc, I get your latest error message.
```diff
diff --git a/src/dm/interface/dm.c b/src/dm/interface/dm.c
index bdb78c8a63..b0451bca41 100644
--- a/src/dm/interface/dm.c
+++ b/src/dm/interface/dm.c
@@ -5097,7 +5097,7 @@ PetscErrorCode DMAddField(DM dm, DMLabel label, PetscObject field)
 PetscErrorCode DMSetFieldAvoidTensor(DM dm, PetscInt f, PetscBool avoidTensor)
 {
   PetscFunctionBegin;
-  PetscCheck((f > 0) && (f < dm->Nf),PetscObjectComm((PetscObject) dm), PETSC_ERR_ARG_OUTOFRANGE, "Field %D is not in [0, %D)", f, dm->Nf);
+  PetscCheck((f >= 0) && (f < dm->Nf),PetscObjectComm((PetscObject) dm), PETSC_ERR_ARG_OUTOFRANGE, "Field %D is not in [0, %D)", f, dm->Nf);
   dm->fields[f].avoidTensor = avoidTensor;
   PetscFunctionReturn(0);
 }
```
Yes, I fixed that, but forgot to push to pylith.
So does this mean the residual is at least not crashing?
Yes, it looks like the residual is not crashing. Now, I should be able to check if the weighting is being applied for the residual.
@knepley There is a problem with populating the weighting vector `s` in `DMPlexComputeResidual_Hybrid_Internal()`. I am seeing some legitimate values but mostly garbage values. I think there are bugs in pulling the values out of `locS` into `s` in `DMPlexGetHybridScaleFields()`.

There may be issues in `DMPlexGetHybridScaleFields()`:

- The `totDimAux` is [12, 12], but I think the size should match `totDim` in `DMPlexComputeResidual_Hybrid_Internal()`, which is 20 (2 vertices × (2 displacement + 2 velocity components) = 8 for each of the negative and positive faces, plus 2 hybrid edges × 2 Lagrange components = 4, so 8 + 8 + 4 = 20). I don't know which size is correct; perhaps both are wrong.
- I don't think the weighting array is getting initialized to zeros (see the sketch after this list). In setting the values, `Na=4`, so only the first four values of `al` get set; the rest look like garbage to me. `Na=4` makes sense to me because there are 2 vertices on a face and the lumped Jacobian inverse has 2 DOF per vertex.
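On the initialization point, here is a minimal sketch of the fix I have in mind (illustrative names; `ZeroThenFillScaleArray` and `locSVals` are not the actual PETSc code):

```c
#include <petscsys.h>

/* Hypothetical sketch: zero the per-cell scale array before the Na
 * legitimate values are copied in, so entries past Na are defined
 * instead of garbage. */
static PetscErrorCode ZeroThenFillScaleArray(PetscScalar al[], PetscInt totDim, const PetscScalar locSVals[], PetscInt Na)
{
  PetscInt i;

  PetscFunctionBeginUser;
  PetscCall(PetscArrayzero(al, totDim));         /* defined values past Na */
  for (i = 0; i < Na; ++i) al[i] = locSVals[i];  /* the Na real values */
  PetscFunctionReturn(0);
}
```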
Another issue is that you have a scaling vector for the cohesive element vector, but never allocate it, set the values, or use it. For this to be of general use, I think scaling of the cohesive values needs to be included.
Okay, there are two fields in the scaling DS: velocity and Lagrange multiplier. Since the DS is marked as having a cohesive cell, the velocity DOFs are stored for each side (even though we are only contributing one here), and there is space for the Lagrange variables.
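If that layout is right, it would also account for the `totDimAux` of 12 reported above (a back-of-the-envelope check, assuming 2D P1 vector fields):

$$
\underbrace{2_{\text{sides}} \times 2_{\text{vertices}} \times 2_{\text{velocity comps}}}_{8}
+ \underbrace{2_{\text{vertices}} \times 2_{\text{Lagrange comps}}}_{4} = 12 .
$$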
@knepley There is something messed up in our calculation of the tractions (stresses) on the end caps. Our kernels are https://pylith.readthedocs.io/en/latest/user/governingeqns/elasticity-infstrain-prescribedslip/dynamic.html#residual-pointwise-functions
We need the traction (stress) on the negative and positive faces computed from the deformation of the adjacent cells. Currently, I am assuming that the solution (displacement) has legitimate gradients both along the boundary and perpendicular to it when I am in the kernels for the negative and positive faces; it looks like the gradient of the solution (displacement) perpendicular to the boundary is always 0.
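For reference, the quantity at issue is the traction on each fault face computed from the stress in the adjoining cell; written here for linear isotropic elasticity (the isotropic form is just for illustration):

$$
t_i = \sigma_{ij}(\mathbf{u})\, n_j, \qquad
\sigma_{ij} = \lambda\, \epsilon_{kk}\, \delta_{ij} + 2\mu\, \epsilon_{ij}, \qquad
\epsilon_{ij} = \tfrac{1}{2}\bigl(\partial_i u_j + \partial_j u_i\bigr),
$$

so $\epsilon$ requires exactly the perpendicular displacement gradient that is coming back as 0.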
Okay, yes, this is a problem. The cohesive cell is phrased as a lower-dimensional FE space. Thus we can get pointwise values accurately on the fault, and derivatives along the fault, but derivatives perpendicular to the fault do not exist in that space. I think we might have to compute that as an aux field for the fault.
For the fields defined over the domain, instead of using the basis for the cohesive cell, I think what we really want are the basis functions and derivatives of the basis functions of the adjoining cell evaluated on the face of the fault.
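A minimal sketch of that idea, assuming the face quadrature points have already been mapped into the adjoining cell's reference coordinates (`TabulateCellBasisOnFacePoints` is a hypothetical helper, not an existing PETSc function):

```c
#include <petscfe.h>

/* Hypothetical helper: tabulate a cell's basis functions and their first
 * derivatives (K = 1) at points given in the cell's reference coordinates,
 * e.g., fault-face quadrature points mapped into the adjoining cell. The
 * tabulation then provides derivatives perpendicular to the fault. */
static PetscErrorCode TabulateCellBasisOnFacePoints(PetscFE fe, PetscInt nq, const PetscReal cellRefPoints[], PetscTabulation *T)
{
  PetscFunctionBeginUser;
  /* (*T)->T[0] holds basis values; (*T)->T[1] holds basis derivatives. */
  PetscCall(PetscFECreateTabulation(fe, 1, nq, cellRefPoints, 1, T));
  PetscFunctionReturn(0);
}
```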
@knepley It looks like applying mass weighting to the Jacobian is not implemented.
I need to figure out the ordering of the Jacobian term `Jf1lu`. It has 16 entries because it includes entries for both sides of the fault for the displacement field.
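My guess at the count (unconfirmed; just bookkeeping under PETSc's pointwise-kernel convention, assuming 2D vector fields):

$$
\underbrace{2}_{\lambda\ \text{comps}} \times \underbrace{2 \times 2}_{u\ \text{comps, two sides}} \times \underbrace{2}_{\text{spatial dims}} = 16 .
$$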
This is mostly resolved. I need to figure out issues with the magnitude.