Tzanio Kolev
@victorapm, is this `ready-for-review`?
This PR is now [under review](https://github.com/mfem/mfem/blob/master/CONTRIBUTING.md#pull-requests) (see the table in the PR description). To help with the review process, please **do not force push** to the branch.
@stefanozampini, can you take a look?
> What do you think about requiring PETSc to be configured with HYPRE support and dropping the MFEM code when `PETSC_HAVE_HYPRE` is not defined? See #5045 I am personally OK...
Merged in `next` for testing...
> @tzanio, do you know why the elasticity options might generate interpolation matrices with very large weights in parallel? For example, running `mpirun -np 4 ./ex2p -m ../data/beam-tet.mesh -elast -vdim`...
> Passing `false` for `interp_refine` seems to fix the bug for me We added this as [an option](https://github.com/mfem/mfem/pull/3772) because users wanted to pass `false`. Should we make that the default...
As @psocratis indicated, this is a solver issue. AMG is an iterative method, so you should either allow more iterations or use a looser tolerance.