
[PETSc]: build for windows & adding external solvers

Open boriskaus opened this issue 4 years ago • 15 comments

  • Builds basic PETSc on windows machines as well
  • Adds support for the external direct solver package SuperLU_dist on all systems; adds SuiteSparse where it works.

Having a basic PETSc version that works on all architectures is particularly useful if PETSc.jl is used for teaching. On non-Windows machines (or within WSL) this will also allow parallel direct solves. It will thus enable us to create BinaryBuilder versions of other codes that rely heavily on such solvers (e.g., LaMEM).
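For context, with such a build an external direct solver is selected at runtime through standard PETSc options (the application name below is a placeholder):

```shell
# Run a PETSc application with SuperLU_dist as the parallel LU backend
# (standard PETSc options; ./my_petsc_app is hypothetical):
mpiexec -n 4 ./my_petsc_app \
  -ksp_type preonly -pc_type lu \
  -pc_factor_mat_solver_type superlu_dist
```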

boriskaus avatar Mar 17 '22 17:03 boriskaus

Am I missing something, or is the only skipped platform 32-bit Windows? Is it really that important?

CC @Wimmerer

giordano avatar Mar 17 '22 23:03 giordano

Yes that was the only missing platform previously (although the 3.16.5 build should be tested that it can be dlopened now).

It is nice to have these additional dependencies like SuiteSparse, but I think it's important that we don't vendor SuiteSparse 8 times (same with SuperLU_dist). I can look at a SuperLU_dist builder, and we'll have to figure out what to do for SuiteSparse.

rayegun avatar Mar 17 '22 23:03 rayegun

it's important we don't vendor SuiteSparse 8 times (same with SuperLU_dist)

I agree that this should ideally be compiled as separate packages.

Yet, please take into account that PETSc is built for 32- and 64-bit integers and for single- and double-precision real and complex numbers. I'm not the best expert on this, but it seems to me that at least SuperLU_dist can take advantage of some of these options (e.g., 32- or 64-bit integers and complex/double). That implies we would require an external SuperLU_dist build for each of these options as well, and would have to link it accordingly. SuiteSparse has fewer supported options, so that might be somewhat easier to handle.
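A sketch of the configure flags behind those variants (these are standard PETSc configure options; the exact set used by the Yggdrasil recipe may differ):

```shell
./configure --with-precision=double --with-scalar-type=real                        # double/real/Int32
./configure --with-precision=double --with-scalar-type=real --with-64-bit-indices  # double/real/Int64
./configure --with-precision=single --with-scalar-type=real                        # single/real/Int32
./configure --with-precision=double --with-scalar-type=complex                     # double/complex/Int32
```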

Also, sometimes PETSc-specific patches for these packages are distributed for a particular version, which would have to be taken into account in the external builds as well. That's why I think it is probably more consistent to stick with the PETSc configure system.

boriskaus avatar Mar 21 '22 22:03 boriskaus

I’ll work on it tomorrow; I was traveling. This might be a matter of building PETSc with the --download options and then using patchelf in the worst case (but we can’t do that on all platforms).
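For reference, the kind of post-build fix-up meant here would look roughly as follows (a sketch only: the sonames are hypothetical, and patchelf works on ELF binaries, so not on the Windows or macOS builds):

```shell
# Point libpetsc at the shared SuperLU_dist library instead of a
# vendored copy (illustrative names; ${libdir} as in BinaryBuilder recipes):
patchelf --replace-needed libsuperlu_dist_vendored.so libsuperlu_dist.so \
  "${libdir}/libpetsc.so"
```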

For SuiteSparse I’m going to try just building against SuiteSparse_jll. At least KLU, and I think UMFPACK, support 32-bit and 64-bit indices in the normal build. I’m honestly not sure how that works on 32-bit platforms, though. Trilinos_jll does depend on SuiteSparse32_jll, however, which is not a good sign.

W.r.t. the patches, hopefully they can be found and applied. If there’s no explosion, then at worst we vendor everything again, but I worry about what happens when we load SuiteSparse_jll and PETSc_jll together. @giordano, is this risky stuff to vendor (everything is in PETSc subdirectories)? I assume yes; otherwise we could stomach the extra download size.

rayegun avatar Mar 21 '22 23:03 rayegun

It would be easiest to use the regular 32-bit BLAS/UMFPACK/etc. just to get everything building correctly first, without having to worry about all the ILP64 suffixes.
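The ILP64 suffixes referred to here are the `_64` symbol suffixes that the 64-bit-integer BLAS/LAPACK builds in the Julia ecosystem use to avoid clashing with the 32-bit interface. One way to check which flavor a library exports (illustrative; library filenames vary by platform):

```shell
# The 32-bit (LP64) interface exports plain Fortran symbols:
nm -D libopenblas.so    | grep -w dgemm_
# The ILP64 interface exports suffixed symbols:
nm -D libopenblas64_.so | grep -w dgemm_64_
```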

ViralBShah avatar Apr 19 '22 03:04 ViralBShah

@giordano: this now compiles fine on Windows/Mac/Linux and provides a parallel direct solver for the PETSc build (using the updated SuperLU_DIST_jll build). This is very helpful for LaMEM_jll. As far as I'm concerned, this can be merged (MUMPS_jll creates more problems, but I can look at that at a later stage).

boriskaus avatar Jul 26 '22 20:07 boriskaus

Having trouble with 32-bit platforms?

giordano avatar Jul 26 '22 23:07 giordano

Indeed. Is there a way to detect all those platforms?

boriskaus avatar Jul 27 '22 05:07 boriskaus

SuperLU_Dist is compiled against MPICH right now, I believe. That needs to change, right?

rayegun avatar Jul 29 '22 23:07 rayegun

That is required if you want to run SuperLU_Dist on a GPU. Is that an immediate priority?

boriskaus avatar Jul 30 '22 08:07 boriskaus

Can we build it against the generic MPI platform instead of MPICH directly? You can check the other MPI recipes in Yggdrasil.

vchuravy avatar Aug 12 '22 18:08 vchuravy

The current version of PETSc (or any other MPI-using package!) already builds against the MPI platform. Merging in the changes from the master branch should do so.

eschnett avatar Aug 13 '22 15:08 eschnett

Thanks, that worked smoothly. It seems ready now.

boriskaus avatar Aug 14 '22 07:08 boriskaus

This doesn't look quite ready yet. There is a @show statement left over, and too many MPI configurations are disabled.

eschnett avatar Aug 14 '22 18:08 eschnett

I believe that adding new dependencies requires increasing the version number of the generated package. (It is fine if this makes it go out of sync with the upstream PETSc version number.)

eschnett avatar Aug 14 '22 18:08 eschnett

@eschnett: we currently have a failure with the following error when using MPItrampoline:

[14:22:13] /workspace/srcdir/petsc-3.16.6/src/mat/impls/scalapack/matscalapack.c:20:45: error: initializer element is not a compile-time constant
[14:22:13] static PetscMPIInt Petsc_ScaLAPACK_keyval = MPI_KEYVAL_INVALID;

Is that perhaps something that is related to MPITrampoline?

If I compile it locally with:

julia build_tarballs.jl --debug --verbose  aarch64-apple-darwin-libgfortran5-mpi+mpich

it works fine.

edit: it seems to work once I add a patch for Petsc_ScaLAPACK_keyval (CI is running).

boriskaus avatar Aug 31 '22 07:08 boriskaus

@eschnett @giordano: this compiles fine now. Please have a look and check whether you are happy with it.

boriskaus avatar Aug 31 '22 12:08 boriskaus