configuration with Intel compilers is broken
Intel compilers do not configure correctly due to https://github.com/dealii/candi/pull/159
We used to do
FC=mpiifort CXX=mpiicpc CC=mpiicc cmake -D DEAL_II_WITH_MPI:BOOL=ON ..
which works correctly. But, according to https://github.com/dealii/dealii/issues/11478#issuecomment-754954736 I should do:
CXX=icpc CC=icc cmake -D DEAL_II_WITH_MPI=ON -D MPI_CXX_COMPILER=mpiicpc -D MPI_C_COMPILER=mpiicc ..
but this fails with:
Imported target "MPI::MPI_C" includes non-existent path
"/home/heister/deal.ii-candi/tmp/unpack/deal.II-v9.3.0/'/software/external/intelmpi/oneapi/mpi/2021.1.1/include'"
in its INTERFACE_INCLUDE_DIRECTORIES. Possible reasons include:
Not setting CXX also fails (it picks up the system g++ and bails).
How do I know when to set CXX to the MPI wrapper, and how can this be done automatically?
cmake 3.16.3, intel 19.1.3
> How do I know when to set CXX to the MPI wrapper, and how can this be done automatically?
I think there are compilers out there for which we can't know this automatically. I remember having issues with an older Intel compiler series.
Maybe we should introduce a special case for Intel compilers, as you suggested.
Or we undo #159 and find a different solution to that. In the comments of #159 I remarked that it may lead to issues for some compilers.
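A minimal sketch of what such an Intel special case could look like (hypothetical, not actual candi code; EXTRA_CONFOPTS is an illustrative name): detect an Intel serial compiler and pass the Intel MPI wrappers to CMake explicitly instead of exporting them as CC/CXX/FC.
# Hypothetical sketch, not actual candi code: when the serial compiler is
# Intel's icc, hand the Intel MPI wrappers to CMake explicitly instead of
# overriding CC/CXX/FC with them.
if [ "$(basename "${CC:-cc}")" = "icc" ]; then
    EXTRA_CONFOPTS="-D MPI_C_COMPILER=mpiicc -D MPI_CXX_COMPILER=mpiicpc -D MPI_Fortran_COMPILER=mpiifort"
fi
CC=icc CXX=icpc FC=ifort cmake -D DEAL_II_WITH_MPI=ON ${EXTRA_CONFOPTS} ..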
"/home/heister/deal.ii-candi/tmp/unpack/deal.II-v9.3.0/'/software/external/intelmpi/oneapi/mpi/2021.1.1/include'"
It's pretty weird that there are additional apostrophes in that path. I guess it should just be /software/external/intelmpi/oneapi/mpi/2021.1.1/include?
@tjhei There is definitely something funny with the toolchain.
Testing your configuration line
CXX=icpc CC=icc cmake -D DEAL_II_WITH_MPI=ON -D MPI_CXX_COMPILER=mpiicpc -D MPI_C_COMPILER=mpiicc ..
works like a charm on TACC resources (the only intel compiler I have access to).
The mpiicc, mpiicpc wrappers are simple shell scripts. Just for confirmation - what happens with the following (assuming intelmpi)?
% export CC=icc
% export CXX=icpc
% export I_MPI_CC=icc
% export I_MPI_CXX=icpc
% mpicc -V
Intel(R) C Intel(R) 64 Compiler for applications running on Intel(R) 64, Version 19.1.1.217 Build 20200306
% cmake -DWITH_MPI=ON ..
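For the C++ side, the analogous confirmation (assuming the wrapper scripts honor I_MPI_CXX the same way) would be
% mpicxx -show
which should print a compile line starting with icpc rather than g++.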
> works like a charm on TACC resources (the only intel compiler I have access to).
@zjiaqi2018 this did not work for you on Frontera but hangs in DEAL_II_HAVE_USABLE_FLAGS_DEBUG, right?
> It's pretty weird that there are additional apostrophes in that path.
Yes, but I cannot find it in env, so I wonder if the cmake MPI detection is broken.
> works like a charm on TACC resources (the only intel compiler I have access to).
> @zjiaqi2018 this did not work for you on Frontera but hangs in DEAL_II_HAVE_USABLE_FLAGS_DEBUG, right?
Yes, after more than half an hour, still here...
> The mpiicc, mpiicpc wrappers are simple shell scripts. Just for confirmation - what happens with the following (assuming intelmpi)?
Same error with the broken include path. I guess it would work if that path wasn't there. I think the broken path is because of the apostrophe in:
$ mpicxx -show
g++ -I'/software/external/intelmpi/oneapi/mpi/2021.1.1/include' -L'/software/external/intelmpi/oneapi/mpi/2021.1.1/lib/release' -L'/software/external/intelmpi/oneapi/mpi/2021.1.1/lib' -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker '/software/external/intelmpi/oneapi/mpi/2021.1.1/lib/release' -Xlinker -rpath -Xlinker '/software/external/intelmpi/oneapi/mpi/2021.1.1/lib' -lmpicxx -lmpifort -lmpi -lrt -lpthread -Wl,-z,now -Wl,-z,relro -Wl,-z,noexecstack -Xlinker --enable-new-dtags -ldl
$ I_MPI_CXX=icpc mpicxx -show
icpc -I'/software/external/intelmpi/oneapi/mpi/2021.1.1/include' -L'/software/external/intelmpi/oneapi/mpi/2021.1.1/lib/release' -L'/software/external/intelmpi/oneapi/mpi/2021.1.1/lib' -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker '/software/external/intelmpi/oneapi/mpi/2021.1.1/lib/release' -Xlinker -rpath -Xlinker '/software/external/intelmpi/oneapi/mpi/2021.1.1/lib' -lmpicxx -lmpifort -lmpi -ldl -lrt -lpthread
but I still blame cmake for that.
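My reading of that error (not verified against the FindMPI sources): the include directory parsed out of the -show output keeps the literal apostrophes, so it is not an existing path, and because the resulting string does not start with /, CMake resolves it relative to the deal.II source directory, which is exactly the non-existent path quoted in the error above. A quick illustrative check:
$ test -d "'/software/external/intelmpi/oneapi/mpi/2021.1.1/include'" || echo "no such directory"
no such directory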
Updating cmake from 3.16 to 3.20 fixes it... :-(