
Compile ElmerFEM natively for Apple Silicon?

ghostforest opened this issue 1 year ago • 21 comments

Is Elmer supposed to compile natively on M1/M2/M3 Macs? I did not get it working, so I switched to a Rosetta environment for compilation.

This is what worked:

arch -x86_64 /usr/bin/env bash --login
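A quick sanity check that the Rosetta shell took effect before configuring (uname -m is standard macOS tooling; under Rosetta it reports the emulated architecture):

uname -m    # prints x86_64 under Rosetta, arm64 in a native shell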

cmake -DCMAKE_C_COMPILER=/usr/local/Cellar/gcc/14.1.0_2/bin/gcc-14 \
      -DCMAKE_CXX_COMPILER=/usr/local/Cellar/gcc/14.1.0_2/bin/g++-14 \
      -DCMAKE_Fortran_COMPILER=/usr/local/Cellar/gcc/14.1.0_2/bin/gfortran \
      -DWITH_ELMERGUI=TRUE \
      -DWITH_OpenMP=TRUE \
      -DWITH_MPI=TRUE \
      -DMPI_C_COMPILER=/usr/local/opt/open-mpi/bin/mpicc \
      -DMPI_CXX_COMPILER=/usr/local/opt/open-mpi/bin/mpicxx \
      -DMPI_Fortran_COMPILER=/usr/local/opt/open-mpi/bin/mpifort \
      -DWITH_QT5=TRUE \
      -DWITH_QWT=FALSE \
      -DQWT_INCLUDE_DIR=/usr/local/opt/qwt/lib/qwt.framework/Headers \
      -DQWT_LIBRARY=/usr/local/opt/qwt/lib/qwt.framework/qwt \
      -DOCC_INCLUDE_DIR=$(brew --prefix opencascade)/include/opencascade \
      -DOCC_LIBRARY_DIR=$(brew --prefix opencascade)/lib \
      -DCMAKE_PREFIX_PATH=$(brew --prefix qt@5) \
      -DCMAKE_INSTALL_PREFIX=../install .. --log-level=DEBUG

make -j$(sysctl -n hw.logicalcpu)

make install


ghostforest · Jul 23 '24

Switching to Clang for compiling Elmer really means trouble: problems with Fortran, Open MPI...

ghostforest · Jul 24 '24

There is no problem compiling Open MPI with Clang, and you are free to combine Clang as the C and C++ compiler with gfortran as the Fortran compiler.

jeffhammond · Jul 24 '24

Normally when people say something doesn't compile, they include the error messages and log files associated with said failure.
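For reference, a minimal sketch of how to capture those logs when a configure or build fails (the tee file names here are arbitrary placeholders):

cmake .. --log-level=DEBUG 2>&1 | tee configure.log
make 2>&1 | tee build.log
# on configure failures, CMake also writes CMakeFiles/CMakeError.log and
# CMakeFiles/CMakeOutput.log in the build directory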

jeffhammond · Jul 24 '24

Maybe you can share the steps you take to compile Elmer, then? I spent four hours yesterday, and each solved problem led to another: first Open MPI was not compatible, then Open MPI was not found, then gfortran was not working, then UMFPACK. So if you got it working, I think it would be easier if you shared how you did it.

cmake -DCMAKE_C_COMPILER=/usr/bin/clang \
      -DCMAKE_CXX_COMPILER=/usr/bin/clang++ \
      -DCMAKE_Fortran_COMPILER=/opt/homebrew/bin/gfortran \
      -DWITH_ELMERGUI=TRUE \
      -DWITH_OpenMP=TRUE \
      -DWITH_MPI=TRUE \
      -DMPI_C_COMPILER=/opt/homebrew/bin/mpicc \
      -DMPI_CXX_COMPILER=/opt/homebrew/bin/mpicxx \
      -DMPI_Fortran_COMPILER=/opt/homebrew/bin/mpifort \
      -DMPIEXEC=/opt/homebrew/bin/mpiexec \
      -DQWT_INCLUDE_DIR=/opt/homebrew/opt/qwt/lib/qwt.framework/Headers \
      -DQWT_LIBRARY=/opt/homebrew/opt/qwt/lib/qwt.framework/qwt \
      -DOCC_INCLUDE_DIR=$(brew --prefix opencascade)/include/opencascade \
      -DOCC_LIBRARY_DIR=$(brew --prefix opencascade)/lib \
      -DCMAKE_PREFIX_PATH=$(brew --prefix qt@5) \
      -DCMAKE_INSTALL_PREFIX=../install \
      -DOpenMP_C_FLAGS="-Xpreprocessor -fopenmp -I$(brew --prefix libomp)/include" \
      -DOpenMP_C_LIB_NAMES="omp" \
      -DOpenMP_CXX_FLAGS="-Xpreprocessor -fopenmp -I$(brew --prefix libomp)/include" \
      -DOpenMP_CXX_LIB_NAMES="omp" \
      -DOpenMP_omp_LIBRARY=$(brew --prefix libomp)/lib/libomp.dylib \
      .. --log-level=DEBUG

CMake Error at /opt/homebrew/Cellar/cmake/3.30.1/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:233 (message):
  Could NOT find MPI (missing: MPI_Fortran_FOUND) (found version "3.1")
Call Stack (most recent call first):
  /opt/homebrew/Cellar/cmake/3.30.1/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:603 (_FPHSA_FAILURE_MESSAGE)
  /opt/homebrew/Cellar/cmake/3.30.1/share/cmake/Modules/FindMPI.cmake:1841 (find_package_handle_standard_args)
  CMakeLists.txt:230 (FIND_PACKAGE)

This can't be the correct cmake command. I'd assume just typing cmake .. from the build directory would work, but it does not.

ghostforest · Jul 24 '24

The macOS build on M2 is almost trivial:

git clone --recursive https://github.com/ElmerCSC/elmerfem.git
cd elmerfem
mkdir build
cd build
MPI_HOME=/opt/homebrew/Cellar/open-mpi/5.0.3_1 CMAKE_INSTALL_PREFIX=$HOME/Work/Apps/ELMER/install CFLAGS="-std=gnu89" cmake ..
cmake --build .
cmake --install .
ctest

jeffhammond · Jul 24 '24

You should not set OpenMP options on macOS if you use Apple Clang, which does not support OpenMP. Both Clang and GCC from Homebrew do, on the other hand.
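A quick way to tell which clang a build is picking up (the version strings shown are illustrative):

/usr/bin/clang --version                    # "Apple clang version ...": no -fopenmp support
$(brew --prefix llvm)/bin/clang --version   # "Homebrew clang version ...": supports -fopenmp via libomp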

jeffhammond · Jul 24 '24

Will it even compile with MPI without setting -DWITH_OpenMP=TRUE and -DWITH_MPI=TRUE?

Now I get this error when using your command:

12 warnings generated.
[  1%] Linking C shared library libmatc.dylib
[  1%] Built target matc
[  1%] Building C object matc/src/CMakeFiles/Matc_bin.dir/main.c.o
[  1%] Linking C executable matc
[  1%] Built target Matc_bin
[  1%] Built target umfpack_srcs
[  1%] Generating umfpack_zl_wsolve.c
In file included from /Users/User/Dev/elmerfem/umfpack/src/umfpack/umfpack_solve.c:31:
/Users/User/Dev/elmerfem/umfpack/src/umfpack/include/umf_internal.h:29:10: fatal error: 'string.h' file not found
#include <string.h>
         ^~~~~~~~~~
1 error generated.
make[2]: *** [umfpack/src/umfpack/umfpack_zl_wsolve.c] Error 1
make[2]: *** Deleting file `umfpack/src/umfpack/umfpack_zl_wsolve.c'
make[1]: *** [umfpack/src/umfpack/CMakeFiles/umfpack.dir/all] Error 2
make: *** [all] Error 2

ghostforest · Jul 24 '24

But that is the command I used...

I'd like to compile Elmer with the GUI and with Open MPI.

ghostforest · Jul 24 '24

Sometimes Apple breaks their C toolchain. If the C standard headers aren't found, try reinstalling the Xcode Command Line Tools.
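The usual reinstall sequence, for anyone hitting the same missing-header error (both commands are standard macOS tooling):

sudo rm -rf /Library/Developer/CommandLineTools
xcode-select --install
xcrun --show-sdk-path   # afterwards, this should print a valid SDK path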

jeffhammond · Jul 24 '24

I got it to compile, seemingly correctly, with:

MPI_HOME=/opt/homebrew/Cellar/open-mpi/5.0.3_1 \
cmake -DCMAKE_C_COMPILER=/opt/homebrew/opt/llvm/bin/clang \
      -DCMAKE_CXX_COMPILER=/opt/homebrew/opt/llvm/bin/clang++ \
      -DMPI_C_COMPILER=/opt/homebrew/opt/open-mpi/bin/mpicc \
      -DMPI_CXX_COMPILER=/opt/homebrew/opt/open-mpi/bin/mpicxx \
      -DMPI_Fortran_COMPILER=/opt/homebrew/opt/open-mpi/bin/mpifort \
      -DWITH_ELMERGUI=TRUE \
      -DWITH_QT5=TRUE \
      -DWITH_QWT=FALSE \
      -DWITH_OpenMP=TRUE \
      -DWITH_MPI=TRUE \
      -DQWT_INCLUDE_DIR=/usr/local/opt/qwt/lib/qwt.framework/Headers \
      -DQWT_LIBRARY=/usr/local/opt/qwt/lib/qwt.framework/qwt \
      -DOCC_INCLUDE_DIR=$(brew --prefix opencascade)/include/opencascade \
      -DOCC_LIBRARY_DIR=$(brew --prefix opencascade)/lib \
      -DCMAKE_PREFIX_PATH=$(brew --prefix qt@5) \
      -DCMAKE_C_FLAGS="-fopenmp" \
      -DCMAKE_CXX_FLAGS="-fopenmp" \
      -DCMAKE_EXE_LINKER_FLAGS="-L/opt/homebrew/opt/libomp/lib -lomp" \
      -DCMAKE_INSTALL_PREFIX=../install .. --log-level=DEBUG

However, these shell settings seem to be crucial:

export PATH="/opt/homebrew/opt/open-mpi/bin:$PATH"
export LDFLAGS="-L/opt/homebrew/opt/open-mpi/lib $LDFLAGS"
export CPPFLAGS="-I/opt/homebrew/opt/open-mpi/include $CPPFLAGS"
export OMPI_CC=/opt/homebrew/opt/llvm/bin/clang
export OMPI_CXX=/opt/homebrew/opt/llvm/bin/clang++
export OMPI_FC=/usr/local/bin/gfortran

# OpenMP settings
export LD_LIBRARY_PATH=/opt/homebrew/opt/libomp/lib:$LD_LIBRARY_PATH
export DYLD_LIBRARY_PATH=/opt/homebrew/opt/libomp/lib:$DYLD_LIBRARY_PATH 

However, only 87% of the tests passed: 122 tests failed out of 922.

The following tests FAILED:
	124 - CurvedBoundaryCylH_np3 (Failed)
	125 - CurvedBoundaryCylH_np8 (Failed)
	243 - FixTangentVelo (Failed)
	248 - H1BasisEvaluation (Failed)
	249 - HarmonicNS (Failed)
	264 - HelmholtzFEM (Failed)
	271 - HelmholtzStructure (Failed)
	272 - HelmholtzStructure2 (Failed)
	273 - HelmholtzStructure3 (Failed)
	275 - Hybrid2dMeshPartitionMetis_np8 (Failed)
	276 - Hybrid2dMeshPartitionMetisConnect_np8 (Failed)
	278 - HydrostaticNSVec-ISMIP-HOM-C (Failed)
	285 - InductionHeating2 (Failed)
	286 - InductionHeating3 (Failed)
	339 - MazeMeshPartitionMetisContig_np6 (Failed)
	355 - ModelPDEthreaded (Failed)
	357 - MonolithicSlave2 (Failed)
	381 - NonnewtonianChannelFlow_vec (Failed)
	430 - PlatesEigenComplex (Failed)
	431 - PlatesHarmonic (Failed)
	503 - SD_H1BasisEvaluation (Failed)
	504 - SD_HarmonicNS (Failed)
	505 - SD_LinearFormsAssembly (Failed)
	511 - SD_NonnewtonianChannelFlow_vec (Failed)
	525 - SD_Step_stokes_heat_vec (Failed)
	526 - SD_Step_stokes_vec (Failed)
	527 - SD_Step_stokes_vec_blockprec (Failed)
	567 - Shell_with_Solid_Beam_EigenComplex (Failed)
	573 - ShoeboxFsiHarmonicPlate (Failed)
	574 - ShoeboxFsiStatic (Failed)
	576 - ShoeboxFsiStaticShell (Failed)
	587 - StatCurrentVec2 (Failed)
	600 - Step_stokes_heat_vec (Failed)
	601 - Step_stokes_vec (Failed)
	602 - Step_stokes_vec_blockprec (Failed)
	611 - StressConstraintModes3 (Failed)
	616 - TEAM30a_3ph_transient (Failed)
	637 - VectorHelmholtzImpMatrix (Failed)
	638 - VectorHelmholtzWaveguide (Failed)
	639 - VectorHelmholtzWaveguide2 (Failed)
	640 - VectorHelmholtzWaveguide3 (Failed)
	641 - VectorHelmholtzWaveguide4 (Failed)
	643 - VectorHelmholtzWaveguideNodal (Failed)
	644 - VectorHelmholtzWaveguideQuadBlock (Failed)
	645 - VectorHelmholtzWaveguide_TM (Failed)
	655 - WinkelPartitionMetis_np8 (Failed)
	656 - WinkelPartitionMetisConnect_np8 (Failed)
	657 - WinkelPartitionMetisRec_np8 (Failed)
	673 - Zirka (Failed)
	692 - circuits2D_harmonic_foil (Failed)
	693 - circuits2D_harmonic_london (Failed)
	694 - circuits2D_harmonic_massive (Failed)
	695 - circuits2D_harmonic_stranded (Failed)
	696 - circuits2D_harmonic_stranded_explicit_coil_resistance (Failed)
	697 - circuits2D_harmonic_stranded_homogenization (Failed)
	698 - circuits2D_scan_harmonics (Failed)
	699 - circuits2D_transient_foil (Failed)
	700 - circuits2D_transient_london (Failed)
	701 - circuits2D_transient_massive (Failed)
	702 - circuits2D_transient_stranded (Failed)
	703 - circuits2D_transient_variable_resistor (Failed)
	704 - circuits2D_with_hysteresis (Failed)
	705 - circuits2D_with_hysteresis_axi (Failed)
	706 - circuits_harmonic_foil (Failed)
	707 - circuits_harmonic_foil_anl_rotm (Failed)
	708 - circuits_harmonic_foil_wvector (Failed)
	709 - circuits_harmonic_homogenization_coil_solver (Failed)
	710 - circuits_harmonic_massive (Failed)
	711 - circuits_harmonic_stranded (Failed)
	712 - circuits_harmonic_stranded_homogenization (Failed)
	742 - freesurf_maxd_np4 (Failed)
	745 - freesurf_maxd_local_np4 (Failed)
	770 - linearsolvers_cmplx (Failed)
	772 - mgdyn2D_compute_average_b (Failed)
	773 - mgdyn2D_compute_bodycurrent (Failed)
	774 - mgdyn2D_compute_complex_power (Failed)
	775 - mgdyn2D_em (Failed)
	776 - mgdyn2D_em_conforming (Failed)
	777 - mgdyn2D_em_harmonic (Failed)
	778 - mgdyn2D_harmonic_anisotropic_permeability (Failed)
	779 - mgdyn2D_pm (Failed)
	780 - mgdyn2D_scan_homogenization_elementary_solutions (Failed)
	781 - mgdyn2D_steady_wire (Failed)
	782 - mgdyn_3phase (Failed)
	786 - mgdyn_airgap_force_np2 (Failed)
	787 - mgdyn_airgap_harmonic (Failed)
	803 - mgdyn_harmonic (Failed)
	804 - mgdyn_harmonic_loss (Failed)
	805 - mgdyn_harmonic_wire (Failed)
	806 - mgdyn_harmonic_wire_Cgauge (Failed)
	807 - mgdyn_harmonic_wire_Cgauge_automatic (Failed)
	808 - mgdyn_harmonic_wire_impedanceBC (Failed)
	809 - mgdyn_harmonic_wire_impedanceBC2 (Failed)
	811 - mgdyn_lamstack_lowfreq_harmonic (Failed)
	813 - mgdyn_lamstack_widefreq_harmonic (Failed)
	815 - mgdyn_nodalforce2d (Failed)
	818 - mgdyn_steady_coils (Failed)
	824 - mgdyn_steady_quad_extruded_restart (Failed)
	825 - mgdyn_steady_quad_extruded_restart_np3 (Failed)
	832 - mgdyn_thinsheet_harmonic (Failed)
	836 - mgdyn_torus_harmonic (Failed)
	839 - mgdyn_wave_eigen (Failed)
	840 - mydyn_wave_harmonic (Failed)
	866 - pointload2 (Failed)
	872 - radiation (Failed)
	873 - radiation2 (Failed)
	874 - radiation2d (Failed)
	875 - radiation2dAA (Failed)
	876 - radiation2d_deform (Failed)
	877 - radiation2d_spectral (Failed)
	878 - radiation2dsymm (Failed)
	879 - radiation3d (Failed)
	880 - radiation_bin (Failed)
	881 - radiation_dg (Failed)
	882 - radiator2d (Failed)
	883 - radiator3d (Failed)
	884 - radiator3d_box (Failed)
	885 - radiator3d_box2 (Failed)
	886 - radiator3d_open (Failed)
	887 - radiator3d_radiosity (Failed)
	888 - radiator3d_spectral (Failed)
	889 - radiator3d_symm (Failed)

Again: anything but trivial. I'm not a programmer, though, and I wanted it to compile with Open MPI and ElmerGUI. Is there info on which tests are supposed to fail? Does Elmer operate as expected with those tests failing?
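One way to triage a list like this is to rerun individual tests with full output; ctest supports this out of the box (the test name below is just one example taken from the list above):

ctest -R radiation2d --output-on-failure
ctest --rerun-failed --output-on-failure   # rerun only the previously failed tests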

ghostforest · Jul 24 '24

Hi, I'm also hoping to run Elmer on Apple silicon with the GUI enabled.

My system is an M1 running macOS 12.6.3, using:

  • homebrew cmake 3.30.1
  • Apple clang 14.0.0
  • Apple c++ 14.0.0
  • homebrew gfortran 14.0.1

I got a much simpler config to work than @ghostforest's above:

$ export SDKROOT=/Library/Developer/CommandLineTools/SDKs/MacOSX13.1.sdk
$ MPI_HOME=/opt/homebrew/Cellar/open-mpi/5.0.3_1/ \
  CMAKE_INSTALL_PREFIX=$ELMER_ENV_ROOT/install \
  CFLAGS="-std=gnu89" \
  CMAKE_PREFIX_PATH=/opt/homebrew/Cellar/qt@5/5.15.13_1/lib/cmake \
  cmake -DWITH_QT5:BOOL=ON -DWITH_QWT:BOOL=OFF -DWITH_ELMERGUI:BOOL=ON -DWITH_PARAVIEW:BOOL=ON ../
$ make -j4 install
(SDKROOT solves a bizarre compile error; see the full message below.)

cmake found the correct header file location in CMAKE_OSX_SYSROOT, but something else in the build system ignored that in favor of who knows what:

$ make install
[...]
14 warnings generated.
[  1%] Building C object matc/src/CMakeFiles/matc.dir/str.c.o
[  1%] Building C object matc/src/CMakeFiles/matc.dir/urand.c.o
[  1%] Building C object matc/src/CMakeFiles/matc.dir/variable.c.o
[  1%] Linking C shared library libmatc.dylib
[  1%] Built target matc
[  1%] Building C object matc/src/CMakeFiles/Matc_bin.dir/main.c.o
[  1%] Linking C executable matc
[  1%] Built target Matc_bin
[  1%] Built target umfpack_srcs
[  1%] Generating umfpack_zl_wsolve.c
In file included from [path/to]/elmer/elmerfem/umfpack/src/umfpack/umfpack_solve.c:31:
[path/to]/elmer/elmerfem/umfpack/src/umfpack/include/umf_internal.h:29:10: fatal error: 'string.h' file not found
#include <string.h>
         ^~~~~~~~~~
1 error generated.
make[2]: *** [umfpack/src/umfpack/umfpack_zl_wsolve.c] Error 1
make[2]: *** Deleting file `umfpack/src/umfpack/umfpack_zl_wsolve.c'
make[1]: *** [umfpack/src/umfpack/CMakeFiles/umfpack.dir/all] Error 2
make: *** [all] Error 2
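For what it's worth, a hedged alternative to hard-coding the SDK version (as in the MacOSX13.1.sdk path above) is to derive SDKROOT from the toolchain, assuming the Command Line Tools are installed:

export SDKROOT=$(xcrun --show-sdk-path)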

But I am also seeing failing tests:

93% tests passed, 69 tests failed out of 922
[...]
The following tests FAILED:
	124 - CurvedBoundaryCylH_np3 (Failed)
	125 - CurvedBoundaryCylH_np8 (Failed)
	243 - FixTangentVelo (Failed)
	249 - HarmonicNS (Failed)
	264 - HelmholtzFEM (Failed)
	271 - HelmholtzStructure (Failed)
	272 - HelmholtzStructure2 (Failed)
	273 - HelmholtzStructure3 (Failed)
	275 - Hybrid2dMeshPartitionMetis_np8 (Failed)
	276 - Hybrid2dMeshPartitionMetisConnect_np8 (Failed)
	278 - HydrostaticNSVec-ISMIP-HOM-C (Failed)
	286 - InductionHeating3 (Failed)
	339 - MazeMeshPartitionMetisContig_np6 (Failed)
	430 - PlatesEigenComplex (Failed)
	431 - PlatesHarmonic (Failed)
	504 - SD_HarmonicNS (Failed)
	512 - SD_P2ndDerivatives (Failed)
	567 - Shell_with_Solid_Beam_EigenComplex (Failed)
	573 - ShoeboxFsiHarmonicPlate (Failed)
	574 - ShoeboxFsiStatic (Failed)
	576 - ShoeboxFsiStaticShell (Failed)
	611 - StressConstraintModes3 (Failed)
	637 - VectorHelmholtzImpMatrix (Failed)
	638 - VectorHelmholtzWaveguide (Failed)
	639 - VectorHelmholtzWaveguide2 (Failed)
	640 - VectorHelmholtzWaveguide3 (Failed)
	641 - VectorHelmholtzWaveguide4 (Failed)
	643 - VectorHelmholtzWaveguideNodal (Failed)
	644 - VectorHelmholtzWaveguideQuadBlock (Failed)
	645 - VectorHelmholtzWaveguide_TM (Failed)
	655 - WinkelPartitionMetis_np8 (Failed)
	656 - WinkelPartitionMetisConnect_np8 (Failed)
	657 - WinkelPartitionMetisRec_np8 (Failed)
	697 - circuits2D_harmonic_stranded_homogenization (Failed)
	698 - circuits2D_scan_harmonics (Failed)
	706 - circuits_harmonic_foil (Failed)
	707 - circuits_harmonic_foil_anl_rotm (Failed)
	708 - circuits_harmonic_foil_wvector (Failed)
	709 - circuits_harmonic_homogenization_coil_solver (Failed)
	710 - circuits_harmonic_massive (Failed)
	711 - circuits_harmonic_stranded (Failed)
	712 - circuits_harmonic_stranded_homogenization (Failed)
	742 - freesurf_maxd_np4 (Failed)
	745 - freesurf_maxd_local_np4 (Failed)
	770 - linearsolvers_cmplx (Failed)
	772 - mgdyn2D_compute_average_b (Failed)
	773 - mgdyn2D_compute_bodycurrent (Failed)
	774 - mgdyn2D_compute_complex_power (Failed)
	777 - mgdyn2D_em_harmonic (Failed)
	778 - mgdyn2D_harmonic_anisotropic_permeability (Failed)
	780 - mgdyn2D_scan_homogenization_elementary_solutions (Failed)
	782 - mgdyn_3phase (Failed)
	786 - mgdyn_airgap_force_np2 (Failed)
	787 - mgdyn_airgap_harmonic (Failed)
	803 - mgdyn_harmonic (Failed)
	804 - mgdyn_harmonic_loss (Failed)
	805 - mgdyn_harmonic_wire (Failed)
	806 - mgdyn_harmonic_wire_Cgauge (Failed)
	807 - mgdyn_harmonic_wire_Cgauge_automatic (Failed)
	808 - mgdyn_harmonic_wire_impedanceBC (Failed)
	809 - mgdyn_harmonic_wire_impedanceBC2 (Failed)
	811 - mgdyn_lamstack_lowfreq_harmonic (Failed)
	813 - mgdyn_lamstack_widefreq_harmonic (Failed)
	818 - mgdyn_steady_coils (Failed)
	832 - mgdyn_thinsheet_harmonic (Failed)
	836 - mgdyn_torus_harmonic (Failed)
	839 - mgdyn_wave_eigen (Failed)
	840 - mydyn_wave_harmonic (Failed)
	866 - pointload2 (Failed)

I notice that the most recent full-test Action on the primary branch had 5 failing tests, so perhaps a few failing tests aren't a dealbreaker, but 10x to 25x that many feels suspicious.

Regardless, I got a tutorial to run all the way through, so if you (or any future travelers) need an exemplar, the above can serve.

krivard · Jul 26 '24

Thanks for that addition. Yes, mine is quite complicated. What does -DWITH_PARAVIEW:BOOL=ON do? Is your Elmer multicore-capable without setting OpenMP=TRUE?

Note that make -j$(sysctl -n hw.logicalcpu) will utilize the full CPU for compiling. The same goes for ctest: ctest -j$(sysctl -n hw.logicalcpu).

By the way, your command does not work for me.

Interestingly, enabling ParaView leads to more tests failing later; likewise with MPI and OpenMP.

ghostforest · Jul 26 '24

Paraview is a postprocessor/visualization engine that lets you display the results of a solve. It's referenced in the GUI tutorials (http://www.nic.funet.fi/pub/sci/physics/elmer/doc/ElmerTutorials.pdf) and has to be installed separately (https://www.paraview.org/download/) in such a way that running paraview from the command line starts the application. Elmer support for it turns out to be primarily a convenience: it adds a button to the Elmer GUI that starts Paraview with your solve's vtu file, instead of you having to open Paraview and load the vtu yourself.

As far as multicore goes, parallelization does work and does help, but I'm not super familiar with the differences between Open MPI, OpenMP, and MPI, and I'm not certain it is, strictly speaking, using OpenMP to do it. When I do Run -> Parallel settings -> Use parallel solver, it claims to be using mpirun, and my solve times do indeed drop. Hopefully that gives you the information you need; if not, I'm happy to send output from diagnostic commands if you like.

krivard · Jul 26 '24

AFAIK, MPI is for clustering multiple machines, whereas OpenMP is a parallelization library for a single machine. OpenMP is a million times easier to implement than pthreads, since its parallelization happens at compile time. Pthreads provide more control but require much more code.

ghostforest · Jul 26 '24

MPI is historically the way to go with Elmer (and many other FEM codes). Once the problem is distributed, the parallelization is trivial (meaning all the physics, assembly, etc.). Only the solution of the linear system requires communication, but this is more or less transparent to the user. There is some stuff that has not been implemented with MPI (e.g. view factors and general mortar methods). OpenMP is easier, but it has to be done at every bottleneck. If you consider that there are 500k lines of code, you can understand that it has not been implemented everywhere and that lots of code runs on a single thread.
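To make that concrete, a sketch of the typical MPI workflow with Elmer (the mesh directory name and partition count are placeholders; exact arguments vary by case):

ElmerGrid 2 2 mymesh -metis 4            # partition an existing Elmer mesh into 4 pieces
mpirun -np 4 ElmerSolver_mpi case.sif    # run the parallel solver on the partitions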

raback · Aug 02 '24

Yes! Nobody is complaining about the state of Elmer or demanding that it provide more parallelization. The discussion was "how to compile Elmer with parallelization capabilities on macOS", and that, at least for my part, was solved. Elmer compiles with MPI and OpenMP, though more tests fail than without the parallelizations compiled. I guess some are meant to fail, and some are maybe small errors in calculation due to different libraries on macOS. Since Apple has to do everything differently...

Thank you for the insight this was very interesting to read.

ghostforest · Aug 02 '24

Elmer is now being built regularly on macOS (Intel and Apple Silicon) in CI: https://github.com/ElmerCSC/elmerfem/blob/devel/.github/workflows/build-macos-homebrew.yaml

More tests pass when using OpenBLAS instead of Apple Accelerate. Additionally, make sure that you don't exceed the number of physical cores when using MPI (e.g., -DMPI_TEST_MAXPROC=2 for the CI runners).
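For anyone following along, a sketch of the corresponding configure flags (paths assume Homebrew; BLA_VENDOR is the standard CMake FindBLAS selector and also appears in the full script later in this thread):

cmake .. \
  -DBLA_VENDOR=OpenBLAS \
  -DCMAKE_PREFIX_PATH="$(brew --prefix)/opt/openblas" \
  -DMPI_TEST_MAXPROC=2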

mmuetzel · Sep 06 '24

I attempted to install Elmer as a dependency of FreeCAD. It does not run on macOS on Apple silicon, despite what the CI scripts suggest. There is always some sort of error from combining x86 and arm64 code into the same executable. This is my second time attempting, this time directly copying the CI script. I have given up. I will have to use alternative means to simulate electrostatics, such as an analogy to the heat equation in a FEM program that compiles and runs.

https://gist.github.com/philipturner/2240a80d2d16f793fb3a25d56499203b


philipturner · Feb 05 '25

In case someone is still struggling to get Elmer to work, here's a full script:

#!/bin/bash

# Ensure Homebrew is up to date
brew update

# Install necessary dependencies
brew install --overwrite [email protected] [email protected]
brew reinstall gcc
brew install cmake openblas open-mpi qwt vtk opencascade

# Define build directory
BUILD_DIR="$HOME/elmerfem-build"
SOURCE_DIR="$HOME/elmerfem" # Replace with the actual path to your ElmerFEM source

# Create and navigate to the build directory
mkdir -p "$BUILD_DIR"
cd "$BUILD_DIR"

# Configure CMake (without OpenMP)
cmake \
  -DCMAKE_BUILD_TYPE=Release \
  -DCMAKE_C_COMPILER=clang \
  -DCMAKE_CXX_COMPILER=clang++ \
  -DCMAKE_Fortran_COMPILER=gfortran \
  -DCMAKE_INSTALL_PREFIX="$HOME/elmerfem-install" \
  -DBLA_VENDOR="OpenBLAS" \
  -DUSE_MACOS_PACKAGE_MANAGER=OFF \
  -DCMAKE_PREFIX_PATH="$(brew --prefix)/opt/openblas;$(brew --prefix)/opt/qt;$(brew --prefix)/opt/qwt" \
  -DWITH_OpenMP=OFF \
  -DWITH_LUA=ON \
  -DWITH_MPI=ON \
  -DMPI_TEST_MAXPROC=2 \
  -DWITH_Zoltan=OFF \
  -DWITH_Mumps=OFF \
  -DWITH_CHOLMOD=OFF \
  -DWITH_ElmerIce=ON \
  -DWITH_ELMERGUI=ON \
  -DWITH_QT6=ON \
  -DQWT_INCLUDE_DIR="$(brew --prefix)/opt/qwt/lib/qwt.framework/Headers" \
  -DWITH_VTK=ON \
  -DWITH_OCC=ON \
  -DWITH_MATC=ON \
  -DWITH_PARAVIEW=ON \
  -DCREATE_PKGCONFIG_FILE=ON \
  "$SOURCE_DIR"

# Build ElmerFEM
cmake --build . -j$(sysctl -n hw.logicalcpu)

# Install ElmerFEM
cmake --install .

# Run CTest (excluding slow tests)
CTEST_OUTPUT_ON_FAILURE=1 ctest . -LE slow -j$(sysctl -n hw.logicalcpu) --timeout 300

None of the tests failed.

jaydeshpande · Mar 12 '25

The issue stems from an OS update. The build error comes from an executable in the recently updated compiler toolchain. It works on older versions of macOS, as evidenced by the GitHub workflows running successfully. It does not work on the newest version (15). What major and minor version of macOS were you on when you ran the code above?

philipturner · Mar 13 '25

I am on 15.3

The installation script ran without any hiccups. Note that the script is derived from Elmer's GitHub test script.

jaydeshpande · Mar 13 '25

It's still failing to build, with the exact same error as before. Note that I had to disable MPI, Qt6, and the features that depend on them to get past the errors in the # Configure CMake (without OpenMP) step.

#!/bin/bash

## Ensure Homebrew is up to date
#brew update
#
## Install necessary dependencies
#brew install --overwrite [email protected] [email protected]
#brew reinstall gcc
#brew install cmake openblas open-mpi qwt vtk opencascade
#
## Download from Git
#git clone --single-branch --branch devel https://github.com/ElmerCSC/elmerfem

# Define build directory
BUILD_DIR="$(pwd)/elmerfem-build"
SOURCE_DIR="$(pwd)/elmerfem"
INSTALL_DIR="$(pwd)/elmerfem-install"

# Create and navigate to the build directory
mkdir "$BUILD_DIR"
cd "$BUILD_DIR"

# Configure CMake (without OpenMP)
cmake \
  -DCMAKE_BUILD_TYPE=Release \
  -DCMAKE_C_COMPILER=clang \
  -DCMAKE_CXX_COMPILER=clang++ \
  -DCMAKE_Fortran_COMPILER=gfortran \
  -DCMAKE_INSTALL_PREFIX="$INSTALL_DIR" \
  -DBLA_VENDOR="OpenBLAS" \
  -DUSE_MACOS_PACKAGE_MANAGER=OFF \
  -DCMAKE_PREFIX_PATH="$(brew --prefix)/opt/openblas;$(brew --prefix)/opt/qt;$(brew --prefix)/opt/qwt" \
  -DWITH_OpenMP=OFF \
  -DWITH_LUA=ON \
  -DWITH_MPI=OFF \
  -DMPI_TEST_MAXPROC=2 \
  -DWITH_Zoltan=OFF \
  -DWITH_Mumps=OFF \
  -DWITH_CHOLMOD=OFF \
  -DWITH_ElmerIce=OFF \
  -DWITH_ELMERGUI=OFF \
  -DWITH_QT6=OFF \
  -DQWT_INCLUDE_DIR="$(brew --prefix)/opt/qwt/lib/qwt.framework/Headers" \
  -DWITH_VTK=OFF \
  -DWITH_OCC=ON \
  -DWITH_MATC=ON \
  -DWITH_PARAVIEW=OFF \
  -DCREATE_PKGCONFIG_FILE=ON \
  "$SOURCE_DIR"

# Build ElmerFEM
cmake --build . -j1
#
## Install ElmerFEM
#cmake --install .
#
## Run CTest (excluding slow tests)
#CTEST_OUTPUT_ON_FAILURE=1 ctest . -LE slow -j$(sysctl -n hw.perflevel0.physicalcpu) --timeout 300

Errors:

[ 41%] Built target arpack
Scanning dependencies of target binio
Consolidate compiler generated dependencies of target binio
[ 41%] Building C object fem/src/binio/CMakeFiles/binio.dir/binio.c.o
[ 42%] Linking Fortran static library libbinio.a
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(biniomod.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(kinds.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(biniomod.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(kinds.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
make[2]: *** [fem/src/binio/libbinio.a] Error 1
make[2]: *** Deleting file `fem/src/binio/libbinio.a'
make[1]: *** [fem/src/binio/CMakeFiles/binio.dir/all] Error 2
make: *** [all] Error 2
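For the record, those cputype values decode to real architectures: 16777228 is CPU_TYPE_ARM64 and 16777223 is CPU_TYPE_X86_64, i.e. the Fortran objects and the C objects in the archive target different CPUs. One way to confirm which objects are which (paths taken from the log above):

file fem/src/binio/CMakeFiles/binio.dir/*.o   # prints the architecture of each object file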

Are you sure you are on an Apple silicon Mac running 15.3, and not an Intel Mac running 15.3?

I am on 15.5 now.

philipturner · Jun 27 '25

I just realized that the person who opened this issue got it to work via Rosetta. I don't care about speed that much, mostly whether it works at all. Even something compiled to x86_64 instructions would be good enough.

As the compiler error stems from a mismatch between CPU architectures (arm64 vs x86_64), I assume the person who got it working was using an Intel Mac with macOS 15.3.

philipturner · Jul 02 '25

@raback, I successfully installed and ran Elmer on an Apple M3 Max chip (Apple Silicon) by replacing Open MPI with MPICH. Would it be beneficial to update the compilation instructions for macOS Apple Silicon in a pull request?

Raunak-Singh-Inventor · Jul 11 '25

> @raback, I successfully installed and ran Elmer on an Apple M3 Max chip (Apple Silicon) by replacing Open MPI with MPICH. Would it be beneficial to update the compilation instructions for macOS Apple Silicon in a pull request?

Could you provide the exact Bash script you used?

philipturner · Jul 12 '25

This is basically how I got it to work:

sudo port install mpich
brew install qt@5 qwt-qt5 cmake
sudo port install gcc13

# Install MUMPS
git clone https://github.com/scivision/mumps.git
cd mumps
cmake -Dparallel=true -Darith=d -Dmetis=true -B build
cmake --build build
mkdir lib
cp build/*.a lib/
ln -s build/mumps-src/include include
cd ..

git clone https://github.com/ElmerCSC/elmerfem
cd elmerfem
rm -rf build
mkdir build
cd build

# not sure if the line below is needed
export PATH="/opt/homebrew/opt/qt@5/bin:$PATH"

cmake .. \
  -DCMAKE_BUILD_TYPE=Release \
  -DCMAKE_INSTALL_PREFIX=$HOME/elmer \
  -DWITH_ELMERGUI=TRUE \
  -DWITH_QT5=TRUE \
  -DWITH_MPI=TRUE \
  -DHOMEBREW_PREFIX=/opt/homebrew \
  -DCMAKE_Fortran_COMPILER=/opt/local/bin/gfortran-mp-13 \
  -DCMAKE_C_COMPILER=clang \
  -DCMAKE_CXX_COMPILER=clang++ \
  -DMPI_C_COMPILER=/opt/homebrew/bin/mpicc \
  -DMPI_CXX_COMPILER=/opt/homebrew/bin/mpicxx \
  -DCMAKE_PREFIX_PATH="/opt/homebrew/opt/qt@5;/opt/homebrew/opt/qwt-qt5" \
  -DQWT_INCLUDE_DIR=/opt/homebrew/opt/qwt-qt5/lib/qwt.framework/Headers \
  -DQWT_LIBRARY=/opt/homebrew/opt/qwt-qt5/lib/qwt.framework/qwt

make -j$(sysctl -n hw.logicalcpu)
make install

# Run Elmer GUI
export DYLD_FRAMEWORK_PATH=/opt/homebrew/opt/qt@5/lib
export QT_PLUGIN_PATH=/opt/homebrew/opt/qt@5/plugins
~/elmer/bin/ElmerGUI

Raunak-Singh-Inventor · Jul 12 '25

I got the first 3 lines to work, but hit lots of errors with MUMPS and MPICH. I tried all the paths, including /opt/homebrew/bin/mpicc, /opt/local/bin/mpicc-mpich-mp, and the three suggestions from the Google AI overview:

  • cmake -DMPI_C_COMPILER=/path/to/mpicc -DMPI_C_INCLUDE_PATH=/path/to/mpi/include ..
  • export MPI_HOME=/path/to/mpi/installation
  • cmake -DCMAKE_PREFIX_PATH=/path/to/mpi/lib ..
(base) philipturner@macbookpro mumps % cmake -Dparallel=true -Darith=d -Dmetis=true -B build
-- MUMPS 5.8.0.3 upstream 5.8.0 install prefix: /usr/local
-- CMake 3.20.0  Toolchain 
-- MUMPS source URL: https://mumps-solver.org/MUMPS_5.8.0.tar.gz
-- Could NOT find MPI_C (missing: MPI_C_WORKS) 
CMake Error at /usr/local/Cellar/cmake/3.20.0/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
  Could NOT find MPI (missing: MPI_C_FOUND C) (found version "3.1")
Call Stack (most recent call first):
  /usr/local/Cellar/cmake/3.20.0/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:594 (_FPHSA_FAILURE_MESSAGE)
  /usr/local/Cellar/cmake/3.20.0/share/cmake/Modules/FindMPI.cmake:1741 (find_package_handle_standard_args)
  CMakeLists.txt:48 (find_package)


-- Configuring incomplete, errors occurred!
See also "/Users/philipturner/Documents/Elmer/mumps/build/CMakeFiles/CMakeOutput.log".
See also "/Users/philipturner/Documents/Elmer/mumps/build/CMakeFiles/CMakeError.log".

MPICH is an MPI implementation geared toward supercomputers, but a Mac isn't a supercomputer. Wouldn't it do just fine with open-mpi from Homebrew? Previous people on this thread were not using MacPorts.
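As an aside, both implementations can report what their compiler wrappers actually invoke, which helps untangle which MPI and which underlying compiler is in play:

mpicc -show      # MPICH: prints the underlying compiler and flags
mpicc --showme   # Open MPI: same idea, different flag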

philipturner · Jul 15 '25

MPICH also doesn't work with Elmer. What's different about your Mac that makes it behave differently from mine?

cmake .. \
  -DCMAKE_BUILD_TYPE=Release \
  -DCMAKE_INSTALL_PREFIX=/Users/philipturner/Documents/Elmer/elmer \
  -DWITH_ELMERGUI=TRUE \
  -DWITH_QT5=TRUE \
  -DWITH_MPI=TRUE \
  -DHOMEBREW_PREFIX=/opt/homebrew \
  -DCMAKE_Fortran_COMPILER=/opt/local/bin/gfortran-mp-13 \
  -DCMAKE_C_COMPILER=clang \
  -DCMAKE_CXX_COMPILER=clang++ \
  -DMPI_C_COMPILER=/opt/homebrew/bin/mpicc \
  -DMPI_CXX_COMPILER=/opt/homebrew/bin/mpicxx \
  -DCMAKE_PREFIX_PATH="/opt/homebrew/opt/qt@5;/opt/homebrew/opt/qwt-qt5" \
  -DQWT_INCLUDE_DIR=/opt/homebrew/opt/qwt-qt5/lib/qwt.framework/Headers \
  -DQWT_LIBRARY=/opt/homebrew/opt/qwt-qt5/lib/qwt.framework/qwt

(base) philipturner@macbookpro build % bash ../../script.sh
-- The Fortran compiler identification is GNU 13.3.0
-- The C compiler identification is AppleClang 17.0.0.17000013
-- The CXX compiler identification is AppleClang 17.0.0.17000013
-- Checking whether Fortran compiler has -isysroot
-- Checking whether Fortran compiler has -isysroot - yes
-- Checking whether Fortran compiler supports OSX deployment target flag
-- Checking whether Fortran compiler supports OSX deployment target flag - yes
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Check for working Fortran compiler: /opt/local/bin/gfortran-mp-13 - skipped
-- Checking whether /opt/local/bin/gfortran-mp-13 supports Fortran 90
-- Checking whether /opt/local/bin/gfortran-mp-13 supports Fortran 90 - yes
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/clang - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/clang++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Checking whether GFortran version >= 7 -- yes
-- Could NOT find MPI_C (missing: MPI_C_WORKS) 
-- Could NOT find MPI_CXX (missing: MPI_CXX_WORKS) 
-- Found MPI_Fortran: /opt/homebrew/Cellar/open-mpi/5.0.7/lib/libmpi_usempif08.dylib (found version "3.1") 
CMake Error at /usr/local/Cellar/cmake/3.20.0/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
  Could NOT find MPI (missing: MPI_C_FOUND MPI_CXX_FOUND) (found version
  "3.1")
Call Stack (most recent call first):
  /usr/local/Cellar/cmake/3.20.0/share/cmake/Modules/FindPackageHandleStandardArgs.cmake:594 (_FPHSA_FAILURE_MESSAGE)
  /usr/local/Cellar/cmake/3.20.0/share/cmake/Modules/FindMPI.cmake:1741 (find_package_handle_standard_args)
  CMakeLists.txt:219 (FIND_PACKAGE)


-- Configuring incomplete, errors occurred!
See also "/Users/philipturner/Documents/Elmer/elmerfem/build/CMakeFiles/CMakeOutput.log".
See also "/Users/philipturner/Documents/Elmer/elmerfem/build/CMakeFiles/CMakeError.log".
(base) philipturner@macbookpro build % 

The large cmake line works when I add -DWITH_MPI=OFF.
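A minimal smoke test for the MPI wrapper itself, which is essentially what FindMPI's MPI_C_WORKS probe does (the file name is a placeholder); if this fails, or produces a binary for the wrong architecture, cmake will fail the same way:

cat > hello_mpi.c <<'EOF'
#include <mpi.h>
#include <stdio.h>
int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    printf("hello from rank %d\n", rank);
    MPI_Finalize();
    return 0;
}
EOF
/opt/homebrew/bin/mpicc hello_mpi.c -o hello_mpi && file hello_mpi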

philipturner · Jul 15 '25

I still get the exact same error as before. It cannot be true that two people straight-up got Elmer to compile with no special trickery.

[ 36%] Built target mpi_stubs
Scanning dependencies of target binio
[ 36%] Building Fortran object fem/src/binio/CMakeFiles/binio.dir/kinds.f90.o
[ 36%] Building Fortran object fem/src/binio/CMakeFiles/binio.dir/biniomod.f90.o
[ 36%] Building C object fem/src/binio/CMakeFiles/binio.dir/binio.c.o
[ 36%] Linking Fortran static library libbinio.a
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(biniomod.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(kinds.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(biniomod.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(kinds.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
make[2]: *** [fem/src/binio/libbinio.a] Error 1
make[2]: *** Deleting file `fem/src/binio/libbinio.a'
make[1]: *** [fem/src/binio/CMakeFiles/binio.dir/all] Error 2
make: *** [all] Error 2

philipturner · Jul 15 '25

I did the special x86_64 steps to make Rosetta do its thing. Note that this command changes the terminal prompt to look like macbookpro:~ philipturner$, which might misleadingly imply that we've switched to an x86 "environment" that solves the cputype issue, the problem that has been there since the beginning of this thread.

Last login: Tue Jul 15 14:19:09 on ttys002
(base) philipturner@macbookpro ~ % echo "a command"
(base) philipturner@macbookpro ~ % echo "a command"
(base) philipturner@macbookpro ~ % arch -x86_64 /usr/bin/env bash --login
macbookpro:~ philipturner$ echo "a command"
macbookpro:~ philipturner$ echo "a command"
macbookpro:~ philipturner$ 

Surely, x86_64 cross-compilation will make it work on an average person's Mac. You can see the macbookpro:build philipturner$ prompt at the end as proof that this ran in the x86_64 shell.

[ 35%] Building Fortran object mathlibs/src/arpack/CMakeFiles/arpack.dir/zmout.f.o
[ 35%] Building Fortran object mathlibs/src/arpack/CMakeFiles/arpack.dir/zvout.f.o
[ 35%] Linking Fortran shared library libarpack.dylib
[ 37%] Built target arpack
Consolidate compiler generated dependencies of target binio
[ 37%] Linking Fortran static library libbinio.a
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(biniomod.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(kinds.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(biniomod.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib: archive member: libbinio.a(kinds.f90.o) cputype (16777228) does not match previous archive members cputype (16777223) (all members must match)
make[2]: *** [fem/src/binio/libbinio.a] Error 1
make[2]: *** Deleting file `fem/src/binio/libbinio.a'
make[1]: *** [fem/src/binio/CMakeFiles/binio.dir/all] Error 2
make: *** [all] Error 2
macbookpro:build philipturner$ 

I've had enough of this dark magic. Elmer wasn't designed for macOS, so it flat-out doesn't work on modern macOS versions. Software systems are giant and complex; if you don't design for a particular use case from the beginning, it won't be supported. I'm used to software that works cross-platform without issues. I'm fed up with another repository of GNU/Fortran code that compiles on Mac but not Windows.

I'm going to install FreeCAD on Windows and get Elmer working there, to avoid this issue and move on. Hopefully the macOS compilation issues will be fixed by the time my project is done. While I can proceed with my project, the average person suffers from not having access to the software required.

philipturner · Jul 15 '25