Install Request: Intel OneAPI Compilers, MPI and tools [IN05406840] [IN05408412]
User needs Intel compilers and tools newer than 2021 on Young, plus a version of HDF5 to use with them, e.g. intel-oneapi-compilers-2022.0.1, intel-oneapi-mpi-2021.5.0 etc.
https://www.intel.com/content/www/us/en/developer/tools/oneapi/toolkits.html#gs.66tbji
There was also a request (IN05408412) to install on Kathleen.
Need to install the oneAPI Base Toolkit and then the oneAPI HPC Toolkit.
https://www.intel.com/content/www/us/en/develop/documentation/installation-guide-for-intel-oneapi-toolkits-hpc-cluster/top/introduction.html
Trying a test install in /home/ccspapp/Scratch/intel-2022.2 on Young.
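For the record, the silent command-line install looks roughly like this (a sketch: the flags follow Intel's documented offline-installer options, and the installer filenames are taken from the log paths below):

```bash
# Sketch: silent CLI install of the Base Kit into the test location.
sh ./l_BaseKit_p_2022.2.0.262.sh -a --silent --eula accept \
    --install-dir /home/ccspapp/Scratch/intel-2022.2/install
# The HPC Kit (l_HPCKit_p_2022.2.0.191.sh) installs the same way.
```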
Base part finished:

```
Installed Location: /home/ccspapp/Scratch/intel-2022.2/install
Installation has successfully completed
Log file: /home/ccspapp/Scratch/intel-2022.2/install/logs/installer.install.intel.oneapi.lin.basekit.product,v=2022.2.0-262.2022.08.16.16.02.39.000566.log
Remove extracted files: /home/ccspapp/Scratch/intel-2022.2/l_BaseKit_p_2022.2.0.262...
```
HPC part also done:

```
Installed Location: /home/ccspapp/Scratch/intel-2022.2/install
Installation has successfully completed
Log file: /home/ccspapp/Scratch/intel-2022.2/install/logs/installer.install.intel.oneapi.lin.hpckit.product,v=2022.2.0-191.2022.08.16.17.04.58.168355.log
Remove extracted files: /home/ccspapp/Scratch/intel-2022.2/l_HPCKit_p_2022.2.0.191...
```
Need to check whether these still need the compiler interfaces built separately, test it, and sort out what the modules should look like. (The components do all have modulefiles, and there is a modulefiles-setup.sh script which gathers those into one directory, but we have additional bits and pieces, so it is probably best to pull the relevant environment variables out of those and create our own.)
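That inspection step would look something like this (a sketch, assuming modulefiles-setup.sh's --output-dir option; the modulefiles directory name is my choice):

```bash
# Sketch: gather Intel's per-component modulefiles into one directory,
# then dump what one of them sets so we can copy the variables into ours.
cd /home/ccspapp/Scratch/intel-2022.2/install
./modulefiles-setup.sh --output-dir=/home/ccspapp/Scratch/intel-2022.2/modulefiles
module use /home/ccspapp/Scratch/intel-2022.2/modulefiles
module show compiler    # lists the environment variables Intel's modulefile sets
```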
Intel version numbers are as ever confusing: the 2022.2 install scripts install compiler/2022.1.0 and mpi/2021.6.0 directories.
I think it has built all the interfaces - we've got libmkl_blacs_*, libmkl_blas95_*, libmkl_cdft_core and libmkl_lapack95_* libraries.
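A quick way to confirm that (the path is an assumption based on the standard oneAPI MKL layout under the install directory):

```bash
# Sketch: list the MKL interface libraries in the test install.
ls /home/ccspapp/Scratch/intel-2022.2/install/mkl/latest/lib/intel64 \
    | grep -E 'blacs|blas95|lapack95|cdft'
```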
Leaving dnnl, the Deep Neural Network Library, out of the main compiler module vars because it has four conflicting possible modules to choose from:

- dnnl
- dnnl-cpu-gomp
- dnnl-cpu-iomp
- dnnl-cpu-tbb
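Users who do want oneDNN could pick a variant themselves from Intel's generated modulefiles; a sketch, assuming the modulefiles directory created above:

```bash
# Sketch: load one oneDNN variant directly from Intel's own modulefiles.
module use /home/ccspapp/Scratch/intel-2022.2/modulefiles
module load dnnl-cpu-iomp    # or dnnl, dnnl-cpu-gomp, dnnl-cpu-tbb
```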
Updated the modulemaker buildscript lines for this version; there should now be test modules that can be used.
- [x] test a compilation
- [x] test an mpi compilation
Not entirely sure whether I should leave gcc-libs/10.2.0 as the prerequisite for the compiler module - it needs at least gcc 7.3.0, and some compilation options require gcc >= 8.1. We could emit a warning on module load if gcc-libs 4.9.2 is loaded instead.
ETA: we decided for now to just keep the gcc-libs/10.2.0 requirement, to reduce problems for users. The new software stack should replace it (and will probably be gcc-11).
I have tested pi_examples! c_pi, fortran_pi, c_mpi_pi and fortran_mpi_pi all work, hooray (within one node).
The test install and modules are in:
/home/ccspapp/Scratch/intel-2022.2/
The pi builds are in:
/home/ccspapp/Scratch/pi_builds/intel-2022
To test:

```
module purge
module load beta-modules
module load gcc-libs/10.2.0
module load gerun
module use /home/ccspapp/Scratch/intel-2022.2/compilers
module use /home/ccspapp/Scratch/intel-2022.2/mpi
module load compilers/intel/2022.2
module load mpi/intel/2021.6.0/intel
```
Then `cd` into the relevant pi subdirectory and run `make` (or `make intel`, for fortran_pi only).
Will submit a two-node job with the MPI versions.
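Roughly this sort of job script (a sketch: Grid Engine directives as used on Young; 80 slots assumes two 40-core nodes, and the wallclock is illustrative, not what was actually submitted):

```bash
#!/bin/bash -l
# Sketch: two-node MPI test job for the pi examples.
#$ -pe mpi 80
#$ -l h_rt=0:15:0
#$ -cwd
module purge
module load beta-modules
module load gcc-libs/10.2.0
module load gerun
module use /home/ccspapp/Scratch/intel-2022.2/compilers
module use /home/ccspapp/Scratch/intel-2022.2/mpi
module load compilers/intel/2022.2
module load mpi/intel/2021.6.0/intel
gerun ./c_mpi_pi    # gerun is the site mpirun wrapper loaded above
```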
Two-node c_mpi_pi worked on Young!
- [x] Young install
- [x] Kathleen install
- [x] Thomas install
- [x] Myriad install
- [x] modulefiles available
The other installs are in progress.
To use:

```
module unload -f compilers mpi gcc-libs
module load beta-modules
module load gcc-libs/10.2.0
module load compilers/intel/2022.2
module load mpi/intel/2021.6.0/intel
```
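A quick sanity check after loading (compiler driver names per oneAPI 2022, where the classic and LLVM-based compilers coexist):

```bash
# Sketch: confirm the compilers and MPI wrappers resolve correctly.
icc --version       # classic C compiler (icx is the LLVM-based one)
ifort --version     # classic Fortran compiler (ifx is the LLVM-based one)
mpiicc -show        # prints the underlying compile line the MPI wrapper uses
```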
Checking on the version of HDF5 needed; the current newest is 1.12.2 (serial or MPI version, and they changed the API in 1.12).
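Worth noting for users: code written against the 1.10 API can generally still be built against 1.12 via HDF5's documented API compatibility macro, e.g.:

```bash
# Example: build a 1.10-API program against HDF5 1.12 using the library's
# compatibility macro (myprog.c is a placeholder name).
h5cc -DH5_USE_110_API -o myprog myprog.c
```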
Ticket IN05406840 (which included HDF5 as a lower priority) was auto-closed; it was cloned and replied to in IN05492776.
HDF5 1.12.2 should be fine. Probably the serial version, as the request doesn't mention MPI, but doing both should be straightforward (they are separate modules).
I think this was all done; hdf/5-1.12.3-impi/intel-2022 exists.