rcps-buildscripts
Install Request: Upgrade R to 4.2.0 and Bioconductor to 3.15 [IN05264358]
The latest R version currently on the clusters (R 4.1.1) cannot install the nlmixr package (Nonlinear Mixed Effects Models in Population PK/PD) because it depends on at least one other package (symengine) that requires R 4.2 or later. This kind of dependency problem will become increasingly common with older R versions.
- R 4.2.0 was released on 22nd April 2022 - https://cran.r-project.org/
- Bioconductor 3.15 was released on 27th April 2022 - http://bioconductor.org/
Note: This is the R version we should have as the default on RStudio from September 2022.
Linked to Jira issue:
https://ucldata.atlassian.net/browse/RCE-877
Building base R on Myriad first.
Build script updated and uploaded to Myriad.
Building base R and recommended packages using:
cd /shared/ucl/apps/build_scripts
module unload -f compilers mpi gcc-libs
module load beta-modules
module load gcc-libs/10.2.0
module load compilers/gnu/10.2.0
module load openblas/0.3.13-serial/gnu-10.2.0
module load java/1.8.0_92
module load fftw/3.3.9/gnu-10.2.0
module load ghostscript/9.19/gnu-4.9.2
module load texinfo/6.6/gnu-4.9.2
module load texlive/2019
module load gsl/2.7/gnu-10.2.0
module load hdf/5-1.10.6/gnu-10.2.0
module load udunits/2.2.28/gnu-10.2.0
module load netcdf/4.8.1/gnu-10.2.0
module load pcre2/10.37/gnu-10.2.0
./R-4.2.0_install 2>&1 | tee ~/Software/R/R-4.2.0_install.log-25052022
Build has completed. Looks OK so will make a module ready for the next stage of the build.
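Before making the module, a quick smoke test of the new build can be run directly against the install prefix (a sketch; the prefix is the one that appears in the package build logs further down):
/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/bin/R --version
/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/bin/Rscript -e 'sessionInfo()'   # check it reports the OpenBLAS BLAS/LAPACK libraries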
Building additional packages using:
./R-4.2.0_packages_install 2>&1 | tee ~/Software/R/R-4.2.0_packages_install.log-25052022
Note: this will probably need running more than once.
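Between runs it is worth pulling the failures out of the log so only genuinely broken packages get investigated; something along these lines works (illustrative, using the log file named above):
grep -E '^ERROR' ~/Software/R/R-4.2.0_packages_install.log-25052022     # packages that failed to install
grep -c '^\* DONE' ~/Software/R/R-4.2.0_packages_install.log-25052022   # count of packages that completed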
Myriad is now running the extra R packages build after a couple of minor issues: I had forgotten that Myriad needs extra modules for OpenMPI 4.0.5, and I had to remove the old adapt package because it no longer compiles (it is no longer on CRAN anyway).
The build has failed while attempting to compile another old package that is no longer on CRAN (Design). I will remove it and rerun tomorrow.
Additional R and Bioconductor packages build completed. Now checking the log for problems.
Possible problems:
readelf: Error: /lustre/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/library/00LOCK-RcppParallel/00new/RcppParallel/lib/libtbb.so: Failed to read file header
readelf: Error: /lustre/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/library/00LOCK-RcppParallel/00new/RcppParallel/lib/libtbbmalloc.so: Failed to read file header
readelf: Error: /lustre/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/library/00LOCK-RcppParallel/00new/RcppParallel/lib/libtbbmalloc_proxy.so: Failed to read file header
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* DONE (RcppParallel)
Install RcppParallel again to check.
* installing *source* package 'protolite' ...
** package 'protolite' successfully unpacked and MD5 sums checked
** using staged installation
Package protobuf was not found in the pkg-config search path.
Perhaps you should add the directory containing `protobuf.pc'
to the PKG_CONFIG_PATH environment variable
No package 'protobuf' found
Using PKG_CFLAGS=
Using PKG_LIBS=-lprotobuf
<stdin>:1:10: fatal error: google/protobuf/message.h: No such file or directory
compilation terminated.
------------------------- ANTICONF ERROR ---------------------------
Configuration failed because protobuf was not found. Try installing:
* deb: libprotobuf-dev (Debian, Ubuntu, etc)
* rpm: protobuf-devel (Fedora, EPEL)
* csw: protobuf_dev (Solaris)
* brew: protobuf (OSX)
If protobuf is already installed, check that 'pkg-config' is in your
PATH and PKG_CONFIG_PATH contains a protobuf.pc file. If pkg-config
is unavailable you can set INCLUDE_DIR and LIB_DIR manually via:
R CMD INSTALL --configure-vars='INCLUDE_DIR=... LIB_DIR=...'
--------------------------------------------------------------------
ERROR: configuration failed for package 'protolite'
This one needs fixing because:
ERROR: dependency 'protolite' is not available for package 'geojson'
* removing '/lustre/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/library/geojson'
I've reinstalled RcppParallel using:
./R-4.2.0_single_package_install RcppParallel
and it has installed correctly as far as I can tell.
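A quick load test confirms it (same install prefix as the base build; defaultNumThreads() is RcppParallel's own thread-count query):
/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/bin/Rscript -e 'library(RcppParallel); defaultNumThreads()'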
The protobuf module (protobuf/3.17.3/gnu-10.2.0) wasn't installed on Myriad. Installing it with:
module -f unload compilers mpi gcc-libs
module load beta-modules
module load gcc-libs/10.2.0
module load compilers/gnu/10.2.0
module load python3/3.9-gnu-10.2.0
./protobuf-3.17.3_install
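Once that module exists, the key thing for protolite is that pkg-config can see protobuf.pc before the failed R packages are rebuilt; roughly (the pkgconfig path below is an assumption based on the usual /shared/ucl/apps layout):
module load protobuf/3.17.3/gnu-10.2.0
pkg-config --modversion protobuf    # should now report 3.17.3
# If it doesn't, point PKG_CONFIG_PATH at the module's pkgconfig directory by hand, e.g.:
# export PKG_CONFIG_PATH=/shared/ucl/apps/protobuf/3.17.3/gnu-10.2.0/lib/pkgconfig:$PKG_CONFIG_PATH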
Reinstalled geojsonio using:
./R-4.2.0_single_package_install geojsonio
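A quick load test confirms the protolite -> geojson -> geojsonio chain is now usable (a sketch, same R prefix as above):
/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/bin/Rscript -e 'library(protolite); library(geojson); library(geojsonio)'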
Installing R MPI support packages:
./R-4.2.0_MPI_install
Module bundle updated and uploaded to Myriad.
Now ready to run some test jobs.
Basic R tests work, and a parallel run using the doMPI package works (a sketch of this kind of test follows the module list below).
Modules required are:
module -f unload compilers mpi gcc-libs
module load beta-modules
module load r/r-4.2.0_bc-3.15
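For reference, the doMPI test is along these lines (a sketch of a 2 node SGE jobscript, not the exact job used; resource requests are placeholders and gerun is the UCL wrapper around mpirun):
#!/bin/bash -l
#$ -l h_rt=0:30:0
#$ -pe mpi 80
#$ -cwd
module -f unload compilers mpi gcc-libs
module load beta-modules
module load r/r-4.2.0_bc-3.15
cat > doMPI_test.R <<'EOF'
library(doMPI)
cl <- startMPIcluster()          # one worker per MPI rank, master on rank 0
registerDoMPI(cl)
print(foreach(i = 1:8, .combine = c) %dopar% sqrt(i))
closeCluster(cl)
mpi.quit()
EOF
gerun Rscript doMPI_test.R       # gerun expands to mpirun across the allocated nodes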
Ready to deploy on the other clusters.
Deploying on Kathleen next. There is a Jira story for this:
https://ucldata.atlassian.net/browse/RCE-946
Basic R plus recommended packages built on Kathleen.
Job submitted on Kathleen as ccspapp to build the additional R packages and install Bioconductor:
job-ID prior name user state submit/start at queue slots ja-task-ID
-----------------------------------------------------------------------------------------------------------------
308885 2.70175 R-4.2.0-pa ccspapp qw 06/07/2022 12:37:46 80
Additional R packages build job has started running.
Additional R packages build job has finished - took 4 hours. Checking the output for errors now ...
Possible problems:
(tbb) Building TBB using bundled sources ...
make[1]: Entering directory `/lustre/scratch/ccspapp/R/work/RtmpyZ8lgc/R.INSTALL8751d16f4f4/RcppParallel/src/tbb/src'
OS: linux
arch=intel64
compiler=gcc
runtime=cc10.2.0_libc2.17_kernel3.10.0
tbb_build_prefix=linux_intel64_gcc_cc10.2.0_libc2.17_kernel3.10.0
work_dir=/lustre/scratch/ccspapp/R/work/RtmpyZ8lgc/R.INSTALL8751d16f4f4/RcppParallel/src/build/linux_intel64_gcc_cc10.2.0_libc2.17_kernel3.10.0_release
make[1]: Leaving directory `/lustre/scratch/ccspapp/R/work/RtmpyZ8lgc/R.INSTALL8751d16f4f4/RcppParallel/src/tbb/src'
(tbb) TBB compilation finished successfully.
g++ -std=gnu++14 -I"/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/include" -DNDEBUG -I../inst/include -I/usr/local/include -std=gnu++11 -DRCPP_PARALLEL_USE_TBB=1 -DTBB_SUPPRESS_DEPRECATED_MESSAGES=1 -fpic -g -O2 -c init.cpp -o init.o
g++ -std=gnu++14 -I"/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/include" -DNDEBUG -I../inst/include -I/usr/local/include -std=gnu++11 -DRCPP_PARALLEL_USE_TBB=1 -DTBB_SUPPRESS_DEPRECATED_MESSAGES=1 -fpic -g -O2 -c options.cpp -o options.o
g++ -std=gnu++14 -shared -L/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/lib -L/usr/local/lib64 -o RcppParallel.so init.o options.o -L/shared/ucl/apps/curl/7.47.1/gnu-4.9.2/lib -L/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/lib -lR
installing via 'install.libs.R' to /lustre/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/library/00LOCK-RcppParallel/00new/RcppParallel
** R
** inst
** byte-compile and prepare package for lazy loading
** help
*** installing help indices
** building package indices
** testing if installed package can be loaded from temporary location
** checking absolute paths in shared objects and dynamic libraries
readelf: /lustre/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/library/00LOCK-RcppParallel/00new/RcppParallel/lib/libtbb.so: Error: /lustre/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/library/00LOCK-RcppParallel/00new/RcppParallel/lib/libtbb.so: Failed to read file header
readelf: /lustre/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/library/00LOCK-RcppParallel/00new/RcppParallel/lib/libtbbmalloc.so: Error: /lustre/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/library/00LOCK-RcppParallel/00new/RcppParallel/lib/libtbbmalloc.so: Failed to read file header
readelf: /lustre/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/library/00LOCK-RcppParallel/00new/RcppParallel/lib/libtbbmalloc_proxy.so: Error: /lustre/shared/ucl/apps/R/R-4.2.0-OpenBLAS/lib64/R/library/00LOCK-RcppParallel/00new/RcppParallel/lib/libtbbmalloc_proxy.so: Failed to read file header
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* DONE (RcppParallel)
However, RcppParallel appears to have installed correctly. I may do a single-package install again to confirm.
Everything seems to be OK. Installing R MPI support packages:
module -f unload compilers mpi gcc-libs
module load beta-modules
cd /shared/ucl/apps/build_scripts/
./R-4.2.0_MPI_install
Done. Now to run some tests.
I've submitted a 2 node test job using the doMPI R package:
job-ID prior name user state submit/start at queue slots ja-task-ID
-----------------------------------------------------------------------------------------------------------------
309153 2.67273 doMPI-Ex1- ccaabaa qw 06/08/2022 18:31:42 80
Test job failed because it couldn't load the Rmpi package. I need to check why this didn't install correctly in the previous step.
I've found the problem. The installer couldn't run a test MPI script, so it didn't install Rmpi. I'm now working out how to fix it.
I've now got Rmpi to install by setting OpenMPI parameters for single node MPI.
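The exact parameters aren't reproduced here, but the general idea is to restrict OpenMPI to transports that work on a single build node while the package's install-time MPI test runs, roughly (illustrative settings, not necessarily the ones used):
export OMPI_MCA_btl=self,vader                   # loopback + shared-memory transports only
export OMPI_MCA_rmaps_base_oversubscribe=1       # allow more ranks than allocated slots
./R-4.2.0_MPI_install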
I will now re-submit the 2 node test job using the doMPI R package.
The doMPI test job has worked.
I'm now trying a snow test job, again over 2 nodes.
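The test follows the usual RMPISNOW pattern, roughly (a sketch, not the exact script used):
cat > snow_test.R <<'EOF'
library(snow)
cl <- getMPIcluster()            # cluster set up by the RMPISNOW wrapper
print(clusterCall(cl, function() Sys.info()[["nodename"]]))
stopCluster(cl)
EOF
gerun RMPISNOW < snow_test.R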
The snow test job has failed:
GERun: mpirun RMPISNOW
Loading required package: utils
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
Attaching package: 'rlecuyer'
The following object is masked from 'package:Rmpi':
.onUnload
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
Process name: [[39576,1],46]
Exit code: 1
--------------------------------------------------------------------------
Investigating ...
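A first step will be to check that the packages involved load cleanly outside MPI (a sketch):
module -f unload compilers mpi gcc-libs
module load beta-modules
module load r/r-4.2.0_bc-3.15
Rscript -e 'library(Rmpi); library(rlecuyer); library(snow)'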