Install Request: GROMACS 2023 (latest update) including CPU, GPU and Plumed versions [IN05756873] [IN05807314] [IN05844418]

Open · balston opened this issue 2 years ago · 8 comments

Install the latest 2022 release of GROMACS initially on Kathleen.

https://manual.gromacs.org/documentation/2022/download.html

May need to wait for Spack to be available centrally to install.

balston avatar Jan 16 '23 11:01 balston

Updated to install GROMACS 2023 (released on 6th February):

https://manual.gromacs.org/current/download.html

balston avatar Feb 07 '23 14:02 balston

A GPU + Plumed version has also been requested for Young.

balston avatar Feb 28 '23 11:02 balston

Comment from user about the GPU + Plumed version:

Also, just another note on the plumed install that you may not be aware of: I plan on using a specific plumed module (OPES) that is not installed by default, so when installing it would be really helpful if '--enable-modules=all' could be added to the './configure' command so that OPES is available for me to use. More information on this is at this link:

https://www.plumed.org/doc-v2.7/user-doc/html/_o_p_e_s.html

On the site the command says to include it as '--enable-modules=opes', but I have found when installing plumed that using 'all' saves you from needing separate builds for specific modules.
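For reference, a user-space PLUMED build along those lines might look roughly like the following (the install prefix and -j value are illustrative, and the MPI flag assumes an MPI compiler environment is already loaded):

# hypothetical example: build PLUMED from source with all optional modules enabled,
# so that OPES (and anything else not built by default) is included
./configure --prefix=$HOME/apps/plumed --enable-modules=all --enable-mpi
make -j 4
make install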

balston avatar Mar 02 '23 09:03 balston

IN:05940347 Another request for GPU GROMACS + plumed, this time on Myriad.

IN:06032490 CPU GROMACS on Kathleen.

heatherkellyucl avatar Apr 24 '23 12:04 heatherkellyucl

If anyone else needs a GROMACS + GPU + plumed build urgently, I suggested the following and it worked:

We're currently replacing our buildscripts system with Spack, so we'd prefer to do the new installs once that is up and running, but it may take some time before it is ready.

In the meantime, you could try building it in your own space.

To start with, load these modules:

module unload -f compilers mpi gcc-libs
module load beta-modules
module load gcc-libs/10.2.0
module load compilers/gnu/10.2.0
module load cuda/11.3.1/gnu-10.2.0
module load numactl/2.0.12
module load binutils/2.36.1/gnu-10.2.0
module load ucx/1.9.0/gnu-10.2.0
module load mpi/openmpi/4.0.5/gnu-10.2.0
module load python3/3.9-gnu-10.2.0
module load libmatheval
module load flex
module load plumed/2.7.2/gnu-10.2.0
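
Once those are loaded, it is worth quickly checking that plumed is being picked up from the module before going further (an optional sanity check):

# confirm plumed is on the PATH and report which version you have
which plumed
plumed info --version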

Here you can see the buildscripts for all the ways we currently build GROMACS: https://github.com/UCL-RITS/rcps-buildscripts/search?o=desc&q=gromacs&s=indexed. You'll need to combine gromacs-2021.3-plumed_install and gromacs-2021.5_install_gpu to get both plumed and GPU support.

You should wget the version of the source code that you want. I will use 2021.5 in this example because it is the most recent one we currently have on the cluster. Then untar it:

wget ftp://ftp.gromacs.org/gromacs/gromacs-2021.5.tar.gz
tar -xvf gromacs-2021.5.tar.gz
cd gromacs-2021.5

# Before building anything, you need to patch the gromacs source code using plumed:
plumed patch -p -e gromacs-2021.5 --shared
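
# If the patch step complains that this GROMACS version is not supported,
# you can list the versions your plumed build knows how to patch:
plumed patch -l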

# now make a build directory in there and we are ready to build
mkdir build
cd build

# Only build the MPI versions of GROMACS with PLUMED
# Replace $install_prefix below with the final install location you want it to use

# full gmx cuda mpi build for testing
echo "Building and installing full, MPI GPU, single precision GROMACS"
cmake .. \
    -DGMX_GPU=CUDA \
    -DCUDA_TOOLKIT_ROOT_DIR=$CUDA_PATH \
    -DGMX_CUDA_TARGET_SM="60;61;62;70;72;80;86" \
    -DGMX_MPI=ON \
    -DGMX_BUILD_OWN_FFTW=ON \
    -DGMX_X11=ON \
    -DCMAKE_INSTALL_PREFIX=$install_prefix \
    -DGMX_DEFAULT_SUFFIX=OFF \
    -DGMX_BINARY_SUFFIX=_mpi_cuda \
    -DGMX_LIBS_SUFFIX=_mpi_cuda \
    $CMAKE_FLAGS
make
make install
rm -rf *

echo "Building and installing mdrun-only, MPI GPU, single precision GROMACS"
cmake .. \
    -DGMX_GPU=CUDA \
    -DCUDA_TOOLKIT_ROOT_DIR=$CUDA_PATH \
    -DGMX_CUDA_TARGET_SM="60;61;62;70;72;80;86" \
    -DGMX_MPI=ON \
    -DGMX_BUILD_MDRUN_ONLY=ON \
    -DGMX_FFT_LIBRARY=fftw3 \
    -DGMX_BUILD_OWN_FFTW=ON \
    -DGMX_X11=ON \
    -DCMAKE_INSTALL_PREFIX=$install_prefix \
    -DGMX_DEFAULT_SUFFIX=OFF \
    -DGMX_BINARY_SUFFIX=_mpi_cuda \
    -DGMX_LIBS_SUFFIX=_mpi_cuda \
    $CMAKE_FLAGS
make
make install
rm -rf *

You should end up with the executables gmx_cuda, gmx_mpi_cuda and mdrun_mpi_cuda in the bin directory under wherever you set $install_prefix to be. This may all work with newer GROMACS versions too, but they might have changed some options.
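As a rough usage sketch (file names and process counts here are illustrative, not from our buildscripts): the install also puts a GMXRC script in that bin directory which sets up the environment, and the plumed-patched mdrun gains a -plumed option for passing in your PLUMED input file:

# set up the GROMACS environment for this install
source $install_prefix/bin/GMXRC
gmx_mpi_cuda --version

# example MPI run with a PLUMED input file
mpirun -np 4 gmx_mpi_cuda mdrun -deffnm md -plumed plumed.dat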

heatherkellyucl avatar Apr 25 '23 09:04 heatherkellyucl

Note: plumed is currently not quite ready for GROMACS 2023.1 (https://github.com/plumed/plumed2/issues/960). Spack doesn't currently try to use plumed 2.9; it provides GROMACS 2022.5 with plumed 2.8.2, and GROMACS 2023.1 with no plumed.
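For reference, once central Spack is in place the corresponding specs would look something like the following (illustrative only; which variants exist depends on the Spack version in use):

# illustrative Spack specs matching the pairings above
spack install gromacs@2022.5 +mpi +cuda +plumed
spack install gromacs@2023.1 +mpi +cuda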

heatherkellyucl avatar Jul 20 '23 15:07 heatherkellyucl

IN:06479556 GROMACS 2023 on Young

kaibinary avatar Feb 15 '24 15:02 kaibinary

IN:06721906 GROMACS 2024.2

Request #1252505: GROMACS 2024.2 with the +cp2k +double variants required.
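A hedged sketch of what that would look like as a Spack spec, assuming the gromacs package in the deployed Spack version exposes the cp2k and double variants:

spack install gromacs@2024.2 +cp2k +double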

kaibinary avatar Aug 12 '24 10:08 kaibinary