qcmaquis
Make cannot complete
Hi,
I tried to follow https://github.com/qcscine/qcmaquis
cmake -DLINALG=MKL -DBLAS_LAPACK_SELECTOR=mkl ../
then
make
I got:
1071 | Call GA_Brdcst(MT_DBL, [res], storage_size(res)/8, 0)
| 1
Error: Symbol ‘mt_dbl’ at (1) has no IMPLICIT type
/mnt/home/user/Research/qcmaquis/dmrg/lib/interfaces/openmolcas/qcmaquis_interface.f90:792:27:
792 | call GA_Brdcst(MT_DBL, [dmrg_energy%dmrg], storage_size(dmrg_energy%num_sweeps)/8, 0)
| 1
Error: Symbol ‘mt_dbl’ at (1) has no IMPLICIT type
/mnt/home/user/Research/qcmaquis/dmrg/lib/interfaces/openmolcas/qcmaquis_interface.f90:797:27:
797 | call GA_Brdcst(MT_INT, dmrg_energy%num_sweeps, size(dmrg_energy%num_sweeps)*storage_size(dmrg_energy%num_sweeps)/8, 0)
| 1
Error: Symbol ‘mt_int’ at (1) has no IMPLICIT type
/mnt/home/user/Research/qcmaquis/dmrg/lib/interfaces/openmolcas/qcmaquis_interface.f90:680:30:
680 | call GA_Brdcst(MT_BYTE, fiedler_order_str, len_fiedler_str, 0)
| 1
Error: Symbol ‘mt_byte’ at (1) has no IMPLICIT type
/mnt/home/user/Research/qcmaquis/dmrg/lib/interfaces/openmolcas/qcmaquis_interface.f90:679:29:
679 | call GA_Brdcst(MT_INT, [len_fiedler_str], storage_size(len_fiedler_str)/8,0)
| 1
Error: Symbol ‘mt_int’ at (1) has no IMPLICIT type
make[2]: *** [lib/interfaces/openmolcas/CMakeFiles/qcmaquis-driver.dir/qcmaquis_interface.f90.o] Error 1
make[1]: *** [lib/interfaces/openmolcas/CMakeFiles/qcmaquis-driver.dir/all] Error 2
make: *** [all] Error 2
How to proceed? Thank you!
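For context, MT_DBL, MT_INT and MT_BYTE are data-type constants of the Global Arrays / MA library, which the GA_Brdcst calls in qcmaquis_interface.f90 rely on; "has no IMPLICIT type" therefore suggests that the Global Arrays Fortran headers are not visible to this part of the build. A rough check, assuming a Global Arrays installation is present (the install path below is a placeholder):
# mafdecls.fh is the MA header that normally defines MT_DBL / MT_INT / MT_BYTE
find /path/to/ga/install -name 'mafdecls.fh'
# see how the interface source expects to pick up these constants
grep -n 'mafdecls\|global\.fh' dmrg/lib/interfaces/openmolcas/qcmaquis_interface.f90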
Hi,
I tried to proceed,
(i) After loading the modules GSL 2.6, Boost 1.74.0, HDF5 1.12.0, CMake 3.18.4, and GCC 9.3.0, if I run
cmake -DLINALG=MKL ../
then
make
I got:
[100%] Built target dmrg
make[1]: Leaving directory `/mnt/ufs18/home-179/....../Research/qcmaquis-new-3/qcmaquis/dmrg/build'
/mnt/ufs18/home-179/....../Research/cmake/cmake-3.24.0-linux-x86_64/bin/cmake -E cmake_progress_start /mnt/home/....../Research/qcmaquis-new-3/qcmaquis/dmrg/build/CMakeFiles 0
looks like it works!
(ii)
If I would like to just use QCMaquis within OpenMolcas, can I directly follow https://gitlab.com/Molcas/OpenMolcas and only modify the cmake step (step 3) with -DDMRG=ON -DNEVPT2=ON, without downloading QCMaquis? Or do I need to download and compile both?
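For reference, a minimal sketch of the OpenMolcas-driven route (the exact steps are an assumption based on the standard OpenMolcas CMake workflow; the logs later in this thread show CMake building QCMaquis itself under External/qcmaquis when -DDMRG=ON is given, so a separate QCMaquis checkout should not be needed):
git clone https://gitlab.com/Molcas/OpenMolcas.git
mkdir build && cd build
cmake -DDMRG=ON -DNEVPT2=ON ../OpenMolcas
make -j4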
Running
cmake -DDMRG=ON -DNEVPT2=ON ../OpenMolcas
and
make
in OpenMolcas, I got:
CMake Warning at /mnt/home/.../Research/cmake/cmake-3.24.0-linux-x86_64/share/cmake-3.24/Modules/FindHDF5.cmake:745 (message):
HDF5 found for language C is not parallel but previously found language is
parallel.
Call Stack (most recent call first):
CMakeLists.txt:221 (find_package)
CMake Warning at alps/CMakeLists.txt:49 (MESSAGE):
parallel(MPI) hdf5 is detected. We will compile but ALPS does not use
parallel HDF5. The standard version is preferred.
CMake Error at alps/CMakeLists.txt:51 (MESSAGE):
parallel(MPI) hdf5 needs MPI. Enable MPI or install serial HDF5 libraries.
but I have loaded HDF5/1.12.0 and OpenMPI/4.0.3.
I would not recommend using MPI in OpenMolcas when compiling with QCMaquis. That is, do NOT load OpenMPI/4.0.3.
Thank you so much for the suggestions. I am not sure if the HDF5 installed on the cluster is the parallel version; perhaps I need to contact the administrator to check. Most HDF5 versions on the cluster require loading OpenMPI and GCC. HDF5/1.10.2 only needs PGI/18.4-GCC-6.4.0-2.28, and it leads to the same:
-- HDF5: Using hdf5 compiler wrapper to determine C configuration
-- Found HDF5: /opt/software/HDF5/1.12.0-iimpi-2020a/lib/libhdf5.so;/opt/software/Szip/2.1.1-GCCcore-9.3.0/lib/libsz.so;/opt/software/zlib/1.2.11-GCCcore-9.3.0/lib/libz.so;/usr/lib64/libdl.so;/usr/lib64/libm.so (found version "1.12.0")
-- Found Boost: /opt/software/Boost/1.67.0/include (found suitable version "1.67.0", minimum required is "1.56") found components: program_options filesystem system serialization thread chrono date_time atomic
CMake Warning at alps/CMakeLists.txt:49 (MESSAGE):
parallel(MPI) hdf5 is detected. We will compile but ALPS does not use
parallel HDF5. The standard version is preferred.
CMake Error at alps/CMakeLists.txt:51 (MESSAGE):
parallel(MPI) hdf5 needs MPI. Enable MPI or install serial HDF5 libraries.
-- Configuring incomplete, errors occurred!
Or do I need to compile and install HDF5 myself?
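One way to check whether a given HDF5 module is a parallel build is to ask its compiler wrapper for the build settings (a sketch; the module name is just the one mentioned above):
module load HDF5/1.12.0
h5cc -showconfig | grep -i "parallel hdf5"   # use h5pcc instead if only that wrapper is installed
# a parallel build reports "Parallel HDF5: yes"; a serial build reports "no"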
I would suggest trying to compile the HDF5 library locally with the MPI option disabled and linking against that version. I routinely use version 1.10.2 of HDF5, so the version itself should not be a problem.
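A minimal sketch of such a local serial build (version and prefix are placeholders; simply not passing --enable-parallel gives a serial library):
tar xzf hdf5-1.10.2.tar.gz && cd hdf5-1.10.2
./configure --prefix=$HOME/local/hdf5-serial --enable-fortran --enable-cxx   # extra language bindings enabled just in case
make -j4 && make install
# point the OpenMolcas configure step at this prefix later on
export HDF5_ROOT=$HOME/local/hdf5-serial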
I tried a few things, e.g., installing HDF5 locally (compiled without --enable-parallel, as in https://github.com/mokus0/hdf5/blob/master/release_docs/INSTALL) or using the binary version, using the serial Boost version, and not using OpenMPI, but I always got something like :(
CMake Warning at /opt/software/CMake/3.20.1-GCCcore-9.3.0/share/cmake-3.20/Modules/FindHDF5.cmake:742 (message):
HDF5 found for language C is not parallel but previously found language is
parallel.
Call Stack (most recent call first):
CMakeLists.txt:221 (find_package)
CMake Warning at alps/CMakeLists.txt:49 (MESSAGE):
parallel(MPI) hdf5 is detected. We will compile but ALPS does not use
parallel HDF5. The standard version is preferred.
CMake Error at alps/CMakeLists.txt:51 (MESSAGE):
parallel(MPI) hdf5 needs MPI. Enable MPI or install serial HDF5 libraries.
-- Configuring incomplete, errors occurred!
See also "/mnt/home/....../Research/openmolcas-2/build/External/qcmaquis/src/qcmaquis-build/CMakeFiles/CMakeOutput.log".
See also "/mnt/home/....../Research/openmolcas-2/build/External/qcmaquis/src/qcmaquis-build/CMakeFiles/CMakeError.log".
make[2]: *** [External/qcmaquis/src/qcmaquis-stamp/qcmaquis-configure] Error 1
make[1]: *** [External/qcmaquis/CMakeFiles/qcmaquis.dir/all] Error 2
Which version of the hdf5 library are you compiling? If you provide me with this info, I can try to find a compile configuration for HDF5 that works on my local machine.
In addition, did you verify that the environment variable HDF5_ROOT points to the HDF5 installation that you would like to use? It seems like cmake always picks up the parallel HDF5 installation you loaded on the cluster. Try
printenv
and check the value of HDF5_ROOT.
Note that you can also run the cmake step of OpenMolcas with
HDF5_ROOT=XXX cmake -DDMRG=ON -DNEVPT2=ON ../OpenMolcas
where HDF5_ROOT=XXX points to your customized HDF5 installation.
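A sketch of that, combined with a clean environment so the cluster modules cannot shadow the local installation (the HDF5 path is a placeholder):
module purge                      # drop the module-provided HDF5/MPI from the environment
printenv | grep -i hdf5           # should now only show your local paths
HDF5_ROOT=/path/to/local/hdf5 cmake -DDMRG=ON -DNEVPT2=ON ../OpenMolcas
make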
I tried a couple of ways:
(1) installing HDF5 1.12.0 locally,
(2) loading the serial HDF5 module on the cluster, HDF5/1.8.17-serial,
(3) installing the binary version of HDF5 1.14.0.
All of these lead to a similar error message in the make step, after cmake -DDMRG=ON -DNEVPT2=ON ../OpenMolcas.
In .bashrc, I set (user name replaced by ...):
export HDF5_ROOT=/mnt/home/.../Research/hdf5-1.12.0
export PATH=$HDF5_ROOT/bin:$PATH
export CPATH=$HDF5_ROOT/include:$CPATH
export LD_LIBRARY_PATH=$HDF5_ROOT/lib:$LD_LIBRARY_PATH
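A quick way to verify that these settings actually take precedence over the module-provided HDF5 (a sketch; h5cc is the compiler wrapper shipped with HDF5):
command -v h5cc                                             # should resolve to $HDF5_ROOT/bin/h5cc
$HDF5_ROOT/bin/h5cc -showconfig | grep -i "parallel hdf5"   # expect "no" for a serial build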
printenv
gives me
...
...
QTINC=/usr/lib64/qt-3.3/include
LMOD_VERSION=8.5.8.ICER
SSH_TTY=/dev/pts/93
QT_GRAPHICSSYSTEM_CHECKED=1
USER=...
LD_LIBRARY_PATH=/mnt/home/.../Research/hdf5-1.12.0/lib:/mnt/home/.../orca
LMOD_sys=Linux
LS_COLORS=rs=0:di=38;5;27:ln=38;5;51:mh=44;38;5;15:pi=40;38;5;11:so=38;5;13:do=38;5;5:bd=48;5;232;38;5;11:cd=48;5;232;38;5;3:or=48;5;232;38;5;9:mi=05;48;5;232;38;5;15:su=48;5;196;38;5;15:sg=48;5;11;38;5;16:ca=48;5;196;38;5;226:tw=48;5;10;38;5;16:ow=48;5;10;38;5;21:st=48;5;21;38;5;15:ex=38;5;34:*.tar=38;5;9:*.tgz=38;5;9:*.arc=38;5;9:*.arj=38;5;9:*.taz=38;5;9:*.lha=38;5;9:*.lz4=38;5;9:*.lzh=38;5;9:*.lzma=38;5;9:*.tlz=38;5;9:*.txz=38;5;9:*.tzo=38;5;9:*.t7z=38;5;9:*.zip=38;5;9:*.z=38;5;9:*.Z=38;5;9:*.dz=38;5;9:*.gz=38;5;9:*.lrz=38;5;9:*.lz=38;5;9:*.lzo=38;5;9:*.xz=38;5;9:*.bz2=38;5;9:*.bz=38;5;9:*.tbz=38;5;9:*.tbz2=38;5;9:*.tz=38;5;9:*.deb=38;5;9:*.rpm=38;5;9:*.jar=38;5;9:*.war=38;5;9:*.ear=38;5;9:*.sar=38;5;9:*.rar=38;5;9:*.alz=38;5;9:*.ace=38;5;9:*.zoo=38;5;9:*.cpio=38;5;9:*.7z=38;5;9:*.rz=38;5;9:*.cab=38;5;9:*.jpg=38;5;13:*.jpeg=38;5;13:*.gif=38;5;13:*.bmp=38;5;13:*.pbm=38;5;13:*.pgm=38;5;13:*.ppm=38;5;13:*.tga=38;5;13:*.xbm=38;5;13:*.xpm=38;5;13:*.tif=38;5;13:*.tiff=38;5;13:*.png=38;5;13:*.svg=38;5;13:*.svgz=38;5;13:*.mng=38;5;13:*.pcx=38;5;13:*.mov=38;5;13:*.mpg=38;5;13:*.mpeg=38;5;13:*.m2v=38;5;13:*.mkv=38;5;13:*.webm=38;5;13:*.ogm=38;5;13:*.mp4=38;5;13:*.m4v=38;5;13:*.mp4v=38;5;13:*.vob=38;5;13:*.qt=38;5;13:*.nuv=38;5;13:*.wmv=38;5;13:*.asf=38;5;13:*.rm=38;5;13:*.rmvb=38;5;13:*.flc=38;5;13:*.avi=38;5;13:*.fli=38;5;13:*.flv=38;5;13:*.gl=38;5;13:*.dl=38;5;13:*.xcf=38;5;13:*.xwd=38;5;13:*.yuv=38;5;13:*.cgm=38;5;13:*.emf=38;5;13:*.axv=38;5;13:*.anx=38;5;13:*.ogv=38;5;13:*.ogx=38;5;13:*.aac=38;5;45:*.au=38;5;45:*.flac=38;5;45:*.mid=38;5;45:*.midi=38;5;45:*.mka=38;5;45:*.mp3=38;5;45:*.mpc=38;5;45:*.ogg=38;5;45:*.ra=38;5;45:*.wav=38;5;45:*.axa=38;5;45:*.oga=38;5;45:*.spx=38;5;45:*.xspf=38;5;45:
OMPI_MCA_btl_openib_btls_per_lid=8
CPATH=/mnt/home/.../Research/hdf5-1.12.0/include
XTBPATH=/mnt/home/.../Research/xtb-6.5.1/share/xtb
_ModuleTable001_=X01vZHVsZVRhYmxlXz17WyJNVHZlcnNpb24iXT0zLFsiY19yZWJ1aWxkVGltZSJdPWZhbHNlLFsiY19zaG9ydFRpbWUiXT1mYWxzZSxkZXB0aFQ9e30sZmFtaWx5PXt9LG1UPXt9LG1wYXRoQT17Ii9vcHQvc29mdHdhcmUvaHBjYy9tb2R1bGVzIiwiL29wdC9tb2R1bGVzL0NvcmUiLH0sWyJzeXN0ZW1CYXNlTVBBVEgiXT0iL29wdC9tb2R1bGVzL0NvcmUiLH0=
MAIL=/var/spool/mail/...
PATH=/mnt/home/.../Research/hdf5-1.12.0/bin:/mnt/home/.../hdf/HDF5-1.14.0-Linux/HDF_Group/HDF5/1.14.0/bin:/mnt/home/.../orca:/usr/lib64/qt-3.3/bin:/opt/software/core/lua/lua/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/local/hpcc/bin:/usr/lpp/mmfs/bin:/opt/ibutils/bin:/opt/puppetlabs/bin:/mnt/home/.../Research/xtb-6.5.1/bin:/mnt/home/.../Research/cmake/cmake-3.24.0-linux-x86_64/bin
MKL_NUM_THREADS=1
PWD=/mnt/home/.../Research/openmolcas-2/build
HDF5_ROOT=/mnt/home/.../Research/hdf5-1.12.0
LUA_PATH=/opt/software/core/lua/lua/share/lua/5.1/?.lua;/opt/software/core/lua/lua/share/lua/5.1/?/init.lua;;
LANG=en_US.UTF-8
MODULEPATH=/opt/software/hpcc/modules:/opt/modules/Core
LUA_CPATH=/opt/software/core/lua/lua/lib/lua/5.1/?.so;;
_ModuleTable_Sz_=1
KDEDIRS=/usr
LMOD_CMD=/usr/local/lmod/lmod/libexec/lmod
HISTCONTROL=ignoredups
__LMOD_SET_FPATH=1
SHLVL=1
HOME=/mnt/home/...
__LMOD_REF_COUNT_PATH=/mnt/home/.../Research/hdf5-1.12.0/bin:1;/mnt/home/.../hdf/HDF5-1.14.0-Linux/HDF_Group/HDF5/1.14.0/bin:2;/mnt/home/.../orca:1;/usr/lib64/qt-3.3/bin:1;/opt/software/core/lua/lua/bin:1;/usr/local/bin:1;/usr/bin:1;/usr/local/sbin:1;/usr/sbin:1;/usr/local/hpcc/bin:2;/usr/lpp/mmfs/bin:1;/opt/ibutils/bin:1;/opt/puppetlabs/bin:1;/mnt/home/.../Research/xtb-6.5.1/bin:2;/mnt/home/.../Research/cmake/cmake-3.24.0-linux-x86_64/bin:2
__LMOD_REF_COUNT_CPATH=/mnt/home/.../Research/hdf5-1.12.0/include:1
_ModuleTable002_=cHQvbW9kdWxlcy9Db21waWxlci9HQ0Njb3JlLzYuNC4wL0NNYWtlLzMuMTEuMS5sdWEiLFsiZnVsbE5hbWUiXT0iQ01ha2UvMy4xMS4xIixbImxvYWRPcmRlciJdPS0yMSxwcm9wVD17fSxbInN0YWNrRGVwdGgiXT0wLFsic3RhdHVzIl09ImluYWN0aXZlIixbInVzZXJOYW1lIl09IkNNYWtlLzMuMTEuMSIsWyJ3ViJdPSIwMDAwMDAwMDMuMDAwMDAwMDExLjAwMDAwMDAwMS4qemZpbmFsIix9LEZGVFc9e1siZm4iXT0iL29wdC9tb2R1bGVzL01QSS9pbnRlbC8yMDE5LjMuMTk5LUdDQy04LjMuMC0yLjMyL2ltcGkvMjAxOS4zLjE5OS9GRlRXLzMuMy44Lmx1YSIsWyJmdWxsTmFtZSJdPSJGRlRXLzMuMy44IixbImxvYWRPcmRlciJdPTE3LHByb3BUPXt9LFsic3RhY2tEZXB0aCJd
BASH_ENV=/usr/local/lmod/lmod/init/bash
LOGNAME=...
QTLIB=/usr/lib64/qt-3.3/lib
CVS_RSH=ssh
...
With
HDF5_ROOT=/mnt/home/.../Research/hdf5-1.12.0 cmake -DDMRG=ON -DNEVPT2=ON ../OpenMolcas
(after make clean and module purge), I again got:
loading initial cache file /mnt/home/.../Research/openmolcas-2/build/External/qcmaquis/tmp/qcmaquis-cache-Release.cmake
-- Enabled symmetries: TwoU1PG;SU2U1PG
GSL_DEFINITIONS=
GSL_INCLUDE_DIRS=/opt/software/GSL/2.6-GCCcore-9.3.0/include
GSL_CFLAGS=-I/opt/software/GSL/2.6-GCCcore-9.3.0/include
-- Using GSL from
-- FindGSL: Found both GSL headers and library
-- Lapack include:
-- Lapack lib dirs:
-- Lapack libs: /mnt/home/.../Research/openmolcas-2/build/lib/liblapack.a;/mnt/home/.../Research/openmolcas-2/build/lib/libblas.a;gfortran;m;gcc_s;gcc;quadmath;m;gcc_s;gcc;c;gcc_s;gcc
-- HDF5 C compiler wrapper is unable to compile a minimal HDF5 program.
CMake Warning at /opt/software/CMake/3.20.1-GCCcore-9.3.0/share/cmake-3.20/Modules/FindHDF5.cmake:742 (message):
HDF5 found for language C is not parallel but previously found language is
parallel.
Call Stack (most recent call first):
CMakeLists.txt:221 (find_package)
-- Found HDF5: /opt/software/HDF5/1.12.0-gompi-2020a/lib/libhdf5.so (found version "1.12.0")
CMake Warning at alps/CMakeLists.txt:49 (MESSAGE):
parallel(MPI) hdf5 is detected. We will compile but ALPS does not use
parallel HDF5. The standard version is preferred.
CMake Error at alps/CMakeLists.txt:51 (MESSAGE):
parallel(MPI) hdf5 needs MPI. Enable MPI or install serial HDF5 libraries.
From the compilation output, it seems that the hdf5 library being used is still the module one (/opt/software/HDF5/1.12.0-gompi-2020a/lib/libhdf5.so), and not the one installed in /mnt/home/.../Research/hdf5-1.12.0. Maybe the hdf5 module is still loaded and, therefore, the code still looks there and not in the local installation?
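A sketch of a fully clean reconfigure along those lines (paths are placeholders; note that FindHDF5 queries whichever h5cc/h5pcc wrapper it finds in PATH, as the "hdf5 compiler wrapper" lines above indicate, so the module wrapper must not be loaded):
module purge
module list                                   # confirm no HDF5/MPI modules remain
rm -rf build && mkdir build && cd build       # start from an empty build directory
HDF5_ROOT=/path/to/local/serial/hdf5 cmake -DDMRG=ON -DNEVPT2=ON ../OpenMolcas
make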
I contacted a staff member of the cluster system I am working on, and QCMaquis with OpenMolcas is now installed. I didn't ask how they managed to get it to work; perhaps it is related to the cluster setup (there used to be some issues with other modules).
https://gitlab.com/Molcas/OpenMolcas/-/blob/master/test/qcmaquis/011.input and https://gitlab.com/Molcas/OpenMolcas/-/blob/master/test/qcmaquis/012.input finish with "Happy landing", but https://gitlab.com/Molcas/OpenMolcas/-/blob/master/test/qcmaquis/001.input ends with:
File "qcmaquis" does not exist
>>> IF (-FILE TEST_QCMAQUIS)
>>> EXIT _RC_NOT_AVAILABLE_
.##################################.
.# Requested module not available #.
.##################################.
Note that 001.input is about 4 years old. I am very new to the OpenMolcas and QCMaquis syntax. May I ask whether this result for 001.input is normal?
Sorry for the late reply. Is the text that you pasted the final part of the output you obtain by running pymolcas 001.input? If yes, then there is still a problem -- you should get something like:
Note: The following floating-point exceptions are signalling: IEEE_UNDERFLOW_FLAG IEEE_DENORMAL
--- Stop Module: dmrgscf at Sun Apr 23 17:26:45 2023 /rc=_RC_ALL_IS_WELL_ ---
*** files: 001.rasscf.molden 001.RasOrb 001.RasOrb.1 001.dmrgscf.h5 001.SpdOrb.1 xmldump
saved to directory /home/cds/abaiardi/local/src/OpenMolcas/build
--- Module dmrgscf spent 14 seconds ---
Happy landing!
Timing: Wall=15.85 User=17.88 System=0.59
Can you maybe copy the complete output?
Thanks for your reply (and sorry for my late response). Here is the complete output (userid edited). Perhaps I should modify 011.input to another molecule/method/basis set, rather than building on top of 001.input.
   [OpenMolcas ASCII-art banner]
              version: v22.10
                  tag:
OpenMolcas is free software; you can redistribute it and/or modify it
under the terms of the GNU Lesser General Public License version 2.1.
OpenMolcas is distributed in the hope that it will be useful, but it
is provided "as is" and without any express or implied warranties.
For more details see the full text of the license in the file
LICENSE or in <http://www.gnu.org/licenses/>.
Copyright (C) The OpenMolcas Authors
For the author list and the recommended citation,
consult the file CONTRIBUTORS.md
*************************************************
* pymolcas version py2.23 *
* build d53e6f8e18683222a6e02cb84ce98eb3 *
* (after the EMIL interpreter by V. Veryazov) *
*************************************************
configuration info
------------------
Host name: dev-intel16-k80 (Linux)
C Compiler ID: Intel
C flags: -O2 -xSSE2 -ftz -fp-speculation=safe -fp-model source -std=gnu99 -gcc-sys -qopenmp
Fortran Compiler ID: Intel
Fortran flags: -O2 -xSSE2 -ftz -fp-speculation=safe -fp-model source -fpp -i8 -heap-arrays -qopenmp
Definitions: _MOLCAS_;_I8_;_LINUX_
Parallel: OFF (GA=ON)
-----------------------------------------------------------------------
|
| Project: 001
| Submitted from: /mnt/ufs18/home-179/.../Research
| Scratch area: /tmp/001
| Save outputs to: /mnt/ufs18/home-179/.../Research
| Molcas: /opt/software/OpenMolcas/22.10-intel-2022a
|
| Scratch area is NOT empty
|
| MOLCAS_DRIVER = /opt/software/OpenMolcas/22.10-intel-2022a/pymolcas
| MOLCAS_NPROCS = 1
| MOLCAS_SOURCE = /opt/software/OpenMolcas/22.10-intel-2022a
| MOLCAS_STRUCTURE = 0
|
-----------------------------------------------------------------------
++ --------- Input file ---------
>>> RM -FORCE TEST_QCMAQUIS
>>> IF ($MOLCAS_DRIVER == UNKNOWN_VARIABLE)
>>> EXPORT MOLCAS_DRIVER = molcas
>>> END IF
>>> SHELL $MOLCAS_DRIVER have_feature qcmaquis || touch TEST_QCMAQUIS
>>> IF (-FILE TEST_QCMAQUIS)
>>> EXIT 36
>>> END IF
&GATEWAY
coord
2
Angstrom
N 0.000000 0.000000 -0.54880
N 0.000000 0.000000 0.54880
basis=cc-pvdz
&SEWARD
&SCF
&DMRGSCF
ActiveSpaceOptimizer=QCMaquis
DMRGSettings
conv_thresh = 1e-4
truncation_final = 1e-5
ietl_jcd_tol = 1e-6
nsweeps = 4
max_bond_dimension = 100
EndDMRGSettings
OOptimizationSettings
inactive = 2 0 0 0 2 0 0 0
RAS2 = 1 1 1 0 1 1 1 0
ITER = 15,100
SOCC = 2,2,2,0,0,0
LINEAR
EndOOptimizationSettings
-- ----------------------------------
>>> RM -FORCE TEST_QCMAQUIS
>>> IF ($MOLCAS_DRIVER == UNKNOWN_VARIABLE)
(Skipped)
>>> SHELL /opt/software/OpenMolcas/22.10-intel-2022a/pymolcas have_feature qcmaquis || touch TEST_QCMAQUIS
File "qcmaquis" does not exist
>>> IF (-FILE TEST_QCMAQUIS)
>>> EXIT _RC_NOT_AVAILABLE_
.##################################.
.# Requested module not available #.
.##################################.
I believe that the problem is not related to the specific test case -- it actually seems that the DMRG module has not been found. Could you double-check that the DMRG CMake option was set to ON before compiling OpenMolcas?
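Two quick checks along these lines (a sketch; pymolcas have_feature is the same probe used at the top of 001.input, and CMakeCache.txt can only be inspected if you have access to the OpenMolcas build directory):
pymolcas have_feature qcmaquis && echo "qcmaquis available" || echo "qcmaquis NOT available"
grep -i "^DMRG" /path/to/OpenMolcas/build/CMakeCache.txt   # should show DMRG:BOOL=ON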
Sorry for the delayed reply. I need to reach out to the cluster staff to see how they compiled it. At the moment, I am more focused on another problem posted in the issues (about max_bond_dimension).