
Shell Convection IVP Example does not run with mpiexec (new conda installation)

Open ntlewis opened this issue 6 months ago • 3 comments

Hi All,

Recently I tried to update my dedalus installation. To do this, I just ran conda install -c conda-forge dedalus within an empty environment.

With this installation, I can no longer run the shell convection example in parallel (i.e., with mpiexec). In serial, the code runs fine.

The error I get is included at the bottom. It comes from the rvec*lift(tau_u1) part of the first-order reduction. I imagine that something has gone wrong with basis.radial_basis? The error itself is a TypeError that seems to come from mpi4py (specifically: line 47, in mpi4py.MPI.getarray: TypeError: 'numpy.bool' object cannot be interpreted as an integer).
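
For reference, here is a minimal sketch that mirrors the failing call outside of Dedalus (run with mpiexec; the array contents are illustrative, not Dedalus's actual deploy_dims):

    # Minimal sketch of the failing pattern: a NumPy boolean array passed
    # as remain_dims to Cartcomm.Sub, as in dedalus/core/arithmetic.py.
    # Run with: mpiexec -n 2 python repro.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    cart = comm.Create_cart(dims=[comm.size])   # 1D Cartesian communicator
    remain_dims = np.array([True])              # entries are numpy.bool scalars
    # With the versions reported here (NumPy 2.3, mpi4py 4.x) this raises:
    # TypeError: 'numpy.bool' object cannot be interpreted as an integer
    sub = cart.Sub(remain_dims=remain_dims)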

I get this issue with mpi4py>4.0.0 (I have tested 4.0.3, which is what conda initially installed, and the issue persists when I downgrade to 4.0.1). I tried to specify mpi4py=4.0.0, but conda told me this was incompatible with something. (For info, after downgrading to 4.0.1 the dedalus version is 3.0.3.)

I have a previous installation with dedalus 3.0.2 and mpi4py 4.0.0 and everything works fine.

Would you guys be able to take a look at this, and let me know if there is anything I can do to fix it?

Thanks,

Neil

Full conda installation info:

packages in environment at /cosma/home/dp015/dc-lewi5/miniforge3/envs/dedalus3:

Name  Version  Build  Channel
_libgcc_mutex 0.1 conda_forge conda-forge
_openmp_mutex 4.5 2_gnu conda-forge
attr 2.5.1 h166bdaf_1 conda-forge
brotli 1.1.0 hb9d3cd8_3 conda-forge
brotli-bin 1.1.0 hb9d3cd8_3 conda-forge
bzip2 1.0.8 h4bc722e_7 conda-forge
c-ares 1.34.5 hb9d3cd8_0 conda-forge
ca-certificates 2025.6.15 hbd8a1cb_0 conda-forge
cached-property 1.5.2 hd8ed1ab_1 conda-forge
cached_property 1.5.2 pyha770c72_1 conda-forge
colorama 0.4.6 pyhd8ed1ab_1 conda-forge
contourpy 1.3.2 py313h33d0bda_0 conda-forge
coverage 7.9.1 py313h8060acc_0 conda-forge
cycler 0.12.1 pyhd8ed1ab_1 conda-forge
dedalus 3.0.3 py313h07054c4_1 conda-forge
dedalus-composer 0.1 pypi_0 pypi
docopt 0.6.2 pyhd8ed1ab_2 conda-forge
exceptiongroup 1.3.0 pyhd8ed1ab_0 conda-forge
fftw 3.3.10 mpi_mpich_hbcf76dd_10 conda-forge
fonttools 4.58.4 py313h8060acc_0 conda-forge
freetype 2.13.3 ha770c72_1 conda-forge
h5py 3.12.1 mpi_mpich_py313h33fceeb_3 conda-forge
hdf5 1.14.4 mpi_mpich_h7f58efa_5 conda-forge
icu 75.1 he02047a_0 conda-forge
iniconfig 2.0.0 pyhd8ed1ab_1 conda-forge
keyutils 1.6.1 h166bdaf_0 conda-forge
kiwisolver 1.4.7 py313h33d0bda_0 conda-forge
krb5 1.21.3 h659f571_0 conda-forge
lcms2 2.17 h717163a_0 conda-forge
ld_impl_linux-64 2.43 h1423503_5 conda-forge
lerc 4.0.0 h0aef613_1 conda-forge
libaec 1.1.4 h3f801dc_0 conda-forge
libblas 3.9.0 31_h59b9bed_openblas conda-forge
libbrotlicommon 1.1.0 hb9d3cd8_3 conda-forge
libbrotlidec 1.1.0 hb9d3cd8_3 conda-forge
libbrotlienc 1.1.0 hb9d3cd8_3 conda-forge
libcap 2.75 h39aace5_0 conda-forge
libcblas 3.9.0 31_he106b2a_openblas conda-forge
libcbor 0.10.2 hcb278e6_0 conda-forge
libcurl 8.14.1 h332b0f4_0 conda-forge
libdeflate 1.24 h86f0d12_0 conda-forge
libedit 3.1.20250104 pl5321h7949ede_0 conda-forge
libev 4.33 hd590300_2 conda-forge
libexpat 2.7.0 h5888daf_0 conda-forge
libfabric 2.1.0 ha770c72_1 conda-forge
libfabric1 2.1.0 hf45584d_1 conda-forge
libffi 3.4.6 h2dba641_1 conda-forge
libfido2 1.15.0 hdd1f21f_0 conda-forge
libfreetype 2.13.3 ha770c72_1 conda-forge
libfreetype6 2.13.3 h48d6fc4_1 conda-forge
libgcc 15.1.0 h767d61c_2 conda-forge
libgcc-ng 15.1.0 h69a702a_2 conda-forge
libgcrypt-lib 1.11.1 hb9d3cd8_0 conda-forge
libgfortran 15.1.0 h69a702a_2 conda-forge
libgfortran-ng 15.1.0 h69a702a_2 conda-forge
libgfortran5 15.1.0 hcea5267_2 conda-forge
libgomp 15.1.0 h767d61c_2 conda-forge
libgpg-error 1.55 h3f2d84a_0 conda-forge
libhwloc 2.11.2 default_h0d58e46_1001 conda-forge
libiconv 1.18 h4ce23a2_1 conda-forge
libjpeg-turbo 3.1.0 hb9d3cd8_0 conda-forge
liblapack 3.9.0 31_h7ac8fdf_openblas conda-forge
liblzma 5.8.1 hb9d3cd8_2 conda-forge
libmpdec 4.0.0 hb9d3cd8_0 conda-forge
libnghttp2 1.64.0 h161d5f1_0 conda-forge
libnl 3.11.0 hb9d3cd8_0 conda-forge
libopenblas 0.3.29 pthreads_h94d23a6_0 conda-forge
libpng 1.6.49 h943b412_0 conda-forge
libsqlite 3.50.1 hee588c1_0 conda-forge
libssh2 1.11.1 hcf80075_0 conda-forge
libstdcxx 15.1.0 h8f9b012_2 conda-forge
libstdcxx-ng 15.1.0 h4852527_2 conda-forge
libsystemd0 257.6 h4e0b6ca_0 conda-forge
libtiff 4.7.0 hf01ce69_5 conda-forge
libudev1 257.6 hbe16f8c_0 conda-forge
libuuid 2.38.1 h0b41bf4_0 conda-forge
libwebp-base 1.5.0 h851e524_0 conda-forge
libxcb 1.17.0 h8a09558_0 conda-forge
libxcrypt 4.4.36 hd590300_1 conda-forge
libxml2 2.13.8 h4bc477f_0 conda-forge
libzlib 1.3.1 hb9d3cd8_2 conda-forge
lz4-c 1.10.0 h5888daf_1 conda-forge
matplotlib-base 3.10.3 py313h129903b_0 conda-forge
mpi 1.0.1 mpich conda-forge
mpi4py 4.0.1 py313h7246b6a_1 conda-forge
mpich 4.3.0 h1a8bee6_100 conda-forge
munkres 1.1.4 pyhd8ed1ab_1 conda-forge
ncurses 6.5 h2d0b736_3 conda-forge
nomkl 1.0 h5ca1d4c_0 conda-forge
numexpr 2.10.2 py313h5f97788_100 conda-forge
numpy 2.3.0 py313h17eae1a_0 conda-forge
openjpeg 2.5.3 h5fbd93e_0 conda-forge
openssh 10.0p1 hc830a30_0 conda-forge
openssl 3.5.0 h7b32b05_1 conda-forge
packaging 25.0 pyh29332c3_1 conda-forge
pandas 2.3.0 py313ha87cce1_0 conda-forge
pillow 11.2.1 py313h8db990d_0 conda-forge
pip 25.1.1 pyh145f28c_0 conda-forge
pluggy 1.6.0 pyhd8ed1ab_0 conda-forge
pthread-stubs 0.4 hb9d3cd8_1002 conda-forge
py 1.11.0 pyhd8ed1ab_1 conda-forge
py-cpuinfo 9.0.0 pyhd8ed1ab_1 conda-forge
pygments 2.19.1 pyhd8ed1ab_0 conda-forge
pyparsing 3.2.3 pyhd8ed1ab_1 conda-forge
pytest 8.4.0 pyhd8ed1ab_0 conda-forge
pytest-benchmark 5.1.0 pyhd8ed1ab_2 conda-forge
pytest-cov 6.2.1 pyhd8ed1ab_0 conda-forge
pytest-parallel 0.1.1 pyhd8ed1ab_0 conda-forge
python 3.13.5 hec9711d_102_cp313 conda-forge
python-dateutil 2.9.0.post0 pyhff2d567_1 conda-forge
python-tzdata 2025.2 pyhd8ed1ab_0 conda-forge
python_abi 3.13 7_cp313 conda-forge
pytz 2025.2 pyhd8ed1ab_0 conda-forge
qhull 2020.2 h434a139_5 conda-forge
rdma-core 57.0 h5888daf_0 conda-forge
readline 8.2 h8c095d6_2 conda-forge
scipy 1.15.2 py313h86fcf2b_0 conda-forge
six 1.17.0 pyhd8ed1ab_0 conda-forge
tblib 3.1.0 pyhd8ed1ab_0 conda-forge
tk 8.6.13 noxft_hd72426e_102 conda-forge
toml 0.10.2 pyhd8ed1ab_1 conda-forge
tomli 2.2.1 pyhd8ed1ab_1 conda-forge
typing_extensions 4.14.0 pyhe01879c_0 conda-forge
tzdata 2025b h78e105d_0 conda-forge
ucx 1.18.1 h1369271_0 conda-forge
xarray 2025.6.1 pyhd8ed1ab_1 conda-forge
xorg-libxau 1.0.12 hb9d3cd8_0 conda-forge
xorg-libxdmcp 1.1.5 hb9d3cd8_0 conda-forge
zstd 1.5.7 hb8e6e7a_2 conda-forge

Traceback for error:

Traceback (most recent call last):
  File "/cosma/home/dp015/dc-lewi5/dedalus_experiments/spherical_shell/internally_heated/dedalus_example/ded_example.py", line 70, in <module>
    grad_u = d3.grad(u) + rvec*lift(tau_u1) # First-order reduction
                          ~~~~^~~~~~~~~~~~~
  File "/cosma/home/dp015/dc-lewi5/miniforge3/envs/dedalus3_new/lib/python3.13/site-packages/dedalus/core/field.py", line 107, in __mul__
    return Multiply(self, other)
  File "/cosma/home/dp015/dc-lewi5/miniforge3/envs/dedalus3_new/lib/python3.13/site-packages/dedalus/tools/dispatch.py", line 44, in __call__
    return subclass(*args, **kw)
  File "/cosma/home/dp015/dc-lewi5/miniforge3/envs/dedalus3_new/lib/python3.13/site-packages/dedalus/tools/dispatch.py", line 23, in __call__
    return super().__call__(*args, **kw)
  File "/cosma/home/dp015/dc-lewi5/miniforge3/envs/dedalus3_new/lib/python3.13/site-packages/dedalus/core/arithmetic.py", line 842, in __init__
    self.arg0_ghost_broadcaster = GhostBroadcaster(arg0.domain, self.dist.grid_layout, broadcast_dims)
  File "/cosma/home/dp015/dc-lewi5/miniforge3/envs/dedalus3_new/lib/python3.13/site-packages/dedalus/core/arithmetic.py", line 880, in __init__
    self.subcomm = domain.dist.comm_cart.Sub(remain_dims=deploy_dims)
  File "src/mpi4py/MPI.src/Comm.pyx", line 3123, in mpi4py.MPI.Cartcomm.Sub
  File "src/mpi4py/MPI.src/asarray.pxi", line 54, in mpi4py.MPI.chkarray
  File "src/mpi4py/MPI.src/asarray.pxi", line 47, in mpi4py.MPI.getarray
TypeError: 'numpy.bool' object cannot be interpreted as an integer

ntlewis avatar Jun 17 '25 14:06 ntlewis

Hi ntlewis,

Just to let you know, I have installed the latest version of Dedalus (3.0.3) with Python 3.10 and NumPy 1.23.5, as there appears to be a compatibility issue between Dedalus and NumPy 2.3.0, specifically with np.bool: the numpy.bool scalar type apparently stopped being interpretable as an integer in NumPy 2.3, which is what the TypeError above shows. For a permanent solution, the Dedalus team should probably replace the usage of np.bool with an appropriate alternative. With Python 3.10 + NumPy 1.23.5, my codes run as usual within my old Dedalus setup.
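
A minimal sketch of the kind of fix being suggested, casting the boolean mask to plain Python ints before it reaches MPI (deploy_dims is the name from the traceback; comm_cart stands in for Dedalus's Cartesian communicator, so this is not Dedalus's actual patch):

    # Sketch only: convert numpy.bool entries to plain Python ints
    # before passing them to MPI.
    import numpy as np

    deploy_dims = np.array([True, False, True])    # numpy.bool mask
    remain_dims = [int(d) for d in deploy_dims]    # [1, 0, 1] as Python ints
    # subcomm = comm_cart.Sub(remain_dims=remain_dims)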

I hope this helps you.

Best, Entropic

EntropicPhys avatar Jun 23 '25 11:06 EntropicPhys

@EntropicPhys Thanks for this! I'll give it a whirl at some point to see if it resolves my issue (I expect it will). I don't have an immediate need, as I have a Dedalus 3.0.2 install that works fine :)

ntlewis avatar Jun 23 '25 14:06 ntlewis

Hi all,

Just confirming that I encountered this issue as well with a new/updated dedalus installation through conda. This is not unique to the shell convection example, but also occurs with the Cartesian Rayleigh-Bénard convection script in parallel. As @EntropicPhys suggested, this appears to specifically be a NumPy 2.3 deprecation issue. I can get a working environment with the command

conda install "numpy<2.3"

even on Python 3.13.
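
For a fresh environment, the same pin can be applied at creation time, e.g. (the environment name here is just an example):

conda create -n dedalus3 -c conda-forge dedalus "numpy<2.3"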

chowland avatar Jun 25 '25 14:06 chowland

Thanks everyone, this should be fixed in https://github.com/DedalusProject/dedalus/commit/4bd724dc8389a31156baf09d0269e2bac32c2516, and we'll release a patch update to pip soon.

kburns avatar Jul 10 '25 20:07 kburns

@kburns hello! The error is still there if I install via conda install. Could you please update the package?

rodionstepanov avatar Sep 03 '25 09:09 rodionstepanov

Yes sorry for the delay, I'll put out a new release today.

kburns avatar Sep 03 '25 10:09 kburns

@kburns Dedalus can now be updated with pip install --upgrade dedalus, but it does not run because of this error:

ValueError: mpi4py.MPI.Session size changed, may indicate binary incompatibility. Expected 40 from C header, got 32 from PyObject

Something is still wrong with the conda install.

rodionstepanov avatar Sep 04 '25 09:09 rodionstepanov

The conda feedstock has not fully updated yet, but will soon. I think the other error is not from Dedalus but indicates an incompatibility between your system MPI and mpi4py version. Maybe try updating mpi4py?

kburns avatar Sep 04 '25 10:09 kburns

Maybe try updating mpi4py?

The latest version, 4.1.0, is installed. I cannot find in the Dedalus documentation which version of MPI is needed. There are many modules available on the cluster: mpi/2021.2.0, mpich/3.4.2-gcc9, mpip/3.5, openmpi/4.1.1-gcc, openmpi/4.1.1-icc, openmpi/5.0.8-gcc.

rodionstepanov avatar Sep 04 '25 11:09 rodionstepanov

Dedalus has loose MPI requirements. If you get this error when you just import mpi4py (and not dedalus), then it is a compatibility issue between the cluster MPI and the mpi4py build, and I'd suggest looking for help on the mpi4py forums. You might also try downgrading to mpi4py<4.
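
A quick way to check is to run mpi4py alone under the same MPI module you load for Dedalus and print the library it reports (if the import itself fails, the problem is between mpi4py and the system MPI, not Dedalus):

    # check_mpi.py -- run with: mpiexec -n 1 python check_mpi.py
    import mpi4py
    from mpi4py import MPI

    print("mpi4py version:", mpi4py.__version__)
    print("MPI library:", MPI.Get_library_version().strip())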

kburns avatar Sep 04 '25 11:09 kburns