ERROR: LoadError: "failed to run pass manager on module" on Reactant script with Oceananigans
Running this script: https://github.com/DJ4Earth/Enzymanigans.jl/blob/jlk9/with-simulation/dynamical_core/autodiff_double_gyre_reactant.jl
Produces this error:
Which includes this file printed to disk:
@wsmoses
This is a raising failure, so I moved it here.
Looks like mergeparallelindices in affinecfg isn't applying to it for some reason.
It doesn't like a compare against zero, here:
(base) wmoses@eduroamprvnat-172-16-1-99 Enzyme-JAX % ./bazel-bin/enzymexlamlir-opt --print-location ./raisef.mlir
float.jl:411:0: remark: "float.jl":411:0 (@*; in ./float.jl:0)
called from: "/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl":24:0 (@ℑyᵃᶜᵃ; in .//Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl:0)
called from: "/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl":46:0 (@ℑxyᶠᶜᵃ; in .//Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl:0)
called from: "/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl":128:0 (@active_weighted_ℑxyᶠᶜᶜ; in .//Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl:0)
called from: "/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Coriolis/hydrostatic_spherical_coriolis.jl":62:0 (@x_f_cross_U; in .//Users/jkump/.julia/packages/Oceananigans/vDE82/src/Coriolis/hydrostatic_spherical_coriolis.jl:0)
called from: "/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Models/HydrostaticFreeSurfaceModels/hydrostatic_free_surface_tendency_kernel_functions.jl":47:0 (@hydrostatic_free_surface_u_velocity_tendency; in .//Users/jkump/.julia/packages/Oceananigans/vDE82/src/Models/HydrostaticFreeSurfaceModels/hydrostatic_free_surface_tendency_kernel_functions.jl:0)
called from: "/Users/jkump/.julia/packages/KernelAbstractions/sWSE0/src/macros.jl":322:0 (@gpu_compute_hydrostatic_free_surface_Gu!; in .//Users/jkump/.julia/packages/KernelAbstractions/sWSE0/src/macros.jl:0)
called from: "none":0:0 (@gpu_compute_hydrostatic_free_surface_Gu! in ./none:0)
/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl:24:0: note: called from
/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl:46:0: note: called from
/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl:128:0: note: called from
/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Coriolis/hydrostatic_spherical_coriolis.jl:62:0: note: called from
/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Models/HydrostaticFreeSurfaceModels/hydrostatic_free_surface_tendency_kernel_functions.jl:47:0: note: called from
/Users/jkump/.julia/packages/KernelAbstractions/sWSE0/src/macros.jl:322:0: note: called from
none:0:0: note: called from
float.jl:411:0: note: see current operation: %184 = arith.mulf %183, %cst_0 {enzyme.location = "\22float.jl\22:411:0 (@*; in ./float.jl:0)\0Acalled from: \22/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl\22:24:0 (@\E2\84\91y\E1\B5\83\E1\B6\9C\E1\B5\83; in .//Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl:0)\0Acalled from: \22/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl\22:46:0 (@\E2\84\91xy\E1\B6\A0\E1\B6\9C\E1\B5\83; in .//Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl:0)\0Acalled from: \22/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl\22:128:0 (@active_weighted_\E2\84\91xy\E1\B6\A0\E1\B6\9C\E1\B6\9C; in .//Users/jkump/.julia/packages/Oceananigans/vDE82/src/Operators/interpolation_operators.jl:0)\0Acalled from: \22/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Coriolis/hydrostatic_spherical_coriolis.jl\22:62:0 (@x_f_cross_U; in .//Users/jkump/.julia/packages/Oceananigans/vDE82/src/Coriolis/hydrostatic_spherical_coriolis.jl:0)\0Acalled from: \22/Users/jkump/.julia/packages/Oceananigans/vDE82/src/Models/HydrostaticFreeSurfaceModels/hydrostatic_free_surface_tendency_kernel_functions.jl\22:47:0 (@hydrostatic_free_surface_u_velocity_tendency; in .//Users/jkump/.julia/packages/Oceananigans/vDE82/src/Models/HydrostaticFreeSurfaceModels/hydrostatic_free_surface_tendency_kernel_functions.jl:0)\0Acalled from: \22/Users/jkump/.julia/packages/KernelAbstractions/sWSE0/src/macros.jl\22:322:0 (@gpu_compute_hydrostatic_free_surface_Gu!; in .//Users/jkump/.julia/packages/KernelAbstractions/sWSE0/src/macros.jl:0)\0Acalled from: \22none\22:0:0 (@gpu_compute_hydrostatic_free_surface_Gu! in ./none:0)", enzyme.print_location, fastmathFlags = #llvm.fastmath<none>} : f64
#set = affine_set<(d0, d1, d2, d3) : (d2 + d1 * 16 >= 0, d1 * -16 - d2 + 61 >= 0, d0 + d3 * 16 >= 0, -d0 - d3 * 16 + 61 >= 0)>
cc @glwagner in case you know what could be comparing against a 0.0 float
Hmm, this looks pretty vanilla to me. @jlk9 can you run this with just one closure, or with closure = nothing, to check whether that works?
I am also a little leery of no_slip_bc = ValueBoundaryCondition(0.0) --- maybe omit this too, or use an array of zeros / an empty Field, which we know will work.
Also, as a matter of practice, we try to convert all floats to the right type under the hood (eg if using Float32), but the boundary condition is one case we don't handle right now (maybe we should), so you probably want to use no_slip_bc = ValueBoundaryCondition(0) (and, in general, avoid hard-coding Float64 where possible).
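To make the suggestion concrete, here is a minimal sketch (the variable name is illustrative, not taken from the original script):

```julia
using Oceananigans

# Suggested form: an integer zero stays type-neutral, so it won't pin the
# boundary value to Float64 on a Float32 grid.
no_slip_bc = ValueBoundaryCondition(0)

# Form to avoid: 0.0 is a Float64 literal, and boundary-condition values
# are one place Oceananigans does not currently convert to the grid's
# float type.
# no_slip_bc = ValueBoundaryCondition(0.0)
```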
There is a 0 comparison here
https://github.com/CliMA/Oceananigans.jl/blob/586ade72f25956e363de52ab1e566af49c5f21f0/src/Operators/interpolation_operators.jl#L121-L125
and related operators below. But we compile through this successfully here, for example: https://github.com/PRONTOLab/GB-25/blob/main/simulations/baroclinic_instability_simulation_run.jl
In fact, a good way to proceed might be to start from that working case and add complexity incrementally.
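For context, the linked operators boil down to a masked average whose weight can be zero; the following is a schematic paraphrase, not the actual Oceananigans source:

```julia
# Schematic sketch of an active-cell-weighted interpolation: average two
# neighboring values, counting only "active" cells, and guard the division
# for the case where both weights are zero. The comparison against zero in
# the guard is the kind of operation the raising pass appears to choke on.
function weighted_average(w₁, w₂, f₁, f₂)
    w = w₁ + w₂
    return ifelse(w == 0, zero(f₁), (w₁ * f₁ + w₂ * f₂) / w)
end
```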
I tried closure = nothing, using 0 or an empty field instead of 0.0 in no_slip_bc, and omitting the boundary conditions altogether. Each of those still produced this error.
I think Greg's idea of starting from the successfully compiled case and adding complexity is good - I'll try building in that direction.
This is fixable from team kernel, but if it's not GB-blocking it's probably not going to be fixed until after April 15.
I realize this is on the back burner until after 4/15, but I think I've isolated the cause of this error. I created a similar Oceananigans + Reactant script here: https://github.com/DJ4Earth/Enzymanigans.jl/blob/jlk9/with-simulation/dynamical_core/reactant_attempt.jl
Here is where we create the LatitudeLongitudeGrid:
https://github.com/DJ4Earth/Enzymanigans.jl/blob/a352ccbcaf86275d8e9f6dac0b17d3d12671b750/dynamical_core/reactant_attempt.jl#L32C1-L36C6
```julia
grid = LatitudeLongitudeGrid(arch; size=(Nx, Ny, Nz), halo, z,
                             longitude = (0, 360), # Problem is here: when the longitude topology is not Periodic, we get the error
                             latitude = (15, 75),
                             topology = (Bounded, Bounded, Bounded)
                             )
```
Specifying a Bounded topology for the longitude dimension produces the "failed to run pass manager on module" error above. If we instead set topology = (Periodic, Bounded, Bounded), the script runs without error.
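For reference, a sketch of the variant that compiles (grid parameters are placeholders here; the real values live in the linked script):

```julia
using Oceananigans

arch = CPU()              # placeholder; the script uses a Reactant architecture
Nx, Ny, Nz = 62, 62, 10   # placeholder resolution, not the script's values

# Same grid as above, but with a Periodic longitude topology, which is the
# configuration that compiles without the pass-manager error.
grid = LatitudeLongitudeGrid(arch;
                             size = (Nx, Ny, Nz),
                             longitude = (0, 360),
                             latitude = (15, 75),
                             z = (-1000, 0),       # placeholder depth range
                             topology = (Periodic, Bounded, Bounded))
```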
This specific error output is no longer reproduced with the current versions of Reactant / Oceananigans, but we now get a "DimensionMismatch" error. I've made a new issue for it here: https://github.com/EnzymeAD/Reactant.jl/issues/1264.