KernelAbstractions.jl
Error handling/reporting API
I noticed DiffEqGPU doing a plain error() in GPU code: https://github.com/SciML/DiffEqGPU.jl/blob/dddcb594ce054c0677bc1b18fdabca2fc0c2eaa9/src/perform_step/gpu_tsit5_perform_step.jl#L152
That's of course not great, and leads to inscrutable errors:
```
ERROR: a exception was thrown during kernel execution.
Run Julia on debug level 2 for device stack traces.
```
Running with -g2 doesn't actually help either, because the error function is not inlined, leading to multiple call sites.
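For reference, a minimal sketch of the problematic pattern (a hypothetical kernel, not the actual DiffEqGPU code):

```julia
using KernelAbstractions

@kernel function step_kernel!(u)
    i = @index(Global)
    # A plain error() in device code compiles down to an unconditional trap;
    # on the GPU the message is lost, and all the user sees is the generic
    # "exception was thrown during kernel execution" report above.
    if u[i] < 0
        error("negative value encountered")
    end
    u[i] = sqrt(u[i])
end
```

On the CPU backend this throws a readable exception, but on GPU backends the message does not survive.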
KA.jl should probably offer @error and @assert macros that display an error message and halt execution. On CUDA.jl the latter could be implemented using @cuassert (which has the annoying consequence of breaking CUDA, because it yields a sticky error, so we may not want to rely on it).
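To make the proposal concrete, here is a rough sketch of what such a macro could look like. All names here (@ka_assert, the __report_failure device function) are hypothetical, not existing KA.jl API; the point is that the macro captures the condition text and source location at macro-expansion time, so a backend can surface them from device code:

```julia
# Hypothetical sketch — @ka_assert and __report_failure are made-up names,
# standing in for whatever KA.jl and its backends would actually provide.
macro ka_assert(cond, msg = string(cond))
    file = string(__source__.file)
    line = __source__.line
    quote
        if !$(esc(cond))
            # A backend would lower this to a device-side print of the
            # message and location, followed by an early exit/trap, so the
            # failure is attributable to a single call site even without -g2.
            __report_failure($msg, $file, $line)
        end
    end
end
```

Whether the failure should abort the whole kernel (CUDA's sticky-error behavior via @cuassert) or just the failing work-item is exactly the design question raised above.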