Valentin Churavy
> , but I'm not holding my breath.

I would encourage you to contribute. Adding an intrinsic is a relatively small change.
I tried reproducing the Enzyme MWE with `examples/jit.jl`, but couldn't get it to trigger. @maleadt, do you recall why you originally limited this to `toplevel`? I assume because it wouldn't make...
Hm... Doesn't the inner call resolve its own deferred codegen first and then both of them get linked into the outer one?
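To make the question concrete, here is a hypothetical sketch (toy names, not GPUCompiler's actual API) of the resolution order being asked about: each function's deferred calls are resolved depth-first, so the inner call's deferred codegen completes before both get linked into the outer module.

```julia
# Toy model of deferred codegen: each "IR" records which functions it still
# needs to compile via deferred codegen.
struct FakeIR
    name::Symbol
    deferred::Vector{Symbol}  # functions this IR calls through deferred codegen
end

const SOURCES = Dict(
    :outer => FakeIR(:outer, [:inner]),
    :inner => FakeIR(:inner, Symbol[]),
)

# Resolve deferred calls depth-first: the inner call is compiled and linked
# before the outer one finishes.
function resolve(name::Symbol, linked::Vector{Symbol}=Symbol[])
    ir = SOURCES[name]
    for dep in ir.deferred
        resolve(dep, linked)  # inner deferred codegen resolved first
    end
    push!(linked, name)       # then this function is linked into the result
    return linked
end

resolve(:outer)  # -> [:inner, :outer]
```

Under this model the inner call is indeed self-contained by the time the outer module links it in.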
Hm, I am a bit confused about how https://github.com/JuliaGPU/CUDA.jl/blob/c2d444b0f5a76f92c5ba6bc1534a53319218b563/test/core/execution.jl#L974-L1002 works. I just pushed a commit that returns the `job` variable from the `!toplevel` compilation.
Yeah I can try to improve this while working on #582
Ah, I see: for https://github.com/JuliaGPU/CUDA.jl/blob/c2d444b0f5a76f92c5ba6bc1534a53319218b563/test/core/execution.jl#L974-L975 we have an infinite loop, since we don't pass through the list of already-codegen'd functions. So we recurse into the original top-level and carry...
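The failure mode can be sketched in isolation (hypothetical names, not the actual compiler code): if the set of already-codegen'd functions is threaded through the recursion, a cycle of deferred calls terminates; without it, the recursion into the original top-level function never bottoms out.

```julia
# Toy deferred-call graph with a cycle, mimicking a kernel whose deferred
# call leads back to the original top-level function.
const DEFERRED = Dict(
    :kernel => [:helper],
    :helper => [:kernel],  # deferral back to the top level
)

# Threading `seen` through the recursion breaks the cycle: a function that
# was already codegen'd is not compiled again.
function codegen!(name::Symbol, seen::Set{Symbol}=Set{Symbol}())
    name in seen && return seen  # already emitted: stop recursing
    push!(seen, name)
    for dep in get(DEFERRED, name, Symbol[])
        codegen!(dep, seen)
    end
    return seen
end

codegen!(:kernel)  # terminates with Set([:kernel, :helper])
```

Dropping the `seen` argument (allocating a fresh set per call) reproduces the infinite recursion described above.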
This needs tests. Making it GPU-only is fine given #533.
> [!WARNING]
> This pull request is not mergeable via GitHub because a downstack PR is open. Once all requirements are satisfied, merge this PR as a stack on Graphite....
@nanosoldier `runtests(ALL, vs = ":master", configuration = (buildflags=["LLVM_ASSERTIONS=1", "FORCE_ASSERTIONS=1"],), vs_configuration = (buildflags = ["LLVM_ASSERTIONS=1", "FORCE_ASSERTIONS=1"],))`
What prevents you from `eval`ing into the caller's module? Or is this coming from a non-macro context?
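For reference, a minimal sketch of the suggested pattern (`Demo` and `answer` are made-up names): `Core.eval(mod, expr)` evaluates an expression inside an arbitrary module, and inside a macro body `__module__` is bound to the call site's module, so the same call reaches the caller's module.

```julia
# A stand-in for the caller's module.
module Demo end

# Define a constant inside Demo from the outside; a non-macro context just
# needs the module object passed in explicitly, while a macro can use the
# implicit `__module__` binding instead.
Core.eval(Demo, :(const answer = 42))

Demo.answer  # 42
```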