KernelAbstractions.jl
Heterogeneous programming in Julia
Related to https://github.com/JuliaGPU/Metal.jl/issues/101 Using the idea from @N5N3 https://github.com/JuliaGPU/Metal.jl/issues/101#issuecomment-1447420196
Hi! I am trying to use KernelAbstractions.jl to build an FDTD (Finite-Difference Time-Domain) simulation package. From what I understand, everything works fine except for multi-GPU. I am having trouble...
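A minimal sketch of the kind of single-backend kernel involved here (illustrative only; the names `update_E!`, `E`, `H`, and the coefficient `c` are assumptions, not from the issue). It shows a 1-D FDTD-style field update written portably with KernelAbstractions, which runs on any single backend; the multi-GPU question is about distributing such kernels across devices:

```julia
using KernelAbstractions

# Hypothetical 1-D FDTD-style update: E[i] += c * (H[i] - H[i-1]).
# A guard keeps the stencil in bounds instead of relying on padding.
@kernel function update_E!(E, @Const(H), c)
    i = @index(Global)
    if 1 < i <= length(E)
        @inbounds E[i] += c * (H[i] - H[i-1])
    end
end

E = zeros(Float32, 128)
H = rand(Float32, 128)
backend = CPU()  # swap for a GPU backend, e.g. CUDABackend()
update_E!(backend, 64)(E, H, 0.5f0; ndrange = length(E))
KernelAbstractions.synchronize(backend)
```

The same kernel object can be instantiated for different backends, which is what makes it a natural starting point before tackling device-to-device halo exchange for multi-GPU runs.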
It's a bit annoying to not be able to use return statements. Currently you have to do something like this, but it looks weird: ```julia @kernel function ker!(args...) i =...
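The workaround the issue alludes to can be sketched like this (a hypothetical kernel; `ker!` and its arguments are illustrative): since `return` is not allowed inside an `@kernel` body, the early exit is expressed as an `if` guard wrapping the rest of the kernel, which is what makes the code "look weird":

```julia
using KernelAbstractions

# Instead of `i > length(out) && return`, the whole body must be
# wrapped in a guard, because `return` is rejected inside @kernel.
@kernel function ker!(out, @Const(A))
    i = @index(Global)
    if i <= length(out)   # guard replaces the early `return`
        @inbounds out[i] = 2 * A[i]
    end
end
```

The extra indentation level is harmless for a short kernel but becomes awkward when several independent early-exit conditions would otherwise each be a one-line `return`.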
The kinds of kernels expressible in KernelAbstractions are currently limited by the CPU support having to operate at the macro level. With the CPU support being implemented in...
MWE, reduced from the tests: ```julia using KernelAbstractions @kernel function reduce_private(out, A) I = @index(Global, NTuple) i = @index(Local) priv = @private eltype(A) (1,) @inbounds begin priv[1] = zero(eltype(A)) for...
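For context, a hedged sketch of how `@private` is typically used (this is not the truncated test kernel above; the kernel name and loop are illustrative): each work-item gets its own private slot that persists across the kernel body, here used as a per-item accumulator:

```julia
using KernelAbstractions

# Illustrative use of @private: a per-work-item accumulator of the
# element type of A, declared with shape (1,).
@kernel function accumulate_private!(out, @Const(A))
    i = @index(Global)
    priv = @private eltype(A) (1,)
    @inbounds begin
        priv[1] = zero(eltype(A))
        priv[1] += A[i]
        out[i] = priv[1]
    end
end
```

On the CPU backend, private memory is what preserves per-item state when the implementation unrolls work-items, which is why bugs in it tend to show up in exactly this style of reduction test.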
Hi, I noticed that the following script produces different results depending on the backend. On my machine, the output is: ```julia cpu: [18.0; 18.0; 18.0; 18.0; 18.0; 18.0; 18.0; 18.0;...
As the title suggests, there is a behaviour discrepancy between `import KernelAbstractions` and `using KernelAbstractions` in relation to the `@Const` macro. The following minimum example demonstrates the issue. ```julia import...
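A plausible way the scoping difference shows up (a sketch, not the issue's exact reproducer): `using KernelAbstractions` brings the exported macros into scope, while `import KernelAbstractions` does not, so `@Const` must then be imported or qualified explicitly:

```julia
import KernelAbstractions
# With `import`, exported macros are not in scope; bring them in by name.
using KernelAbstractions: @kernel, @index, @Const

@kernel function copy_kernel!(dst, @Const(src))
    i = @index(Global)
    @inbounds dst[i] = src[i]
end
```

Whether `@Const` inside an `@kernel` body should resolve independently of how the package was loaded is precisely what the reported discrepancy is about.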
- **enable printf in errors again** - **turn off 1.10 testing**