James Schloss
Whoops. Didn't mean to close it, and I don't have permission to reopen.
Yes, I was on master. I was using an i9-11900KB, with:

```julia
julia> device()
ZeDevice(GPU, vendor 0x8086, device 0x9a60): Intel(R) UHD Graphics
```

As an interesting note, this error did...
Wait, I might be confused here. So my understanding was that:

1. The split-operator method almost always refers to what's presented in the text as is. That is, a...
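For reference, the form I usually associate with "the split-operator method" is the symmetric (Strang) splitting of the time-evolution operator; this is an assumption about what the text presents, written here with $\hbar = 1$:

$$
\psi(x, t + \Delta t) \approx
  e^{-i V(x)\,\Delta t / 2}\;
  \mathcal{F}^{-1}\!\left[
    e^{-i k^2 \Delta t / (2m)}\;
    \mathcal{F}\!\left[
      e^{-i V(x)\,\Delta t / 2}\, \psi(x, t)
    \right]
  \right]
$$

i.e. a half step in the potential, a full kinetic step in Fourier space, and another half potential step, which makes the scheme second-order accurate in $\Delta t$.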
Just to write down my current understanding of the JLArray issue:

```julia
while d < items
    @synchronize() # legal since cpu=false
```

is not valid for the CPU in KA...
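To make the shape of the problem concrete, here is a minimal sketch of the kind of kernel I have in mind (a workgroup tree reduction with an assumed workgroup size of 256; the name `tree_reduce!` and the arguments are placeholders, not code from the actual issue). `@synchronize()` sits inside a loop whose bound is uniform across the workgroup, which is the pattern that is fine on GPU backends but is what trips up the CPU path:

```julia
using KernelAbstractions

@kernel function tree_reduce!(out, @Const(a))
    gi = @index(Global, Linear)
    li = @index(Local, Linear)
    items = prod(@groupsize())

    tmp = @localmem eltype(out) (256,)  # assumes workgroupsize = 256
    tmp[li] = a[gi]

    d = 1
    while d < items
        @synchronize()                  # every workitem reaches this point
        if (li - 1) % (2d) == 0 && li + d <= items
            tmp[li] += tmp[li + d]
        end
        d *= 2
    end

    if li == 1
        out[@index(Group, Linear)] = tmp[1]
    end
end
```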
So for `transform_cpu(...)`, it should be something like...

```julia
if force_fastmath
    push!(new_stmts, Expr(:macrocall, :@fastmath, arg2, arg3))
end
```

But I don't know what `arg2` and `arg3` are. I also don't...
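For what it's worth, a `:macrocall` expression in Julia's AST has the layout `Expr(:macrocall, macro_name, line_number_node, macro_args...)`, so the second and third arguments would be a `LineNumberNode` and the expression being wrapped. A rough sketch (`body_expr` is just a placeholder for whatever `transform_cpu` is rewriting):

```julia
# :macrocall layout: Expr(:macrocall, macro_name, line_number_node, macro_args...)
body_expr = :(a * b + c)                            # placeholder expression

ex = Expr(:macrocall,
          Symbol("@fastmath"),                      # the macro being applied
          LineNumberNode(@__LINE__, @__FILE__),     # required LineNumberNode slot
          body_expr)                                # the expression to wrap

push!(new_stmts, ex)
```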
tbh, I also have only had bad experiences with fastmath, so I am not sure if we should merge this in the end. I figured I would just get it...
I guess there is no fastmath flag equivalent for the Metal / parallel CPU backends, so it is hard to set this in a generic way for everyone. Maybe we should...
Thinking on this more, I think...

1. All primitives are *still* FableOperators
2. The `generator` should create a specific `@generated` template to follow, so we can still do chaos game...
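As a rough sketch of what I mean by an `@generated` template (names like `apply_op` are placeholders, not Fable's actual API): given a tuple of operators, the generator can emit one statically dispatched branch per operator, which keeps the chaos-game selection GPU-friendly:

```julia
# Hypothetical sketch: `fxs` is a tuple of operators, `idx` picks which one to
# apply this chaos-game iteration, and `p` is the current point.
@generated function apply_op(fxs::Tuple, idx, p)
    n = length(fxs.parameters)
    # One branch per operator; fxs[$i] has a literal index, so every call
    # site is concretely typed and no dynamic dispatch remains.
    branches = [:(idx == $i && return fxs[$i](p)) for i in 1:n]
    quote
        $(branches...)
        return p
    end
end
```

For example, `apply_op((sin, cos), 2, 0.5)` would dispatch statically to `cos(0.5)`.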
Ok, jotting down some more notes here. The generators should be...
Errors:

```julia
julia> @cuda threads = 1 call_fxs!((f, g))
ERROR: InvalidIRError: compiling MethodInstance for call_fxs!(::Tuple{typeof(f), typeof(g)}) resulted in invalid LLVM IR
Reason: unsupported call to an unknown function (call to...
```
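The usual workaround I know for this class of error, assuming `call_fxs!` iterates over the tuple and calls each element, is to unroll the calls so every call site sees a concrete function type, e.g. by recursing on the splatted tuple. A minimal sketch (`f` and `g` here are placeholder stand-ins for the real functions):

```julia
using CUDA

f() = nothing   # placeholder
g() = nothing   # placeholder

# Recursive unrolling: each method sees a concrete function type, so there is
# no dynamic dispatch left for the GPU compiler to reject.
call_each() = nothing
call_each(fx, rest...) = (fx(); call_each(rest...))

function call_fxs!(fxs)
    call_each(fxs...)
    return nothing
end

@cuda threads = 1 call_fxs!((f, g))
```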