dfdx
Consider WebGPU backend for inference in future
It might be awesome to one day have in-browser inference built with dfdx.
It would also be useful to have GPU acceleration that isn't specific to NVIDIA hardware, and wgpu should be able to provide that.
How does webgpu differ from opencl? Do they enable the same sorts of things? Would it be feasible to only implement one or the other and get either of the benefits?
Oh! The main advantage is that WebGPU has some great Rust packages, runs on a wide range of hardware, and could one day run a model's inference in a browser. This might let dfdx have a TensorFlow.js equivalent some day.
On Thu, Mar 30, 2023 at 7:18 AM Corey Lowman wrote:
> How does webgpu differ from opencl? Do they enable the same sorts of things? Would it be feasible to only implement one or the other and get either of the benefits?
It does seem like webgpu supports compute kernels - will need to look into this further. If this supports web stuff and enables AMD GPUs, I'm inclined to do this one over #597.
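For reference, a WebGPU compute kernel is written in WGSL and dispatched over a grid of invocations. A minimal sketch of what an element-wise kernel could look like (purely illustrative - the binding layout and function names are assumptions, not anything from dfdx):

```wgsl
// Hypothetical element-wise ReLU kernel; buffer names and layout are illustrative.
@group(0) @binding(0) var<storage, read> input: array<f32>;
@group(0) @binding(1) var<storage, read_write> output: array<f32>;

@compute @workgroup_size(64)
fn relu(@builtin(global_invocation_id) gid: vec3<u32>) {
    let i = gid.x;
    // Guard against out-of-bounds invocations when the length
    // isn't a multiple of the workgroup size.
    if (i < arrayLength(&input)) {
        output[i] = max(input[i], 0.0);
    }
}
```

On the Rust side this would be compiled and dispatched through wgpu's compute pipeline API, which is what makes the same kernel portable across Vulkan, Metal, DX12, and (eventually) the browser.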
Does anyone know if there are any GEMM/BLAS libraries for webgpu? We probably need to ask the same question for OpenCL.
Perhaps this project can provide valuable insights: https://github.com/webonnx/wonnx
Yes that is super helpful!!
It would be very nice to have it not only for inference but for training as well. There are people like me who don't like the quality of Linux NVIDIA drivers and never buy their cards :)
I'm enjoying working with the code, and since I work on a Mac, wgpu should give Metal support for free, so I would love to help with this. I've also messed around with wgpu before (but nothing serious).