Lux.jl
Meta Issue for proper Enzyme Integration into Lux
- CPU Support: We have tests covering several models.
- [x] PolyAlg selecting the correct broadcast mechanism for `fast_activation!!` fails https://github.com/EnzymeAD/Enzyme.jl/issues/1408 (fixed upstream; needs to be verified here)
- [x] JVP fails for a simple Dense Layer https://github.com/LuxDL/Lux.jl/issues/628 (needs Enzyme `runtimeActivity` enabled)
- [x] Dense works after https://github.com/LuxDL/LuxLib.jl/pull/68
- [ ] Simple Chains integration https://github.com/LuxDL/Lux.jl/issues/644
- [x] Broadcasting related (runtime activity) https://github.com/LuxDL/Lux.jl/issues/647
- [ ] ComponentArrays https://github.com/EnzymeAD/Enzyme.jl/issues/1447
- [x] Partially works -- see tests
- [ ] Recurrent Layers are broken
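For the JVP item above, Lux.jl#628 notes that Enzyme's runtime-activity mode must be enabled. A minimal forward-mode sketch of that setup (the layer sizes, input, and tangent direction here are illustrative, not taken from the issue):

```julia
using Lux, Enzyme, Random

# Global runtime-activity flag, the workaround referenced in Lux.jl#628
Enzyme.API.runtimeActivity!(true)

rng = Random.default_rng()
model = Dense(3 => 2)
ps, st = Lux.setup(rng, model)
x = randn(rng, Float32, 3, 4)

# Scalar function of the input; ps and st are captured as constants
f(x) = sum(abs2, first(model(x, ps, st)))

v = ones(Float32, size(x))  # tangent (direction) for the JVP
# Forward mode with a Duplicated input should return the directional
# derivative of f at x along v
dval = Enzyme.autodiff(Forward, f, Duplicated(x, v))
```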
- CUDA Support
- [x] cuBLAS: https://github.com/EnzymeAD/Enzyme.jl/issues/1392
- [x] cuBLASLt: For the matmul kernels
- [x] https://github.com/EnzymeAD/Enzyme.jl/issues/1442
- [x] https://github.com/LuxDL/LuxLib.jl/issues/148
- AMDGPU Support
- [ ] rocBLAS
- Performance
- [ ] https://github.com/EnzymeAD/Enzyme.jl/issues/1434
Current Status: All native Lux CPU models should work. If a model does not, that is a bug; please open an issue.
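As a quick check of the claim above, a reverse-mode gradient of a small native Lux model can be sketched as follows (the model shape and loss are illustrative only):

```julia
using Lux, Enzyme, Random

rng = Random.default_rng()
model = Chain(Dense(4 => 8, relu), Dense(8 => 2))
ps, st = Lux.setup(rng, model)
x = randn(rng, Float32, 4, 16)

# Scalar loss over the model output
function loss(ps, model, st, x)
    y, _ = Lux.apply(model, x, ps, st)
    return sum(abs2, y)
end

# Shadow structure that Enzyme accumulates the gradient into
dps = Enzyme.make_zero(ps)
Enzyme.autodiff(Reverse, loss, Active, Duplicated(ps, dps),
                Const(model), Const(st), Const(x))
# dps now holds the gradient of the loss w.r.t. the parameters
```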
@avik-pal the `fast_activation!!` issue is closed; let me know if anything else is blocking there.
Yes, I have been playing around with it, so I will open issues upstream as they show up.
After https://github.com/LuxDL/LuxLib.jl/pull/69 and https://github.com/LuxDL/Lux.jl/pull/640 are merged, larger models from Boltz.jl seem to work seamlessly :tada: