xformers
The original code unconditionally disabled support for fused SwiGLU, but PyTorch 2.0.0 has fixed the underlying bf16 autocast issue.
A small PR. This PR addresses the unconditional disabling of fused SwiGLU in swiglu_op.py. The original code disabled fused SwiGLU support outright, with the comment "Let's disable autocast in bf16 until this issue is fixed". Based on my testing, PyTorch 2.0.0 has already fixed this bug. Therefore, we should check the PyTorch version and decide whether fused SwiGLU can be supported, instead of disabling it unconditionally as before. This makes the behavior more user-friendly.
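The version gate described above could be sketched as follows. This is a minimal, hypothetical illustration: the helper names and the exact check are assumptions, not the actual swiglu_op.py code. It parses `torch.__version__`-style strings directly so the sketch stays self-contained.

```python
# Hypothetical version gate for fused SwiGLU; names are illustrative,
# not taken from the real swiglu_op.py.

def parse_version(v: str) -> tuple:
    """Parse a version string like '2.0.0+cu118' into an int tuple."""
    core = v.split("+")[0]  # drop a local build suffix such as '+cu118'
    parts = []
    for piece in core.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

def bf16_autocast_fixed(torch_version: str) -> bool:
    """PyTorch >= 2.0.0 fixed the bf16 autocast bug, so fused SwiGLU
    can remain enabled on those versions."""
    return parse_version(torch_version) >= (2, 0, 0)

print(bf16_autocast_fixed("1.13.1"))        # → False
print(bf16_autocast_fixed("2.0.0+cu118"))   # → True
```

In real code, `torch.__version__` would be passed in (or `packaging.version.parse` used instead of the hand-rolled parser), and the boolean would gate whether the fused kernel path is registered.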