xformers
Install xformers without changing PyTorch version
Hello,
Is there a way to install xformers without changing PyTorch? My torch version is [2.0.0a0+1767026], and another package in my training pipeline depends on it, so other errors occur if torch gets updated.
I tried adding --no-deps, but then xformers doesn't install properly (DualGemmSiluOp not found). I also tried downloading the source code and building locally, but that takes a long time to finish.
I only need to import xformers.ops.swiglu_op and don't expect the entire xformers package to work.
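Since only that one op is needed, a minimal smoke test after installing might look like the following (the module path xformers.ops.swiglu_op is taken from this question; everything else is an assumption):

```python
# Smoke test: check that the single op the pipeline needs can be
# imported, without exercising the rest of xformers.
import importlib


def swiglu_importable() -> bool:
    """Return True if xformers.ops.swiglu_op imports cleanly."""
    try:
        importlib.import_module("xformers.ops.swiglu_op")
        return True
    except ImportError:
        # xformers is missing, or it was built without this op
        return False


if __name__ == "__main__":
    print("swiglu op importable:", swiglu_importable())
```

Running this right after the install makes the DualGemmSiluOp-style failures show up immediately instead of mid-training.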
Thanks much! Allen
Most of the build time comes from flash attention, and you don't need it. Perhaps you could try building from source with the environment variable XFORMERS_DISABLE_FLASH_ATTN set to "1".
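A rough sketch of that build recipe (the env var is from the answer above; the --no-deps / --no-build-isolation flags are assumptions to keep pip from touching the existing torch install — check the repo README for canonical instructions):

```shell
# Clone with submodules, then build in place.
git clone --recursive https://github.com/facebookresearch/xformers.git
cd xformers

# XFORMERS_DISABLE_FLASH_ATTN=1 skips the flash-attention kernels,
# which dominate the build time.
# --no-deps stops pip from upgrading torch;
# --no-build-isolation lets the build see the already-installed torch.
XFORMERS_DISABLE_FLASH_ATTN=1 pip install -v --no-deps --no-build-isolation .
```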
Also xFormers now requires PyTorch 2.1+, so you won't be able to run the latest version: https://github.com/facebookresearch/xformers/blob/main/requirements.txt