wizward123
In https://github.com/facebookresearch/xformers/blob/main/examples/llama_inference/requirements.txt, the line `torch>=2.2.0` looks like a mistake; it should presumably require FlashAttention >= 2.2.0 instead.
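A sketch of the proposed change, assuming the PyPI package name `flash-attn` is the intended dependency (the exact package name is an assumption, not stated in the original report):

```diff
# examples/llama_inference/requirements.txt
-torch>=2.2.0
+flash-attn>=2.2.0
```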