AI-Scientist
Which versions of PyTorch and CUDA are being used?
The console reports a warning: "UserWarning: 1Torch was not compiled with flash attention." Maybe it is caused by a mismatch between the PyTorch and CUDA versions. My env config is torch 2.4.0 + CUDA 12.4.
Confirmed working using PyTorch 2.0.1 and CUDA 12.2. I would look at the flash-attention repos for advice on this issue!
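As a quick diagnostic, you can print the installed PyTorch and CUDA versions and check whether the flash-attention SDP kernel is enabled. This is a minimal sketch using the PyTorch 2.x `torch.backends.cuda` API; if `flash_sdp_enabled()` returns `False` (or the warning persists), the wheel you installed was likely built without flash attention for your CUDA version:

```python
import torch

# Report the versions actually loaded at runtime, which may differ
# from what you think you installed (e.g. a CPU-only wheel).
print("torch  :", torch.__version__)
print("cuda   :", torch.version.cuda)  # None on CPU-only builds

# PyTorch >= 2.0 exposes a toggle for the flash-attention kernel used
# by scaled_dot_product_attention; check whether it is enabled.
print("flash SDP enabled:", torch.backends.cuda.flash_sdp_enabled())
```

If the versions printed here do not match the build you expected, reinstalling a wheel that matches your CUDA toolkit usually clears the warning.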
Got it, thanks!