Haoquan Zhou
> What about the warning:
>
> > You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU...
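That warning is emitted when the model weights are still on CPU at load time while `attn_implementation="flash_attention_2"` is requested. A minimal sketch of one way to avoid it, assuming a `transformers` causal-LM workflow (the model id below is a placeholder):

```python
import torch
from transformers import AutoModelForCausalLM

# Load with Flash Attention 2; FA2 requires half-precision weights (fp16/bf16).
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",              # placeholder model id
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
)

# Moving the model to GPU before inference addresses the warning.
model = model.to("cuda")

# Alternatively, place the weights on the GPU at load time:
# model = AutoModelForCausalLM.from_pretrained(
#     "meta-llama/Llama-2-7b-hf",
#     torch_dtype=torch.bfloat16,
#     attn_implementation="flash_attention_2",
#     device_map="cuda",
# )
```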