avatarify-python
CUDA error: CUBLAS_STATUS_ALLOC_FAILED after #350 (GTX 1650)
Describe the bug
run_windows.bat crashes right after calibrating the face pose (X key).
I checked out the commit right before #350 and it still works fine.
To Reproduce
Run run_windows.bat on the latest master with a GTX 1650 SUPER.
Info (please complete the following information):
- OS (e.g., Linux): Windows
- GPU model: GTX 1650 SUPER
- Any other relevant information:
Traceback (most recent call last):
File "afy/cam_fomm.py", line 316, in <module>
out = predictor.predict(frame)
File "C:\Users\chrys\source\avatarify\afy\predictor_local.py", line 104, in predict
use_relative_jacobian=self.relative, adapt_movement_scale=self.adapt_movement_scale)
File "C:\Users\chrys\source\avatarify\afy\predictor_local.py", line 28, in normalize_kp
jacobian_diff = torch.matmul(kp_driving['jacobian'], torch.inverse(kp_driving_initial['jacobian']))
RuntimeError: CUDA error: CUBLAS_STATUS_ALLOC_FAILED when calling `cublasCreate(handle)`
I found people reporting the same error code over at TensorFlow (https://github.com/tensorflow/tensorflow/issues/7072), but I think alievk is the one who knows this area.
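For anyone who wants to poke at this, here is a minimal workaround sketch, assuming the failure really is cuBLAS failing to allocate its handle on a nearly full card and that the jacobians are the small per-keypoint 2x2 matrices from the traceback. The function name is hypothetical and this is untested on a 1650:

```python
import torch

def relative_jacobian_cpu_fallback(kp_driving, kp_driving_initial):
    """Hypothetical workaround: do the tiny per-keypoint 2x2 matrix math on
    the CPU so this step never needs a cuBLAS handle, then move the result
    back to whatever device the keypoints live on."""
    jac = kp_driving['jacobian']
    jac_init = kp_driving_initial['jacobian']
    # Shapes are on the order of (1, num_kp, 2, 2), so the CPU round trip is cheap.
    jacobian_diff = torch.matmul(jac.cpu(), torch.inverse(jac_init.cpu()))
    return jacobian_diff.to(jac.device)
```

If someone swaps this into normalize_kp in afy/predictor_local.py (line 28 in the traceback above), it would at least tell us whether the crash is memory related.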
Have you tried after rebooting? Seems to work for some people.
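Rebooting usually helps because it frees GPU memory held by other processes; CUBLAS_STATUS_ALLOC_FAILED from cublasCreate() is typically an out-of-memory symptom. If you want to check that before digging further, a quick sketch (assuming PyTorch 1.10+ for mem_get_info) is:

```python
import torch

# Print how much GPU memory is actually free before launching Avatarify;
# cuBLAS needs a chunk of it just to create its handle.
free_bytes, total_bytes = torch.cuda.mem_get_info()
print(f"GPU memory free: {free_bytes / 1e9:.2f} GB of {total_bytes / 1e9:.2f} GB")
```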
@JohanAR Yeah, I tried a couple of things; none of them worked.
Is this solved yet?
Unfortunately it is difficult for us to do anything about it, since neither of us has a 1650 and we can't reproduce it. We might have to hope for someone to contribute a solution; the project is open source, after all :)
Just as a data point, I am getting the same error on a Quadro T2000, which I think is close to the 1600 series.
To the people who are unable to run the python version of Avatarify because of this, please try the Windows app and see if that works better for you.
Is there a solution to this? I have the same problem with a Quadro T1000 and a Quadro T2000. I have already reinstalled 3 times and tried many other things, but it crashes every time I press "x".
Switching to the Windows app fixed it for me. Performance is good, and we have had a lot of fun with it already! Many thanks.