Patrick von Platen
Thanks a lot for the hint @captainnurple! I corrected the Google Colab; it should work now 🙂
Hey @sully90, the notebook works in Colab for me, but I haven't tested it on GCP. From the error message, it looks like there is a problem with `fp16`...
Could you guys put together a Colab that lets me reproduce the error? :-) That would be great!
I can reproduce now! Thanks for telling me!
Not 100% sure what the error is for now; I'll take a look tomorrow!
Should be fixed now: https://colab.research.google.com/github/patrickvonplaten/notebooks/blob/master/Fine_tuning_Wav2Vec2_for_English_ASR.ipynb Can you try it out? :-)
The problem was that the Transformers version being used was too old. I didn't dive super deep into it, though. Updating your Transformers version should do the trick, @jovan3600...
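In case it helps, here is a minimal sketch for checking the installed Transformers version before re-running the notebook. The minimum version below is just an assumption based on this thread, not a confirmed cutoff:

```python
import transformers
from packaging import version

# Show which Transformers version is currently installed in the runtime.
print(f"Installed Transformers version: {transformers.__version__}")

# Assumed minimum version for this notebook (hypothetical cutoff, not verified).
MIN_VERSION = "4.17.0"

if version.parse(transformers.__version__) < version.parse(MIN_VERSION):
    # Upgrade from a notebook cell with e.g. `!pip install --upgrade transformers`
    # and then restart the runtime so the new version is picked up.
    print(f"Please upgrade to Transformers >= {MIN_VERSION} and restart the runtime.")
```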
Hmmm, not really sure what the problem could be. In newer Transformers versions (> 4.17), whenever the runtime is set to GPU, which you can check with `torch.cuda.is_available()`, then...
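For reference, a minimal sketch of only enabling `fp16` when a GPU is actually available; the output directory and the other training arguments here are just placeholder values, not the notebook's settings:

```python
import torch
from transformers import TrainingArguments

# fp16 mixed-precision training only works on GPU, so gate it on
# whether CUDA is actually available in the runtime.
use_gpu = torch.cuda.is_available()

training_args = TrainingArguments(
    output_dir="./wav2vec2-finetuned",  # placeholder output directory
    fp16=use_gpu,                       # enable fp16 only when a GPU is present
    per_device_train_batch_size=8,      # placeholder value
    num_train_epochs=3,                 # placeholder value
)

print(f"GPU available: {use_gpu}, fp16 enabled: {training_args.fp16}")
```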
I think it has been solved in [kensho-technologies/pyctcdecode#41](https://github.com/kensho-technologies/pyctcdecode/issues/41), no? :-)
Hey @xzwworkplace, could you provide a Google Colab that would allow me to reproduce the error? :-)