Results: 5 comments of debasishaimonk
Which attention did you use, @Jim-Song? Is it Graves attention or Bahdanau attention?
Has anyone continued working on this?
> Hi, I ported the FAN PyTorch model to ONNX for faster inference. I also use numba to accelerate the post-processing step.

Was this on the large model?
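The quoted comment describes speeding up post-processing with numba. As a minimal sketch of that idea (the function `smooth_spectrogram` and its moving-average logic are hypothetical stand-ins, not the actual post-processing from the thread), numba's `@njit` can JIT-compile a tight Python loop over a mel spectrogram; the `try/except` fallback lets the same code run as plain Python when numba is not installed:

```python
import numpy as np

try:
    from numba import njit  # JIT-compile the loop if numba is available
except ImportError:
    def njit(func):  # fallback: run as ordinary Python
        return func

@njit
def smooth_spectrogram(frames):
    # Hypothetical post-processing step: moving-average smoothing
    # over time frames of a (time, mel_bins) spectrogram.
    out = np.empty_like(frames)
    for t in range(frames.shape[0]):
        lo = max(0, t - 1)
        hi = min(frames.shape[0], t + 2)
        for f in range(frames.shape[1]):
            acc = 0.0
            for u in range(lo, hi):
                acc += frames[u, f]
            out[t, f] = acc / (hi - lo)
    return out

mel = np.ones((100, 80), dtype=np.float32)
smoothed = smooth_spectrogram(mel)
```

Explicit loops like this are slow in pure Python but are exactly the shape of code numba compiles well, which is presumably why the commenter reached for it.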
@lifeiteng this was very helpful. Thanks!
@leminhnguyen @EuphoriaCelestial What is the relevance of using only alphabet characters as tokens (i.e., not phonemes such as AH or IY, but tokens built from the alphabet letters)?