pytorch-baidu-ctc

documentation

Open phtephanx opened this issue 5 years ago • 4 comments

Thanks for the bindings!

I simply wanted to point out that you could add to the documentation:

  • `y`, `xs` and `ys` need to be on the CPU
  • `x` should contain the raw (pre-softmax) activations, i.e. no log-softmax applied
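To illustrate what "no log-softmax applied" means, here is a minimal pure-Python sketch (the function and values are hypothetical, not from this repo): `torch.nn.CTCLoss` expects the output of `log_softmax`, whereas this binding takes the raw activations and normalizes them internally.

```python
import math

def log_softmax(logits):
    """Numerically stable log-softmax over a list of raw scores."""
    m = max(logits)
    lse = m + math.log(sum(math.exp(v - m) for v in logits))
    return [v - lse for v in logits]

# Raw per-frame activations over a 3-symbol alphabet (blank + 2 labels).
acts = [2.0, 1.0, 0.5]

# torch.nn.CTCLoss would consume these normalized log-probabilities ...
log_probs = log_softmax(acts)

# ... while the Baidu warp-ctc binding takes `acts` directly and
# normalizes internally, so do NOT apply log_softmax beforehand.
assert abs(sum(math.exp(v) for v in log_probs) - 1.0) < 1e-9
```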

phtephanx avatar Mar 12 '19 20:03 phtephanx

Can I use this with the same input as torch.nn.CTCLoss?

WenmuZhou avatar Jul 26 '19 11:07 WenmuZhou

Hi @WenmuZhou,

Not exactly. You need to make sure that you place the tensors on the appropriate devices.

Take a look at this piece of code, where I use both implementations of the CTC loss: https://github.com/jpuigcerver/PyLaia/blob/41d2cc41d742e7ab336393fde8f56585ff49ee52/laia/losses/ctc_loss.py#L350
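As a rough sketch of the difference (a minimal example assuming a recent PyTorch; the shapes and names are illustrative, not taken from PyLaia): `torch.nn.CTCLoss` is called on log-probabilities, whereas the warp-ctc binding would instead take the raw `x`, with `y`, `xs` and `ys` kept on the CPU.

```python
import torch

T, N, C = 50, 4, 20          # time steps, batch size, alphabet size
x = torch.randn(T, N, C)     # raw activations (what warp-ctc would take)
xs = torch.full((N,), T, dtype=torch.long)           # input lengths
y = torch.randint(1, C, (N, 10), dtype=torch.long)   # targets (0 = blank)
ys = torch.full((N,), 10, dtype=torch.long)          # target lengths

# torch.nn.CTCLoss expects log-probabilities, so normalize first:
loss = torch.nn.CTCLoss(blank=0)(x.log_softmax(2), y, xs, ys)
assert loss.item() > 0
```

Note that for the Baidu binding you would skip the `log_softmax(2)` call and pass `x` as-is.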

jpuigcerver avatar Jul 27 '19 21:07 jpuigcerver

Thanks @jpuigcerver

WenmuZhou avatar Jul 29 '19 02:07 WenmuZhou

Hi @jpuigcerver, one question: does the input `pred` need to be the log_softmax output?

WeihongM avatar Aug 22 '19 02:08 WeihongM