recurrent-memory-transformer-pytorch

Question: how to adapt this for CTC loss

Open pfeatherstone opened this issue 1 year ago • 2 comments

@lucidrains Do you have any advice on how to adapt RecurrentMemoryTransformerWrapper such that it works with CTC?
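For context, here is how CTC loss is normally consumed in PyTorch: `torch.nn.functional.ctc_loss` takes log-probabilities over the whole input sequence together with unsegmented targets, which is exactly what makes per-block loss evaluation awkward. The shapes and sizes below are purely illustrative.

```python
import torch
import torch.nn.functional as F

T, N, C, S = 512, 4, 32, 50                    # time steps, batch, classes, max target length
log_probs = torch.randn(T, N, C).log_softmax(-1)   # (time, batch, classes)
targets = torch.randint(1, C, (N, S))              # class 0 reserved as the blank
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.randint(10, S, (N,), dtype=torch.long)

# one loss over the entire sequence; the targets carry no frame alignment
loss = F.ctc_loss(log_probs, targets, input_lengths, target_lengths, blank=0)
```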

pfeatherstone avatar Aug 08 '23 08:08 pfeatherstone

In the memory replay backpropagation algorithm, the labels are partitioned in the same way as the logits, and the loss is evaluated per block. For CTC that doesn't make sense, since the labels are not aligned to the input frames and so cannot be split along with them. So does memory replay in its current form even apply to CTC? Any help is gratefully received.
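One possible workaround, sketched below rather than taken from this repo: run the model segment by segment, carry the memories forward, concatenate the per-segment logits, and apply a single CTC loss over the whole sequence. This gives up memory replay's per-block backward, but gradient checkpointing keeps the activation cost to roughly one segment at a time. The `model(segment, memories) -> (logits, memories)` signature is an assumption here, not the wrapper's actual API.

```python
import torch
import torch.nn.functional as F
from torch.utils.checkpoint import checkpoint

def ctc_loss_over_segments(model, x, targets, input_lengths, target_lengths,
                           seg_len, blank=0):
    """Run the transformer over fixed-size segments, carrying memories
    forward, then apply one CTC loss over the concatenated logits.
    Assumes model(segment, memories) returns (logits, memories)."""
    memories = None
    logits_per_segment = []
    for segment in x.split(seg_len, dim=1):
        # checkpointing recomputes each segment's activations during the
        # backward pass, so only the per-segment logits stay resident
        logits, memories = checkpoint(model, segment, memories,
                                      use_reentrant=False)
        logits_per_segment.append(logits)

    logits = torch.cat(logits_per_segment, dim=1)               # (batch, time, classes)
    log_probs = F.log_softmax(logits, dim=-1).permute(1, 0, 2)  # (time, batch, classes)
    return F.ctc_loss(log_probs, targets, input_lengths, target_lengths,
                      blank=blank)
```

The trade-off is that the truncated, per-block backward of memory replay is replaced by a full backward through all checkpointed segments, so compute roughly doubles and memory grows with the stored logits rather than staying constant.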

pfeatherstone avatar Aug 08 '23 13:08 pfeatherstone

@lucidrains Or, setting CTC aside, can you think of a way to make this work with unaligned targets?

pfeatherstone avatar Aug 22 '23 08:08 pfeatherstone