Yiming Wang
> Maybe something from the 1st minibatch is somehow being used for the second minibatch.

Looks like the case:

```
tot_scores torch.Size([128])
tot_scores torch.Size([128])
forward_scores torch.Size([14336])
backward_scores torch.Size([14336])
arc_scores torch.Size([21120])
out_grad...
```
> Also, I think you can remove `den_graph` to see if it crashes or not; if it does not crash, it may be something wrong when we do forward/backward on...
`forward_scores`'s size is changing because it is obtained from the graph `den_graph_unrolled`, which is the intersection of `den_graph` (not changing) and `dense_fsa_vec` (changing, as it comes from the network's...
My confusion is: since `den_graph` is not changing, why does it have to be created in every forward pass rather than being built once in the constructor? (See the sketch below.)
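A minimal sketch of what I have in mind, assuming the k2 Python API; the class name `LFMMILoss` and the variable names are mine, not the actual code:

```python
import torch
import k2


class LFMMILoss(torch.nn.Module):  # hypothetical name
    def __init__(self, den_graph: k2.Fsa):
        super().__init__()
        # den_graph is static, so ideally it is built/stored once here
        # instead of being recreated in every forward pass.
        self.den_graph = den_graph

    def forward(self, nnet_output: torch.Tensor,
                supervision_segments: torch.Tensor) -> torch.Tensor:
        # dense_fsa_vec changes every batch: it wraps the network's
        # log-probs, so its size follows this batch's frame counts.
        dense_fsa_vec = k2.DenseFsaVec(nnet_output, supervision_segments)
        # The intersection ("den_graph_unrolled") inherits the varying
        # size from dense_fsa_vec, which is why forward_scores's size
        # differs from batch to batch.
        den_lats = k2.intersect_dense(self.den_graph, dense_fsa_vec,
                                      output_beam=10.0)
        tot_scores = den_lats.get_tot_scores(log_semiring=True,
                                             use_double_scores=True)
        return -tot_scores.sum()
```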
I don't understand what the underlying purpose is, but apparently at this line: https://github.com/k2-fsa/k2/blob/c6d658ea71676e820e5fd883ff57ec5963acef19/k2/python/csrc/torch/fsa.cu#L198 if `log_semiring` is True, `entering_arcs_tensor` in the returned pair is not initialized (or is just all 0's)?
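If I understand it right, that would actually make sense: a "best entering arc" is only well-defined in the tropical (max) semiring, where each state has a unique best predecessor; in the log semiring the forward score is a logsumexp over all incoming paths, so there is no single arc to record. A hedged sketch of the distinction, on a toy FSA:

```python
import k2

# Toy FSA: two competing arcs into state 1, then a final arc.
fsa = k2.Fsa.from_str('''
0 1 1 0.5
0 1 2 1.5
1 2 -1 0.0
2
''')

# Tropical semiring: a state's forward score is the max over incoming
# paths, so the argmax "entering arc" exists and can be traced back;
# this is what k2.shortest_path relies on.
best_path = k2.shortest_path(fsa, use_double_scores=True)

# Log semiring: a state's forward score is a logsumexp over ALL incoming
# paths; no single predecessor wins, so presumably the C++ side just
# leaves entering_arcs_tensor unfilled in this case.
tot_scores = fsa.get_tot_scores(use_double_scores=True, log_semiring=True)
```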
Ok trying
> Do the num and/or den graphs have epsilons at this point? Can you describe the epsilons they have, if so?
OK. In terms of the composition/intersection side of (num * den): since the labels are "transition ids" to be matched with the nnet output, the epsilon arcs were added when creating...
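For context, a hedged sketch of the two standard ways epsilons enter this kind of composition in k2; the toy graphs and names here are placeholders, not the truncated code above:

```python
import k2

# Toy placeholder graphs (labels are fake "transition ids"; 0 is a real
# symbol here, which is exactly where it clashes with the epsilon convention).
num_graph = k2.Fsa.from_str('''
0 1 0 0.0
1 2 -1 0.0
2
''')
den_graph = k2.Fsa.from_str('''
0 0 0 0.0
0 1 -1 0.0
1
''')

# Either let intersect treat label 0 as epsilon (the OpenFst convention)...
composed = k2.intersect(num_graph, k2.arc_sort(den_graph),
                        treat_epsilons_specially=True)

# ...or add epsilon self-loops to one side and match labels literally.
num_loops = k2.add_epsilon_self_loops(num_graph)
composed2 = k2.intersect(k2.arc_sort(num_loops), k2.arc_sort(den_graph),
                         treat_epsilons_specially=False)
```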
This is what I am going to do (a sketch follows below): before applying FSA operations that may affect epsilons or be affected by epsilons (e.g. `remove_epsilon`, `intersect`), the `fsa.labels` tensor is temporarily incremented...
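A rough sketch of that label-shifting idea, assuming the FSAs have no true epsilon arcs left at this point (so the shift only protects the real label 0); the helper `shift_labels` is mine:

```python
import k2


def shift_labels(fsa: k2.Fsa, offset: int) -> k2.Fsa:
    # Hypothetical helper: shift all non-negative labels by `offset`,
    # keeping the final-arc label (-1) as-is.
    labels = fsa.labels.clone()
    labels[labels >= 0] += offset
    fsa.labels = labels
    return fsa


# Toy inputs whose label 0 is a real transition id, not an epsilon.
a = k2.Fsa.from_str('''
0 1 0 0.0
1 2 -1 0.0
2
''')
b = k2.Fsa.from_str('''
0 1 0 0.0
1 2 -1 0.0
2
''')

# Shift up so transition id 0 is no longer label 0 (= epsilon to k2),
# run the epsilon-sensitive operation, then shift back down so the labels
# again index the nnet output columns.
a, b = shift_labels(a, +1), shift_labels(b, +1)
composed = k2.intersect(k2.arc_sort(a), k2.arc_sort(b))
composed = shift_labels(composed, -1)  # no label is 0 here, so this is safe
```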
@danpovey I got a preliminary result, which is ~15% EER. It is still high, but at least it seems to be starting to work. Will continue trying to improve. In the meantime...