
Doesn't this loss function have the issue that the beginning time steps will get a much larger gradient than the final ones?

Open · RuABraun opened this issue 3 years ago · 1 comment

I want to confirm that the issue I'm seeing is fundamental to the loss itself and not an artifact of my implementation (which is a slight modification of this).

It seems to me that, because the final loss aggregates over all alignment paths, changing the (0, 0) entry of the cost matrix causes a much larger change in the loss than changing a later entry, since (0, 0) influences every other entry of the accumulated-cost matrix. Some simple test cases seem to confirm this. Can someone else confirm?

RuABraun · Mar 06 '21, 18:03
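One minimal way to check this claim numerically (a sketch, not code from this repo): compute soft-DTW directly from a pairwise-cost matrix with PyTorch autograd and compare the gradient the loss sends to early versus late entries. The recursion below follows the standard soft-DTW definition; `gamma`, the matrix size, and the inspected indices are arbitrary choices.

```python
import torch

def soft_dtw(D: torch.Tensor, gamma: float = 1.0) -> torch.Tensor:
    """Soft-DTW value for a single (n, m) pairwise-cost matrix D."""
    n, m = D.shape
    inf = torch.tensor(float("inf"))
    # R[i][j] is the soft-min accumulated cost of aligning the first i
    # steps of one sequence with the first j steps of the other.
    R = [[inf for _ in range(m + 1)] for _ in range(n + 1)]
    R[0][0] = torch.tensor(0.0)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            prev = torch.stack([R[i - 1][j], R[i][j - 1], R[i - 1][j - 1]])
            # Smoothed minimum: -gamma * log(sum(exp(-prev / gamma)))
            R[i][j] = D[i - 1, j - 1] - gamma * torch.logsumexp(-prev / gamma, dim=0)
    return R[n][m]

torch.manual_seed(0)
D = torch.rand(8, 8, requires_grad=True)
soft_dtw(D).backward()
# d(loss)/d(D[i, j]) is the soft expected-alignment weight of cell (i, j).
print("grad at (0, 0):        ", D.grad[0, 0].item())
print("grad at (n-1, m-1):    ", D.grad[-1, -1].item())
print("grad at early interior:", D.grad[1, 2].item())
print("grad at late interior: ", D.grad[6, 5].item())
```

For reference, in this formulation the gradient with respect to `D[i, j]` is the soft expected-alignment weight of cell `(i, j)`. The two corner cells lie on every alignment path, so they should both receive a gradient of exactly 1, while interior weights vary, which is worth keeping in mind when interpreting the test.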

> I want to confirm that the issue I'm seeing is fundamental to the loss itself and not an artifact of my implementation (which is a slight modification of this).
>
> It seems to me that, because the final loss aggregates over all alignment paths, changing the (0, 0) entry of the cost matrix causes a much larger change in the loss than changing a later entry, since (0, 0) influences every other entry of the accumulated-cost matrix. Some simple test cases seem to confirm this. Can someone else confirm?

Have you found a robust and correct soft-DTW implementation? I have tried this one, but the GPU runs out of memory and I can only use a batch size of 1.

v-nhandt21 · Sep 23 '21, 06:09
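On the out-of-memory question, one common workaround (a hypothetical sketch, not from this repo) is gradient accumulation: keep the per-step batch size at 1 but only call `optimizer.step()` every `accum_steps` iterations, which recovers a larger effective batch size at the cost of wall-clock time. The linear model, sequence shapes, and hyperparameters below are placeholders; `soft_dtw` is the function from the sketch above.

```python
import torch

# Hypothetical stand-ins, just to make the pattern concrete.
model = torch.nn.Linear(4, 4)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def pairwise_sq_dist(a, b):
    # (n, d) x (m, d) -> (n, m) squared-Euclidean cost matrix
    return ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)

accum_steps = 16  # effective batch size, with per-step batch size 1
optimizer.zero_grad()
for step in range(64):
    x = torch.randn(10, 4)        # one (T, d) sequence per step
    target = torch.randn(10, 4)
    D = pairwise_sq_dist(model(x), target)
    loss = soft_dtw(D) / accum_steps   # soft_dtw from the sketch above
    loss.backward()                    # gradients sum across accum_steps steps
    if (step + 1) % accum_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```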