Transformer-Hawkes-Process

Error in Utils.time_loss

Open saurabhdash opened this issue 4 years ago • 2 comments

There is an error in the time_loss computation:

Computing event_time[:, 1:] - event_time[:, :-1] introduces a spurious term at the boundary between the last real event and the padding.

Example: if a sequence t has 3 events but is zero-padded to length 5, t = [1, 3, 4, 0, 0], the inter-event gaps delta should be [2, 1, 0, 0],

but event_time[:, 1:] - event_time[:, :-1] yields [2, 1, -4, 0]: the -4 comes from subtracting the last real timestamp (4) from the first padding zero.

The solution is to pass time_gap to this function instead of differencing event_time. Alternatively, the difference can be multiplied by non_pad_mask to zero out the artifact term, as sketched below.
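A minimal sketch of the masking fix, assuming non_pad_mask is 1 for real events and 0 for padding (the tensors here are illustrative, not the repo's variables):

import torch

event_time = torch.tensor([[1., 3., 4., 0., 0.]])  # 3 events, zero-padded to length 5
non_pad_mask = (event_time > 0).float()            # [[1., 1., 1., 0., 0.]]

diff = event_time[:, 1:] - event_time[:, :-1]      # [[ 2.,  1., -4.,  0.]]
delta = diff * non_pad_mask[:, 1:]                 # [[2., 1., 0., 0.]] -- artifact removed

Multiplying by non_pad_mask[:, 1:] zeroes exactly the positions whose target event is padding, which is where the artifact term appears.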

— saurabhdash, Mar 22 '21 20:03

Will you open a pull request?

— unnir, May 31 '21 14:05

I agree with you. These calculations affect training as well, since the RMSE time-prediction loss is one of the terms minimized in the overall objective. This raises serious doubts about the code and the reproducibility of the paper.

A simple fix could be:

true = event_time[:, 1:] - event_time[:, :-1]
# Real gaps are non-negative (timestamps are sorted), so any negative
# entry is the padding artifact; zero it out on whatever device `true` is on.
true = torch.where(true >= 0., true, torch.zeros_like(true))
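For reference, torch.clamp gives the same result as a device-agnostic one-liner, under the same assumption that timestamps within a sequence are non-decreasing, so only the padding artifact can be negative (a sketch, not the repo's code):

import torch

event_time = torch.tensor([[1., 3., 4., 0., 0.]])
true = event_time[:, 1:] - event_time[:, :-1]  # [[ 2.,  1., -4.,  0.]]
true = torch.clamp(true, min=0.0)              # [[2., 1., 0., 0.]]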

— ritvik06, Jul 20 '21 10:07