anomaly_transformer_pytorch
Incorrect prior_association()?
It seems the method prior_association() cannot backpropagate gradients to train Ws, or maybe I have misunderstood? https://github.com/spencerbraun/anomaly_transformer_pytorch/blob/6d15200911260eee910a3664d70f07886c47708b/model.py#L41-L45
According to the paper, is this the right way?
gaussian = 1 / math.sqrt(2 * math.pi) / self.sigma * torch.exp(- 0.5 * (p / self.sigma).pow(2))
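For reference, a minimal sketch of how the prior association could be computed with that density so the result stays differentiable with respect to sigma (the function name, tensor shapes, and the row-normalisation step are my assumptions, not the repo's exact code):

```python
import math
import torch

def prior_association(p: torch.Tensor, sigma: torch.Tensor) -> torch.Tensor:
    """Hypothetical Gaussian prior association over index distances.

    p     : (N, N) tensor of absolute position distances |i - j|
    sigma : (N, 1) learnable scale, one value per query position
    """
    # Evaluate the Gaussian density analytically instead of sampling,
    # so gradients flow back into sigma.
    gaussian = 1.0 / (math.sqrt(2 * math.pi) * sigma) * torch.exp(-0.5 * (p / sigma).pow(2))
    # Row-normalise so each row forms a discrete distribution, as in the paper.
    return gaussian / gaussian.sum(dim=-1, keepdim=True)
```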
Yes, I found the same problem: gaussian = 1 / math.sqrt(2 * math.pi) / self.sigma * torch.exp(-0.5 * (p / self.sigma).pow(2)). And I emailed the author about it.
Does self.sigma need to be constrained to be positive? How should it be constrained?
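One common way (not necessarily what the author intends) is to store an unconstrained raw parameter and map it through softplus, or simply take its absolute value, before using it in the density, e.g. a minimal sketch with illustrative names and shapes:

```python
import torch
import torch.nn.functional as F

N = 100  # window length, for illustration only

# Hypothetical setup: keep an unconstrained raw parameter and map it to a
# strictly positive sigma with softplus; a small eps avoids division by zero.
raw_sigma = torch.nn.Parameter(torch.zeros(N, 1))

def positive_sigma(eps: float = 1e-5) -> torch.Tensor:
    return F.softplus(raw_sigma) + eps
```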
I also encountered the same problem, did the author reply to you?
Yes, the author agrees with it.