
Error during training

Open sushantakpani opened this issue 4 years ago • 4 comments

I got this error while running python coref.py

loss = torch.sum(torch.log(torch.sum(torch.mul(probs, gold_indexes), dim=1).clamp_(eps, 1-eps), dim=0) * -1)
TypeError: log() got an unexpected keyword argument 'dim'

sushantakpani · Mar 16 '20 22:03

I removed the dim parameter from log():

#loss = torch.sum(torch.log(torch.sum(torch.mul(probs, gold_indexes), dim=1).clamp_(eps, 1-eps), dim=0) * -1)
loss = torch.sum(torch.log(torch.sum(torch.mul(probs, gold_indexes)).clamp_(eps, 1-eps)) * -1)

Is this the correct approach?
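For what it's worth, dropping dim=1 also changes what is being summed: the inner sum then collapses over every mention at once, so the loss becomes the log of a single scalar instead of a sum of per-mention log-probabilities. A quick sketch of the shape difference, assuming probs and gold_indexes are [num_mentions, num_candidates] tensors (these shapes and values are made up for illustration):

import torch

# assumed shapes: [num_mentions, num_candidates]
probs = torch.softmax(torch.randn(3, 4), dim=1)
gold_indexes = torch.zeros(3, 4)
gold_indexes[:, 0] = 1  # pretend the first candidate is gold for every mention

# original intent: one gold-probability per mention
per_mention = torch.sum(torch.mul(probs, gold_indexes), dim=1)
print(per_mention.shape)  # torch.Size([3])

# with dim removed, everything collapses to a single scalar before the log
collapsed = torch.sum(torch.mul(probs, gold_indexes))
print(collapsed.shape)    # torch.Size([])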

sushantakpani · Mar 16 '20 22:03

There is no dim parameter in torch.log() (https://pytorch.org/docs/stable/torch.html#torch.log)

torch.log(input, out=None) → Tensor
Returns a new tensor with the natural logarithm of the elements of input: y_i = log_e(x_i)

Parameters:
input (Tensor) – the input tensor.
out (Tensor, optional) – the output tensor.
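A minimal check of where dim is and is not accepted (the tensor values are arbitrary):

import torch

x = torch.tensor([[0.2, 0.8], [0.5, 0.5]])

print(torch.log(x))          # fine: elementwise natural log, no reduction
print(torch.sum(x, dim=1))   # fine: dim is an argument of torch.sum
# torch.log(x, dim=0)        # raises a TypeError, as in the traceback above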

sushantakpani · Mar 16 '20 22:03

I got this error while running python coref.py

loss = torch.sum(torch.log(torch.sum(torch.mul(probs, gold_indexes), dim=1).clamp_(eps, 1-eps), dim=0) * -1)
TypeError: log() got an unexpected keyword argument 'dim'

I'm running into the same error. Were you able to solve it?

troublemaker-r · May 06 '20 06:05

It seems this is the correct way:

loss = torch.sum(torch.log(torch.sum(torch.mul(probs, gold_indexes), dim=1).clamp_(eps, 1-eps)), dim=0) * -1
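A quick sanity check of that line with dummy tensors (the shapes and eps value here are assumptions, not what coref.py actually uses):

import torch

eps = 1e-7
probs = torch.softmax(torch.randn(5, 6), dim=1)
gold_indexes = torch.zeros(5, 6)
gold_indexes[torch.arange(5), torch.randint(0, 6, (5,))] = 1  # one fake gold antecedent per mention

# sum gold probabilities per mention, clamp, take the log, then sum over mentions
loss = torch.sum(torch.log(torch.sum(torch.mul(probs, gold_indexes), dim=1).clamp_(eps, 1 - eps)), dim=0) * -1
print(loss)  # a scalar tensor; no TypeError, since dim=0 now belongs to the outer torch.sum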

sushantakpani · Jun 22 '20 18:06