
hi

Open zmxsss opened this issue 3 years ago • 4 comments

If my loss is one-dimensional, do I need to make it two-dimensional, e.g. with `loss = loss.view(-1, batch_size)`?

zmxsss avatar Oct 28 '21 14:10 zmxsss

My `loss.shape` is `[1280]`.

zmxsss avatar Oct 28 '21 14:10 zmxsss

Our implementation of loss dropping requires a batch dimension, so reshaping with `loss.view(-1, batch_size)` as you describe should work.

It should be a simple code change to deal with one-dimensional losses though!

ddkang avatar Oct 28 '21 16:10 ddkang
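For context, the batch-dimension pattern described above looks roughly like the minimal sketch below, which follows the usage pattern in the `loss_dropper` README. The vocabulary size, tensor shapes, and `dropc` value here are illustrative assumptions, not values from this thread:

```python
import torch
import torch.nn as nn
from loss_dropper import LossDropper

batch_size = 64
dropper = LossDropper(dropc=0.4)                     # drop the top 40% of losses
criterion = nn.CrossEntropyLoss(reduction='none')    # per-element losses, no reduction

# Illustrative data: 20 tokens per example over a 1000-word vocabulary.
logits = torch.randn(batch_size * 20, 1000)
labels = torch.randint(0, 1000, (batch_size * 20,))

loss = criterion(logits, labels)   # 1-D: one loss per token
loss = loss.view(-1, batch_size)   # add the required batch dimension
loss = loss.mean(dim=0)            # aggregate to one loss per example
mask = dropper(loss)               # mask is 0 where an example should be dropped
loss = (loss * mask).mean()        # aggregate the kept losses
```

Note that `view(-1, batch_size)` only succeeds when the number of loss elements is divisible by `batch_size`, which matters for the error reported below.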

my code:

```python
loss = self.loss_fn(tgt_word_logprobs, tgt_word_labels)
loss = loss.view(-1, bs)
loss = loss.mean(dim=0)
aa = self.dropper(loss)
loss *= aa
loss = loss.mean()
loss = loss.view(-1)
```

When running epoch 6, I get this error: `RuntimeError: shape '[-1, 64]' is invalid for input of size 120`. Do you know why?

zmxsss avatar Oct 29 '21 05:10 zmxsss

I don't think you need the mean if it's already one dimensional.

ddkang avatar Oct 30 '21 17:10 ddkang
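For anyone hitting the same error: the `RuntimeError` itself occurs because `view(-1, 64)` requires the element count to be divisible by 64, and the offending batch produced only 120 loss elements (likely a smaller final batch). Dropping the fixed-size reshape, as suggested above, avoids this. A minimal sketch of that simpler path, assuming the loss function uses `reduction='none'` and treating each element as its own example; the vocabulary size and shapes are illustrative:

```python
import torch
import torch.nn as nn
from loss_dropper import LossDropper

loss_fn = nn.NLLLoss(reduction='none')   # per-element losses
dropper = LossDropper(dropc=0.4)

# A final, smaller batch: 120 log-probs over a 500-word vocabulary.
tgt_word_logprobs = torch.log_softmax(torch.randn(120, 500), dim=-1)
tgt_word_labels = torch.randint(0, 500, (120,))

loss = loss_fn(tgt_word_logprobs, tgt_word_labels)  # shape: [120], works for any N
mask = dropper(loss)          # 0 where an element should be dropped
loss = (loss * mask).mean()   # no fixed-size view(-1, 64), so no RuntimeError
```

The trade-off is that dropping then happens per token rather than per batch-aggregated example, which changes what "high-loss" means for the dropper.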