pytorch-crf

Issue with backward()

Open clock-uni opened this issue 2 years ago • 1 comment

When I calculate the loss using this model, I get the error `RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation`. But when I use CrossEntropyLoss instead, it runs successfully. My code:

lossF = CRF(num_tags=num_tags, batch_first=True)
for batch in tqdm(trainloader, desc=f"Training Epoch {epoch}"):
    optimizer.zero_grad()
    inputs, targets = [x.to(device) for x in batch]
    if use_crf:
        bert_output = model(inputs, use_crf=use_crf)
        # CRF's forward returns the log-likelihood, so negate it for a loss
        loss = -lossF(bert_output, targets)
    else:
        targets = targets.view(-1)
        # (CrossEntropyLoss branch elided in the original post)
    loss.backward()
    optimizer.step()

clock-uni avatar May 03 '22 18:05 clock-uni

I'm guessing you have in-place operations inside your model's forward(). That error is usually caused by such operations.
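As an illustration of the kind of in-place operation that triggers this exact RuntimeError (the tensors and ops below are mine, not from the reporter's model): `exp()` saves its output for the backward pass, so mutating that output in place invalidates the autograd graph.

```python
import torch

x = torch.ones(3, requires_grad=True)
y = torch.exp(x)   # exp() saves its output y for use in backward()
y += 1             # in-place add: overwrites the saved tensor

try:
    y.sum().backward()
except RuntimeError as e:
    # "one of the variables needed for gradient computation has been
    # modified by an inplace operation"
    print(type(e).__name__)

# Out-of-place version: allocates a new tensor, so the graph stays valid.
x2 = torch.ones(3, requires_grad=True)
y2 = torch.exp(x2) + 1
y2.sum().backward()  # succeeds; x2.grad == exp(x2)
```

Common culprits in a model's forward() are `+=`, `*=`, methods ending in an underscore such as `relu_()` or `masked_fill_()`, and in-place slice assignment like `t[:, 0] = v`.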

kmkurn avatar May 15 '22 00:05 kmkurn