
AttributeError: 'float' object has no attribute 'backward'

Open · junhaojia opened this issue 1 year ago · 1 comment

File "C:\Users\JJH\anaconda3\envs\mmcv\lib\site-packages\mmengine\optim\optimizer\optimizer_wrapper.py", line 220, in backward loss.backward(**kwargs)

This error occurs when I train with a custom loss function.
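The error itself just means that whatever reaches `loss.backward()` is a plain Python number rather than a `torch.Tensor`. A minimal illustration, independent of mmsegmentation:

```python
import torch

loss = torch.tensor(1.0, requires_grad=True)
loss.backward()  # fine: torch.Tensor implements backward()

loss = 1.0
loss.backward()  # AttributeError: 'float' object has no attribute 'backward'
```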

The loss I defined is:

```python
# Copyright (c) OpenMMLab. All rights reserved.
from typing import Optional, Union

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch import Tensor

from mmseg.registry import MODELS
from .utils import get_class_weight, weight_reduce_loss


def ces_loss(pred: Tensor,
             target: Tensor,
             weight: Optional[Tensor] = None,
             reduction: Union[str, None] = 'none',
             avg_factor: Optional[int] = None,
             ignore_index: Optional[int] = 255,
             alpha=0.5,
             gamma=2) -> Tensor:
    n, c, h, w = pred.size()
    nt, ht, wt = target.size()
    if h != ht and w != wt:
        pred = F.interpolate(
            pred, size=(ht, wt), mode="bilinear", align_corners=True)

    temp_inputs = pred.transpose(1, 2).transpose(2, 3).contiguous().view(-1, c)
    temp_target = target.view(-1)

    logpt = -nn.CrossEntropyLoss(
        ignore_index=ignore_index, reduction='none')(temp_inputs, temp_target)
    pt = torch.exp(logpt)
    if alpha is not None:
        logpt *= alpha
    loss = -((1 - pt) ** gamma) * logpt
    loss = loss.mean()
    if weight is not None:
        weight = weight.float()

    loss = weight_reduce_loss(loss, weight, reduction, avg_factor)
    return loss


@MODELS.register_module()
class MyLoss(nn.Module):

    def __init__(self,
                 reduction='mean',
                 loss_weight=1.0,
                 class_weight=None,
                 loss_name='Loss_ces'):
        super().__init__()
        self.reduction = reduction
        self.loss_weight = loss_weight
        self._loss_name = loss_name
        self.class_weight = get_class_weight(class_weight)

    def forward(self,
                pred,
                target,
                avg_factor=None,
                reduction_override=None,
                **kwargs) -> Tensor:
        assert reduction_override in (None, 'none', 'mean', 'sum')
        reduction = (
            reduction_override if reduction_override else self.reduction)

        loss_cec = self.loss_weight * ces_loss(
            pred,
            target,
            reduction=reduction,
            avg_factor=avg_factor,
            **kwargs)

        return loss_cec

    @property
    def loss_name(self):
        return self._loss_name
```
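For context, a loss registered like this is normally selected through the decode head's `loss_decode` field; as far as I understand, the decode head then uses the module's `loss_name` property as the key of the loss dict it returns. A sketch of such a config fragment (the surrounding keys follow the usual mmsegmentation layout and are assumptions here; the class also has to be imported, e.g. in `mmseg/models/losses/__init__.py`, so that `@MODELS.register_module()` actually runs):

```python
# Sketch of a config fragment that plugs the custom loss into the decode head.
# Only the loss_decode entry is relevant here; other head settings are omitted.
model = dict(
    decode_head=dict(
        loss_decode=dict(type='MyLoss', loss_weight=1.0)))
```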

junhaojia · Apr 15 '24 05:04

I was facing a similar issue, and after much digging I found out that the `parse_losses` function under `mmengine.model.base_model` does a sum that only considers dictionary keys that satisfy `if 'loss' in key`: https://github.com/open-mmlab/mmengine/blob/main/mmengine/model/base_model/base_model.py#L174.

Since the `loss_name` I had defined contained "Loss" instead of "loss", it was being ignored 🤦🏾‍♂️. That meant the sum ended up as 0.0, and 0.0 is a plain Python float, which has no `backward()` method. It looks like your case has the same issue: your `loss_name` is `'Loss_ces'`, which also does not contain a lowercase "loss".
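A simplified sketch of that aggregation step (the real implementation is in the linked `base_model.py`; the tensor values here are just placeholders):

```python
import torch

# Only entries whose key contains the lowercase substring 'loss' are summed.
losses = {'Loss_ces': torch.tensor(0.8, requires_grad=True)}
total = sum(v.mean() for k, v in losses.items() if 'loss' in k)
print(total)        # 0 -- no key matched, so sum() returned a plain number
# total.backward()  # -> AttributeError, since a plain number has no backward()

# Renaming the loss (e.g. loss_name='loss_ces') makes the key match:
losses = {'loss_ces': torch.tensor(0.8, requires_grad=True)}
total = sum(v.mean() for k, v in losses.items() if 'loss' in k)
total.backward()    # works: total is a torch.Tensor
```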

philadias · Jul 25 '24 13:07