ACELoss
how to use AC loss or ACE loss
Hello,
I tried to use the AC or ACE losses instead of the CE loss for binary segmentation. Although I have used this network with the CE loss many times, it does not work with the AC/ACE losses.
I used a label array whose shape matches the prediction, and whose channels hold zero/one values, as in the code below.
from aceloss import ACLossV2

criterion = ACLossV2(classes=2)
outputs = model(inputs)
masks2 = torch.zeros_like(outputs)  # shape: [batch size, channel size, width, height]
masks2[:, 0, :, :] = (masks == 0).squeeze(1)
masks2[:, 1, :, :] = (masks == 1).squeeze(1)
loss = criterion(outputs, masks2)
The loss value is larger than 1e4, and the predicted output looks meaningless. Did I miss something? I didn't change any code in the ACELoss class. Thank you.
Hi, I'm not sure what problem you have met. In my understanding, the output needs to be fed into an activation function (softmax or sigmoid) before calculating the loss functions. Best, Xiangde.
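Following this suggestion, a minimal sketch of applying softmax before the loss might look like the snippet below (the tensor shapes are made up for illustration; `ACLossV2` is assumed to expect per-class probabilities rather than raw logits):

```python
import torch

# Stand-in for raw network logits, shape (batch, classes, H, W);
# in practice this would be `outputs = model(inputs)`.
outputs = torch.randn(2, 2, 64, 64)

# Softmax over the class dimension so each pixel's class scores sum to 1.
probs = torch.softmax(outputs, dim=1)

# `probs` is now a valid probability map: values in [0, 1], channels summing to 1.
# loss = criterion(probs, masks2)  # e.g. criterion = ACLossV2(classes=2)
```

For a single-channel binary output, `torch.sigmoid(outputs)` would play the same role.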
I also encountered the same problem. UNet + MSELoss works normally, but ACLoss does not converge, even when I use a pre-trained model.
I got the same issue: the loss decreases slowly and the IoU does not increase. I trained with BCELoss/CrossEntropyLoss before and my network works well. Here is my code for the Active Contour Loss:
import torch


class ActiveContourLoss(torch.nn.Module):
    def __init__(self, miu=1.0, numClasses=1):
        super(ActiveContourLoss, self).__init__()
        self.miu = miu
        self.numClasses = numClasses

    def forward(self, pred, mask):
        '''
        pred: prediction, shape (B, numClasses, W, H)
        mask: ground truth, shape (B, W, H)
        '''
        if self.numClasses == 1:
            pred = torch.sigmoid(pred)
        else:
            pred = torch.nn.functional.softmax(pred, dim=1)
        # length term: morphological gradient (max-pool of min-pool)
        # approximates the contour of the predicted region
        min_pool_x = torch.nn.functional.max_pool2d(pred * -1, (3, 3), 1, 1) * -1
        contour = torch.relu(torch.nn.functional.max_pool2d(
            min_pool_x, (3, 3), 1, 1) - min_pool_x)
        length = torch.sum(torch.abs(contour))
        # region terms: one-hot encode the mask, one channel per class
        label = torch.zeros_like(pred)
        for k in range(self.numClasses):
            value = 1 if self.numClasses == 1 else k
            label[:, k, :, :] = (mask == value)
        label = label.float()
        c_in = torch.ones_like(pred)
        c_out = torch.zeros_like(pred)
        region_in = torch.abs(torch.sum(pred * ((label - c_in) ** 2)))
        region_out = torch.abs(torch.sum((1 - pred) * ((label - c_out) ** 2)))
        region = self.miu * region_in + region_out
        return region + length
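Several posts in this thread report loss values above 1e4, which is expected here: both the length and region terms are raw sums over every pixel, so the loss magnitude grows with image size. Below is a sketch of a mean-normalized binary variant (my own adaptation for illustration, not code from the ACELoss repository) that keeps each term roughly in [0, 1]:

```python
import torch
import torch.nn.functional as F


def active_contour_loss_normalized(pred, mask, miu=1.0):
    # pred: raw logits, shape (B, 1, H, W); mask: binary ground truth, same shape.
    pred = torch.sigmoid(pred)
    mask = mask.float()
    # Morphological gradient (max-pool of min-pool) approximates the contour.
    min_pool = -F.max_pool2d(-pred, 3, 1, 1)
    contour = torch.relu(F.max_pool2d(min_pool, 3, 1, 1) - min_pool)
    length = contour.mean()  # mean instead of sum keeps the term in [0, 1]
    # Same region terms as above (c_in = 1, c_out = 0), mean-reduced.
    region_in = (pred * (mask - 1.0) ** 2).mean()
    region_out = ((1.0 - pred) * mask ** 2).mean()
    return miu * region_in + region_out + length


# Quick smoke test on random tensors.
loss = active_contour_loss_normalized(
    torch.randn(2, 1, 32, 32),
    torch.randint(0, 2, (2, 1, 32, 32)).float())
```

With all three terms mean-reduced, the loss no longer scales with image size, which can make learning-rate tuning and comparison against BCE/MSE baselines easier.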