SLBR-Visible-Watermark-Removal
AUTOMATIC MIXED PRECISION - BCELOSS
Hey! Since the model is heavy, I was trying to apply mixed precision to reduce GPU memory usage, but after applying it I get the following error:
```
RuntimeError: torch.nn.functional.binary_cross_entropy and torch.nn.BCELoss are unsafe to autocast.
Many models use a sigmoid layer right before the binary cross entropy layer.
In this case, combine the two layers using torch.nn.functional.binary_cross_entropy_with_logits
or torch.nn.BCEWithLogitsLoss. binary_cross_entropy_with_logits and BCEWithLogits are
safe to autocast.
```
The error is raised here:
File "/home/ubuntu/SLBR-Visible-Watermark-Removal/src/models/SLBR.py", line 136, in train
coarse_loss, refine_loss, style_loss, mask_loss = self.loss(
File "/opt/conda/envs/pytorch/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
return forward_call(*input, **kwargs)
File "/home/ubuntu/SLBR-Visible-Watermark-Removal/src/models/SLBR.py", line 74, in forward
final_mask_loss += self.mask_loss(pred_ms[0], mask)
BCELoss is used here:
```python
self.masked_l1_loss, self.mask_loss = l1_relative, nn.BCELoss()
```
Since this can be fixed by using torch.nn.BCEWithLogitsLoss, could you please guide me on where I should remove the sigmoid so that I can switch to torch.nn.BCEWithLogitsLoss?
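For reference, my understanding is that the two losses compute the same quantity, so the change should only affect where the sigmoid is applied. A minimal standalone sketch of the equivalence (illustrative, not the repo's code):

```python
import torch
import torch.nn as nn

logits = torch.randn(4, 1, 64, 64)  # raw network output, before any sigmoid
target = torch.rand(4, 1, 64, 64)   # ground-truth mask in [0, 1]

# The two formulations are numerically equivalent, but only the logits
# version is autocast-safe: binary_cross_entropy_with_logits is run in
# float32 under autocast and uses the log-sum-exp trick for stability.
loss_unsafe = nn.BCELoss()(torch.sigmoid(logits), target)
loss_safe = nn.BCEWithLogitsLoss()(logits, target)
assert torch.allclose(loss_unsafe, loss_safe, atol=1e-6)
```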
Hi, we use nn.Sigmoid at
- https://github.com/bcmi/SLBR-Visible-Watermark-Removal/blob/47c665f1855ab6624cd52b28cefa797a9c8b96f7/src/networks/blocks.py#L286
- https://github.com/bcmi/SLBR-Visible-Watermark-Removal/blob/47c665f1855ab6624cd52b28cefa797a9c8b96f7/src/networks/blocks.py#L290
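To make sure I understand, the change would look roughly like the sketch below (the module names are hypothetical, not the actual SLBR code): drop the nn.Sigmoid() so the mask branch returns raw logits, switch the loss to nn.BCEWithLogitsLoss, and apply torch.sigmoid explicitly wherever a probability map is needed (e.g. for compositing or visualization).

```python
import torch
import torch.nn as nn

class MaskHead(nn.Module):
    """Hypothetical mask branch; names are illustrative, not SLBR's."""
    def __init__(self, in_ch: int):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, 1, kernel_size=1)
        # Before: the branch ended with nn.Sigmoid() here.

    def forward(self, x):
        # After: return raw logits; no sigmoid inside the network.
        return self.conv(x)

head = MaskHead(64)
mask_loss = nn.BCEWithLogitsLoss()   # autocast-safe, expects logits

x = torch.randn(2, 64, 32, 32)
gt_mask = torch.rand(2, 1, 32, 32)
loss = mask_loss(head(x), gt_mask)   # training path: loss on logits
prob_mask = torch.sigmoid(head(x))   # inference path: explicit sigmoid
```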
Did you manage to handle this? After applying automatic mixed precision, I get a NaN loss after one step when starting from the pre-trained model. I do not know why. Can you share your mixed-precision code? Thanks!
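For anyone hitting the same NaN issue: a standard AMP loop with gradient scaling looks roughly like the sketch below (a generic PyTorch pattern with a placeholder model and random data, not this repo's actual training code). Calling backward on the raw float16 loss without torch.cuda.amp.GradScaler is a common source of underflow and NaN instability.

```python
import torch
import torch.nn as nn

# Generic AMP training loop; the tiny model and random tensors are
# placeholders for SLBR and its data loader.
device = "cuda"
model = nn.Conv2d(3, 1, kernel_size=3, padding=1).to(device)
criterion = nn.BCEWithLogitsLoss()   # autocast-safe, expects logits
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler()

for step in range(10):
    images = torch.randn(2, 3, 64, 64, device=device)
    masks = torch.rand(2, 1, 64, 64, device=device)
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():
        loss = criterion(model(images), masks)
    # GradScaler keeps small float16 gradients from underflowing to zero
    # and skips optimizer steps when infs/NaNs are detected.
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```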