DC-UNet
Why is the loss not the same as in the paper?
I was really looking forward to implementing your DL model in my research. However, when I started looking through your code, I noticed that the loss in the code (structure_loss) is not the same as the one in the paper (binary cross-entropy). So my question is: what is structure_loss, and can I safely use BCEWithLogitsLoss instead of it? Thanks!
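For context, in several related segmentation codebases (e.g. PraNet) a function named structure_loss combines a spatially weighted BCE term with a weighted IoU term. I don't know if your implementation matches this exactly, but here is a rough sketch of that common pattern, next to the plain BCEWithLogitsLoss I was thinking of substituting:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def structure_loss_sketch(pred, mask):
    # Sketch of the common "structure loss": weighted BCE + weighted IoU.
    # May differ from the actual DC-UNet implementation.
    # Pixels near mask boundaries get larger weights via a local average pool.
    weit = 1 + 5 * torch.abs(
        F.avg_pool2d(mask, kernel_size=31, stride=1, padding=15) - mask
    )
    wbce = F.binary_cross_entropy_with_logits(pred, mask, reduction='none')
    wbce = (weit * wbce).sum(dim=(2, 3)) / weit.sum(dim=(2, 3))

    pred = torch.sigmoid(pred)
    inter = ((pred * mask) * weit).sum(dim=(2, 3))
    union = ((pred + mask) * weit).sum(dim=(2, 3))
    wiou = 1 - (inter + 1) / (union - inter + 1)
    return (wbce + wiou).mean()

# The plain BCE alternative I had in mind (no IoU term, no spatial weighting):
bce_criterion = nn.BCEWithLogitsLoss()
```

If I understand correctly, swapping in BCEWithLogitsLoss would drop both the IoU term and the boundary weighting, which is why I'm unsure whether the results would still match the paper.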