Erica
Hi, when running gen_map on the validation dataset, some data raises an error like this: I found the reason is that the data's brain region is smaller than the patch size. My patch...
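One common workaround when a cropped region is smaller than the patch size is to pad the volume up to the patch size before extracting patches. Below is a minimal sketch of that idea; the helper name pad_to_min_size and the NumPy-based padding are assumptions for illustration, not the repository's actual handling.

```python
import numpy as np

def pad_to_min_size(volume, patch_size):
    """Pad a 3D volume symmetrically so every axis is at least patch_size.

    Hypothetical helper: the repository may handle small brain regions differently.
    """
    pads = []
    for dim in volume.shape:
        short = max(patch_size - dim, 0)
        pads.append((short // 2, short - short // 2))
    return np.pad(volume, pads, mode="constant", constant_values=0)

# Example: a cropped brain region smaller than a 64-voxel patch
brain = np.random.rand(48, 70, 52)
padded = pad_to_min_size(brain, 64)
print(padded.shape)  # every axis is now >= 64
```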
The definition of all_stages_loss is: self.all_stages_loss = 0.4 * self.total_wght_loss + 0.8 * self.stage2_loss + self.stage3_loss. Why use 0.4, 0.8, and 1 as the three stages' loss coefficients?
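For reference, the combination being asked about is just a fixed weighted sum of the three stage losses, as in the quoted line; the sketch below restates it in plain Python with dummy scalar values (the values themselves are made up, only the weights come from the question).

```python
def combine_stage_losses(total_wght_loss, stage2_loss, stage3_loss):
    # Weighted sum exactly as quoted above; the fixed weights act like
    # deep-supervision coefficients that emphasise the final stage most.
    return 0.4 * total_wght_loss + 0.8 * stage2_loss + 1.0 * stage3_loss

# Example with placeholder values standing in for the three stage losses
l1 = 0.9   # stage-1 (weighted) loss
l2 = 0.6   # stage-2 loss
l3 = 0.4   # stage-3 (final) loss
print(combine_stage_losses(l1, l2, l3))  # 0.36 + 0.48 + 0.40 = 1.24
```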
In your code file "oprations.py", you extract some layers for fine-tuning, and I found that these layers come only from the first-modality unet. Does this mean fine-tuning is only applied to the unet...
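To illustrate what fine-tuning only one branch looks like in general, here is a small PyTorch-style sketch: all parameters are frozen and only those belonging to one sub-network are left trainable. The module names (unet_mode1, unet_mode2) and the tiny layers are hypothetical and not taken from "oprations.py".

```python
import torch.nn as nn

# Hypothetical two-branch model: one small encoder per imaging modality.
model = nn.ModuleDict({
    "unet_mode1": nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU()),
    "unet_mode2": nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU()),
})

# Freeze everything, then unfreeze only the first-modality branch,
# i.e. fine-tune the layers taken from one unet while the rest stay fixed.
for p in model.parameters():
    p.requires_grad = False
for p in model["unet_mode1"].parameters():
    p.requires_grad = True

trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only unet_mode1.* parameters remain trainable
```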
Sorry, I really hope you can share your dataset. Thank you very much!
Arguments:
ratio_gan2seg : trade-off coefficient between the GAN loss and the segmentation loss
discriminator : type of discriminator (pixel, patch1, patch2, or image)
I really don't know how to put...
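As a rough sketch of how such arguments are usually supplied and used, the snippet below defines them with argparse and folds ratio_gan2seg into a generator objective. The exact way the repository combines the two losses is not stated here, so generator_loss and its form (adversarial term plus a scaled segmentation term) are assumptions for illustration only.

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--ratio_gan2seg", type=float, default=10,
                    help="trade-off coefficient between GAN loss and segmentation loss")
parser.add_argument("--discriminator", type=str, default="image",
                    choices=["pixel", "patch1", "patch2", "image"],
                    help="type of discriminator")
# Parse an explicit list here so the sketch runs without a real command line.
args = parser.parse_args(["--ratio_gan2seg", "10", "--discriminator", "image"])

def generator_loss(gan_loss, seg_loss, ratio_gan2seg):
    # Hypothetical objective: the coefficient scales the segmentation term
    # relative to the adversarial term (the actual form depends on the repo).
    return gan_loss + ratio_gan2seg * seg_loss

print(args.discriminator)                                  # image
print(generator_loss(0.7, 0.12, args.ratio_gan2seg))       # 0.7 + 10*0.12 = 1.9
```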