probabilistic_unet
Why do you call the loss function ELBO?
Thanks for the awesome code and paper. However, I cannot see how the first term of the ELBO reduces to the cross-entropy used in your loss function.
What's more, it seems a little strange to add the cross-entropy of the segmentation decoded from Z_q, because Q is conditioned on the ground truth and S_q is decoded from Z_q. Is it meaningful to compute only CE(Y, S_q) rather than CE(Y, S_p)? Since the model has already seen the ground truth during training, evaluating it with CE(Y, S_q) seems unfair.
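For reference, here is a minimal NumPy sketch of the ELBO-style loss as I understand it (all function names here are my own, not from the repo). The point is that for a per-pixel categorical likelihood, CE(Y, S_q) equals the negative expected log-likelihood under the posterior Q, so minimizing CE plus the KL term is equivalent to maximizing the ELBO. The KL term is what makes training with S_q meaningful: it drags the posterior Q (which sees the ground truth) toward the prior P (which only sees the image), so that at test time samples from P alone produce good segmentations.

```python
import numpy as np

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    # KL(Q || P) between diagonal Gaussians, summed over latent dimensions.
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(
        logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0
    )

def elbo_loss(y_onehot, logits_from_z_q, mu_q, logvar_q, mu_p, logvar_p, beta=1.0):
    # Reconstruction term: cross-entropy of the segmentation decoded from a
    # posterior sample z_q. For a per-pixel categorical likelihood,
    # CE(Y, S_q) = -E_{z~Q}[log p(Y | X, z)] (one-sample Monte Carlo estimate),
    # which is exactly the first ELBO term with its sign flipped.
    shifted = logits_from_z_q - logits_from_z_q.max(axis=-1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=-1, keepdims=True)
    ce = -np.sum(y_onehot * np.log(probs + 1e-12))
    # Regularization term: pulls Q toward P so the ground-truth-conditioned
    # posterior can be discarded at test time.
    return ce + beta * kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p)
```

So CE(Y, S_q) during training is the standard VAE recipe rather than an evaluation metric; CE(Y, S_p) would be the relevant quantity only at test time.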
This is the only TensorFlow codebase I cannot make sense of.