ChengShen
Hi, in the original paper, batch norm is presented as a good tool for getting better performance. However, in my case, the DnCNN model without BN is superior to the one with...
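For context, the only difference between the two variants I am comparing is whether each intermediate layer includes BatchNorm. A minimal sketch of such a layer (the channel width and the `use_bn` flag name are mine, not the repo's):

```python
import torch.nn as nn

def dncnn_block(channels=64, use_bn=True):
    """One intermediate DnCNN layer: Conv -> (optional BN) -> ReLU.

    Illustrative only; the first and last layers of DnCNN differ from this.
    """
    layers = [nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=not use_bn)]
    if use_bn:
        layers.append(nn.BatchNorm2d(channels))
    layers.append(nn.ReLU(inplace=True))
    return nn.Sequential(*layers)
```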
```
dist = torch.distributions.one_hot_categorical.OneHotCategorical(torch.tensor([0.1] * 10))
lossf = MutualInformationPenalty()
z = torch.randn(2, 100)
c_cont = torch.randn(2, 30)
c_dis = dist.sample([2])
print(c_dis)
G = InfoGANGenerator(10, 30)
D = InfoGANDiscriminator(10, 30)
fake = G(z,...
```
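For what it's worth, here is a generic sketch of what the InfoGAN mutual-information term computes, i.e. -E[log Q(c | G(z, c))]. This is not necessarily the library's exact `MutualInformationPenalty` signature; the function and argument names here are my own:

```python
import torch
import torch.nn.functional as F

def mutual_information_penalty(dis_logits, cont_mean, c_dis, c_cont, cont_logstd=None):
    """Generic InfoGAN MI penalty: -E[log Q(c | G(z, c))].

    dis_logits:  Q's logits for the discrete code, shape (N, n_classes)
    cont_mean:   Q's predicted mean for the continuous code, shape (N, n_cont)
    c_dis:       sampled one-hot discrete code, shape (N, n_classes)
    c_cont:      sampled continuous code, shape (N, n_cont)
    cont_logstd: optional predicted log-std; if None, a unit-variance Gaussian
                 is assumed and the continuous term reduces to 0.5 * MSE.
    """
    # Discrete term: cross-entropy against the sampled category.
    dis_loss = F.cross_entropy(dis_logits, c_dis.argmax(dim=1))

    # Continuous term: Gaussian negative log-likelihood (up to a constant).
    if cont_logstd is None:
        cont_loss = 0.5 * F.mse_loss(cont_mean, c_cont)
    else:
        var = torch.exp(2 * cont_logstd)
        cont_loss = (cont_logstd + (c_cont - cont_mean) ** 2 / (2 * var)).mean()

    return dis_loss + cont_loss
```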
Hi, I notice that you did not use a pretrained model. Is it too hard to train? Or is its performance not good? Thanks.
Hi, I was searching for vanilla GAN and found this page. I do not know what it means. Could you please tell me? Thanks.
Hi, Could you please release the pre-trained model on SIDD?
https://github.com/ashawkey/stable-dreamfusion/blob/main/guidance/sd_utils.py#L112 I am confused by line 112 in the `sd_utils.py` file. Based on my understanding, the CFG should be structured as follows:  This structure corresponds to the following...
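For reference, the classifier-free guidance combination I have in mind is the standard one from the CFG paper. A generic sketch (the variable names are illustrative; the repo may order and name the two chunks differently):

```python
import torch

def classifier_free_guidance(noise_pred_cond, noise_pred_uncond, guidance_scale):
    """Standard CFG: push the unconditional prediction toward the conditional one."""
    return noise_pred_uncond + guidance_scale * (noise_pred_cond - noise_pred_uncond)

# Typical usage after a batched UNet call on [uncond, cond] inputs:
# noise_pred_uncond, noise_pred_cond = noise_pred.chunk(2)
# noise_pred = classifier_free_guidance(noise_pred_cond, noise_pred_uncond, guidance_scale=100.0)
```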
Hello, I noticed the input image is not normalized to the range [-1, 1]. Was the model trained with data that is not in this range? And is double precision required?
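To make the question concrete, the normalization I am asking about is the usual mapping of a uint8 image to [-1, 1]. A minimal sketch, not the repo's preprocessing code:

```python
import numpy as np
import torch

def to_model_range(img_uint8: np.ndarray) -> torch.Tensor:
    """Map a uint8 image in [0, 255] to a float tensor in [-1, 1].

    Whether float32 suffices or float64 (double) is required is exactly
    what I am asking; float32 is used here only as an illustration.
    """
    img = torch.from_numpy(img_uint8.astype(np.float32))
    return img / 127.5 - 1.0
```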