Implicit-Competitive-Regularization

How to add condition loss into Total loss

Open · Johnson-yue opened this issue 5 years ago · 0 comments

Hi, I am training an ACGAN on the MNIST dataset (the question applies to any other conditional GAN as well). Following the method in your README.md:

```python
from optimizers import ACGD

device = torch.device('cuda:0')
lr = 0.0001
G = Generator()
D = Discriminator()
# ACGD: CGD with adaptive learning rates
optimizer = ACGD(max_params=G, min_params=D, lr=lr, device=device)
for img in dataloader:
    d_real = D(img)
    z = torch.randn((batch_size, z_dim), device=device)
    d_fake = D(G(z))
    loss = criterion(d_real, d_fake)
    optimizer.zero_grad()
    optimizer.step(loss=loss)
```

But in conditional GANs, the discriminator may return two outputs, like [d_real, cond_real] or [d_fake, cond_fake], and a condition loss must be computed for them, for example with cross-entropy:

```python
self.optimizer = ACGD(max_params=self.G, min_params=self.D, lr=1e-4, device=torch.device("cuda:0"))

D_real, C_real = self.D(x_)
D_real_loss = self.BCE_loss(D_real, self.y_real_)
C_real_loss = self.CE_loss(C_real, torch.max(y_vec_, 1)[1])

G_ = self.G(z_, y_vec_)
D_fake, C_fake = self.D(G_)
D_fake_loss = self.BCE_loss(D_fake, self.y_fake_)
C_fake_loss = self.CE_loss(C_fake, torch.max(y_vec_, 1)[1])

D_loss = D_real_loss + C_real_loss + D_fake_loss + C_fake_loss

self.optimizer.zero_grad()
self.optimizer.step(loss=D_loss)
```
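For context, here is a minimal self-contained sketch of how the adversarial and condition losses combine into the single scalar that `step(loss=...)` expects. The toy `Generator`/`Discriminator` modules and all shapes here are hypothetical stand-ins, not the repo's models. One possible reason for the failure, hedged as an assumption about ACGD's semantics: the README example suggests `max_params` (G) *maximizes* the given loss while `min_params` (D) minimizes it, so a condition loss folded into that scalar is something G is pushed to increase, even though in ACGAN both players should decrease it.

```python
# Sketch: ACGAN-style losses reduced to one scalar for a CGD-type optimizer.
# The adversarial terms form the zero-sum saddle-point objective; the
# condition (auxiliary classifier) terms are cooperative, which is why
# adding them to the saddle-point loss changes the game being solved.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
batch_size, z_dim, n_classes, img_dim = 8, 16, 10, 32

class Generator(nn.Module):  # toy stand-in, conditioned on a one-hot label
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(z_dim + n_classes, img_dim)
    def forward(self, z, y_onehot):
        return self.net(torch.cat([z, y_onehot], dim=1))

class Discriminator(nn.Module):  # ACGAN-style D: (real/fake score, class logits)
    def __init__(self):
        super().__init__()
        self.body = nn.Linear(img_dim, 64)
        self.adv_head = nn.Linear(64, 1)
        self.cls_head = nn.Linear(64, n_classes)
    def forward(self, x):
        h = torch.relu(self.body(x))
        return self.adv_head(h), self.cls_head(h)

G, D = Generator(), Discriminator()
x_real = torch.randn(batch_size, img_dim)
y = torch.randint(0, n_classes, (batch_size,))
y_onehot = F.one_hot(y, n_classes).float()
z = torch.randn(batch_size, z_dim)

d_real, c_real = D(x_real)
d_fake, c_fake = D(G(z, y_onehot))

# Zero-sum part: D minimizes this, G (as max_params) would maximize it.
adv_loss = (F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real))
            + F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake)))

# Cooperative part: both G and D want this small.
cond_loss = F.cross_entropy(c_real, y) + F.cross_entropy(c_fake, y)

# A single scalar, as in `optimizer.step(loss=total)` above.
total = adv_loss + cond_loss
```

If G maximizes `total`, it is rewarded for making `cond_loss` large, which would degrade the conditioning rather than learn it; a workaround worth trying (again, an assumption, not the repo's documented usage) is passing only `adv_loss` to ACGD and updating the classifier head with a separate standard optimizer.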

But the experiment failed, so would you provide a demo of how to use this with conditional GANs?

Johnson-yue · Oct 25 '19 08:10