Jiangzy

5 comments by Jiangzy

Hello, I've faced the same problem. After a few iterations, G is updated to 3.332902551362248e+44. Maybe there are some small issues in the code for G. In ADMM, the value of...

I think I have addressed the issue. During the iterations, if G grows above 1, the ADMM algorithm runs into an exploding-gradient problem. So just add torch.clamp(G,...
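A minimal sketch of what that fix might look like, assuming a simple gradient-style update inside update_G (the update rule, learning rate, and bounds here are placeholders; only torch.clamp(G, ...) comes from the comment above):

```python
import torch

def update_G(G, grad_fn, lr=0.01, n_steps=5):
    # Hypothetical G-update loop: after each step, clamp the entries of G
    # so the iterates stay bounded and cannot blow up to values like 1e+44.
    for _ in range(n_steps):
        G = G - lr * grad_fn(G)                # plain gradient step (placeholder)
        G = torch.clamp(G, min=-1.0, max=1.0)  # keep |G| <= 1 to avoid explosion
    return G

# With a deliberately explosive gradient, the clamp keeps G bounded:
G = update_G(torch.randn(3, 3), grad_fn=lambda g: 10.0 * g)
```

Without the clamp, the same loop with grad_fn=lambda g: 10.0 * g would diverge geometrically; with it, every entry of G stays in [-1, 1].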

Does the gradient explosion happen at the first step of update_G, or only after some steps of update_G? Maybe you can also try adding .sign() to G.grad.data and forcing the NaN...
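A sketch of the two suggestions above, assuming a toy loss (only G.grad.data and .sign() come from the comment; the loss and step size are illustrative):

```python
import torch

# Toy setup: a loss whose gradient can become large.
G = torch.randn(4, requires_grad=True)
loss = (G ** 3).sum()
loss.backward()

# Suggestion 1: replace the raw gradient by its sign, bounding each entry to +/-1.
G.grad.data = G.grad.data.sign()

# Suggestion 2: force any NaN entries in the gradient to 0 before stepping.
G.grad.data = torch.nan_to_num(G.grad.data, nan=0.0)

# A plain gradient step using the sanitized gradient.
with torch.no_grad():
    G -= 0.1 * G.grad
```

The sign step is essentially a signed-gradient update, so the per-entry step size is fixed regardless of how large the raw gradient was.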

> > I think I have addressed the issue. During the iterations, if G grows above 1, the ADMM algorithm runs into an exploding-gradient problem. So just...

Maybe you can try the following link: https://en.data-baker.com/datasets/freeDatasets/