Backpropagation proximal gradient

Open · mblum94 opened this issue on Aug 05, 2021 · 0 comments

Dear @khammernik,

I've got a question/comment concerning backpropagation of the proximal gradient layer with respect to lambda. I got curious reading your recent MRM paper, where you wrote that training becomes unstable when lambda is not fixed.

Following your conventions, M := lambda A^H A + 1 and Q := M^{-1}.

Now, the derivative of the inverse of a matrix with respect to lambda is given by Q' = -Q M' Q (differentiating Q M = 1 gives Q' M + Q M' = 0, hence Q' = -Q M' Q).

In the code, I see the two factors of Q as expected, but not the factor M' = A^H A. Is it missing, or does it cancel somehow?
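
For reference, here is a minimal finite-difference check of that identity. It is a standalone sketch with a random complex A standing in for the forward operator, not sigmanet's actual code:

```python
import torch

torch.manual_seed(0)
n = 4
# Random stand-in for the forward operator A (not sigmanet's operator).
A = torch.randn(n, n, dtype=torch.complex128)
AHA = A.conj().T @ A
I = torch.eye(n, dtype=torch.complex128)
lam = 0.5

def Q(l):
    """Q(lambda) = (lambda * A^H A + 1)^{-1}."""
    return torch.linalg.inv(l * AHA + I)

# Analytic derivative: Q' = -Q M' Q with M' = A^H A.
dQ_analytic = -Q(lam) @ AHA @ Q(lam)

# Central finite difference in lambda.
eps = 1e-6
dQ_numeric = (Q(lam + eps) - Q(lam - eps)) / (2 * eps)

print(torch.allclose(dQ_analytic, dQ_numeric, atol=1e-6))  # expect: True
```

If the two derivatives agree, the A^H A factor really does belong in Q', so if it is absent from the code it would have to be cancelled somewhere else in the backward pass.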

Best regards, Moritz

mblum94 · Aug 05 '21 08:08