
Typos in insert_iterator?

Open · eigenwhat opened this issue 4 years ago · 1 comment

Spotted a few things when trying to use the range algorithms with a std::map. I'm guessing they're just copy/paste errors from being based on back_insert_iterator? ;)

This should be using insert and not push_back: https://github.com/tcbrindle/NanoRange/blob/bf32251d65673fe170d602777c087786c529ead8/include/nanorange/iterator/insert_iterator.hpp#L34-L39

Missing the iterator argument and constructing the wrong type: https://github.com/tcbrindle/NanoRange/blob/bf32251d65673fe170d602777c087786c529ead8/include/nanorange/iterator/insert_iterator.hpp#L51-L54

eigenwhat avatar Sep 15 '20 22:09 eigenwhat

Hi @pi-tau, when you use Adam, the per-parameter updates are effectively normalized, so multiplying the loss by a constant makes (almost) no difference. That applies here whether you take the mean or the sum. For a VAE, however, you do have to sum the reconstruction loss over pixels in order to use the ELBO as the objective.
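This scale invariance is easy to verify with a bare-bones Adam update (an illustrative sketch, not code from the thread): scaling every gradient by a constant scales the first moment and the square root of the second moment by the same factor, so their ratio, and hence the update, is unchanged up to the epsilon term.

```python
import numpy as np

def adam_updates(grads, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """Run Adam on a sequence of scalar gradients, return the updates."""
    m = v = 0.0
    updates = []
    for t, g in enumerate(grads, start=1):
        m = b1 * m + (1 - b1) * g          # first-moment estimate
        v = b2 * v + (1 - b2) * g * g      # second-moment estimate
        m_hat = m / (1 - b1 ** t)          # bias correction
        v_hat = v / (1 - b2 ** t)
        updates.append(lr * m_hat / (np.sqrt(v_hat) + eps))
    return np.array(updates)

grads = np.array([0.5, -1.2, 0.8, 0.3])
u_mean = adam_updates(grads)          # gradients of a "mean" loss
u_sum = adam_updates(100.0 * grads)   # same loss scaled by a constant ("sum")
print(np.allclose(u_mean, u_sum, rtol=1e-4))  # → True
```

The two runs produce essentially identical parameter updates, which is why mean vs. sum barely matters for Adam's step size (though it still changes the relative weighting against other loss terms, such as the KL term in the ELBO).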

phlippe avatar Jun 27 '23 12:06 phlippe

Hi, thanks for the useful info. I didn't know that. In this case, if training with SGD or SGD+momentum, would you simply clip the grad norm to 1?

pi-tau avatar Jun 28 '23 10:06 pi-tau

If you train with SGD, then you either have to adjust your learning rate and gradient norm accordingly, or scale the loss back to typical values, e.g. by taking the mean instead of the sum.

phlippe avatar Jun 28 '23 11:06 phlippe