inception.torch

Question on normalization in example.lua

Open · felixsmueller opened this issue on Dec 10, 2015 · 2 comments

Hi,

Sorry for bothering you. In example.lua I saw the following: `img:mul(255):clamp(0, 255):add(-117)`. The `mul(255)` scales the values up from the range 0...1 to 0...255, and the `add(-117)` subtracts the mean over all the ImageNet images, I suppose. I noticed that you do not divide by the std-deviation. Is this just a simplification for this example, or is it not needed in general? If we should do that normalization, what value do you suggest (the std-deviation over all ImageNet images)?

Regards, Felix

felixsmueller · Dec 10 '15
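
For context, a minimal sketch of how that preprocessing line could sit in a complete pipeline. The `image` package call, the file name `cat.jpg`, and the 224x224 input size are assumptions for illustration, not taken from the repo.

```lua
-- Minimal sketch (assumptions noted above) of the preprocessing in example.lua.
require 'image'

-- image.load returns a tensor with values in the range [0, 1]
local img = image.load('cat.jpg', 3)      -- 3-channel image, values in 0..1
img = image.scale(img, 224, 224)          -- resize to the assumed network input size

-- the line in question: scale to 0..255, then subtract the mean pixel value 117
img:mul(255):clamp(0, 255):add(-117)
```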

@felixsmueller I don't divide by the std-deviation because that's what the Google network appeared to do in training: mean subtraction only. Usually, for all the networks I train myself, I normalize to zero mean and unit std-deviation.

soumith · Dec 10 '15
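
For comparison, a minimal sketch of the per-channel zero-mean / unit-std normalization soumith describes for his own networks. The per-channel values below are the commonly cited ImageNet statistics and are only illustrative here, not values this particular model was trained with.

```lua
-- Sketch of channel-wise zero-mean / unit-std normalization (illustrative values).
require 'torch'

local mean = {0.485, 0.456, 0.406}   -- illustrative per-channel means (0..1 scale)
local std  = {0.229, 0.224, 0.225}   -- illustrative per-channel std-deviations

local img = torch.rand(3, 224, 224)  -- stand-in for a 3xHxW image with values in [0, 1]
for c = 1, 3 do
   img[c]:add(-mean[c]):div(std[c])  -- subtract mean, divide by std, channel-wise
end
```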

Thanks a lot.

felixsmueller · Dec 14 '15