
About back-propagation in cnnbp.m when average sub-sampling is used

poweic opened this issue 10 years ago · 0 comments

I noticed that you divide the error signal by scale^2, like this (in CNN/cnnbp.m, line 26):

... .* (expand(net.layers{l + 1}.d{j}, [s, s, 1]) / s ^ 2);   % where s is the scale

I know that in the feed-forward pass, the sum over scale^2 pixels is divided by scale^2; that's why it's called "average" sub-sampling. But can you explain why the error signal is also divided by scale^2?

I mean, in the feed-forward pass it looks like this (when s = 2):

1 1
1 1 => 1, because 1 = (1+1+1+1) / 4 

And in back-propagation, why did you choose this:

0.25 0.25
0.25 0.25 <= 1

instead of this:

1 1
1 1 <= 1, since I thought 1 was already the averaged error signal
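To make the question concrete, here is a minimal Python sketch of both passes for a single s x s pooling window (my own code, not the toolbox's). The backward function applies the chain rule: since y = (x1 + ... + x4) / s^2, each input's partial derivative dy/dx_i is 1/s^2, which is where the 0.25 values come from.

```python
# Minimal sketch of average sub-sampling over one window, assuming s = 2.
# This is illustrative code, not the toolbox's implementation.
s = 2

def avg_pool_forward(window):
    # y = (x1 + x2 + ... + x_{s^2}) / s^2
    return sum(window) / s**2

def avg_pool_backward(d_out):
    # Chain rule: dL/dx_i = dL/dy * dy/dx_i = d_out * (1 / s^2),
    # because each x_i enters y with weight 1/s^2.
    return [d_out / s**2 for _ in range(s * s)]

window = [1.0, 1.0, 1.0, 1.0]
y = avg_pool_forward(window)     # 1.0, matching (1+1+1+1) / 4
grads = avg_pool_backward(1.0)   # [0.25, 0.25, 0.25, 0.25]
```

So the sketch reproduces the 0.25 pattern from the code, but I'd still like to hear the intuition for it.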

I hope you can help me with this. Thanks :)

poweic · Dec 19 '14 08:12