
Influence of different normalization

hua-wu-que opened this issue 4 years ago • 2 comments

Hi there,

I want to ask your advice regarding the normalization layers. I am implementing a network similar to yours, but I found that if I use batch norm layers, calling model.eval() at test time gives much worse results. Did you encounter this before? Thank you for your reply!

hua-wu-que avatar May 15 '20 19:05 hua-wu-que

I ran several experiments with BN layers for OccNet. With BN layers, training is much more stable, and the final reconstructions are more 'average' but generally good. Without BN layers the network is hard to train (the loss fluctuates) and the final results are statistically worse.

AlexsaseXie avatar May 16 '20 03:05 AlexsaseXie

> I ran several experiments with BN layers for OccNet. With BN layers, training is much more stable, and the final reconstructions are more 'average' but generally good. Without BN layers the network is hard to train (the loss fluctuates) and the final results are statistically worse.

Hi AlexsaseXie, did you call model.eval() before doing inference? I get very bad results if I call model.eval().
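
For context on why model.eval() changes the output at all: in train mode, a BatchNorm layer normalizes with the statistics of the current batch while accumulating running averages; in eval mode it switches to those running averages. If the running statistics do not match the test-time activations (short training, small batches, or a distribution shift), eval-mode results can degrade badly. Here is a minimal pure-Python sketch of that mechanism (a hypothetical simplified BN without learnable affine parameters, not the actual OccNet code):

```python
# Simplified 1-D BatchNorm sketch (hypothetical, no learnable scale/shift)
# illustrating the train-mode vs eval-mode discrepancy.
class BatchNorm1dSketch:
    def __init__(self, momentum=0.1, eps=1e-5):
        self.momentum = momentum
        self.eps = eps
        self.running_mean = 0.0   # PyTorch-style initialization
        self.running_var = 1.0
        self.training = True

    def __call__(self, batch):
        if self.training:
            # normalize with the current batch's statistics
            n = len(batch)
            mean = sum(batch) / n
            var = sum((x - mean) ** 2 for x in batch) / n
            # update running stats via exponential moving average
            self.running_mean = (1 - self.momentum) * self.running_mean + self.momentum * mean
            self.running_var = (1 - self.momentum) * self.running_var + self.momentum * var
        else:
            # eval mode: use the accumulated running statistics instead
            mean, var = self.running_mean, self.running_var
        return [(x - mean) / (var + self.eps) ** 0.5 for x in batch]

bn = BatchNorm1dSketch()
train_out = bn([10.0, 12.0, 14.0])  # normalized with batch stats: ~[-1.22, 0.0, 1.22]
bn.training = False                 # what model.eval() does to BN layers
eval_out = bn([10.0, 12.0, 14.0])   # normalized with running stats, far from zero-mean
```

After only one update the running stats still sit near their initial values (0, 1), so the same input produces very different eval-mode outputs. This is why symptoms like yours usually point at stale or mismatched running statistics rather than at the network weights themselves.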

hua-wu-que avatar May 16 '20 23:05 hua-wu-que