recurrent-batch-normalization-pytorch

sequence-wise normalization

Open sekigh opened this issue 6 years ago • 2 comments

Hello,

I am looking for PyTorch code for LSTM batch normalization implemented as the sequence-wise normalization in Cooijmans' paper, applicable to inputs with variable time lengths, such as in Penn Treebank. I have your code for s-MNIST/p-MNIST running on my local machine. I would like to extend that setup, ideally by replacing SeparateBatchNorm1d with a new module. Thank you in advance.

Hiroshi

sekigh avatar Jun 25 '18 09:06 sekigh

Sorry for the late reply. As far as I understand, the paper states that an element whose time index is larger than T_max simply uses the population statistics of time T_max. I believe this behaviour is already implemented in the SeparateBatchNorm1d class.
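For reference, a minimal sketch of that behaviour (the class name, single-step interface, and shapes are illustrative assumptions, not the actual SeparateBatchNorm1d API):

```python
import torch
from torch import nn


class PerTimestepBatchNorm(nn.Module):
    """Batch norm with separate running statistics per time step.

    Time indices beyond the tracked range reuse the statistics of the
    last tracked step (T_max), as described above. Note: the paper
    shares the affine parameters across time steps and only separates
    the statistics; for brevity this sketch gives each step its own
    BatchNorm1d, affine parameters included.
    """

    def __init__(self, num_features, max_length, eps=1e-5, momentum=0.1):
        super().__init__()
        self.max_length = max_length
        self.bns = nn.ModuleList(
            [nn.BatchNorm1d(num_features, eps=eps, momentum=momentum)
             for _ in range(max_length)]
        )

    def forward(self, x, time):
        # x: (batch, num_features) at a single time step;
        # time: integer time index, clamped to the last tracked step.
        t = min(time, self.max_length - 1)
        return self.bns[t](x)
```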

jihunchoi avatar Oct 03 '18 12:10 jihunchoi

Thank you for your response, and sorry for my late reply. I just read your article. My application uses speech and employs many speech sequences with different time lengths, so I would prefer sequence-wise batch normalization to frame-wise normalization with T_max expansion. It looks like there is no such script; I understand.
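For anyone looking for a starting point, a minimal sketch of sequence-wise normalization over variable-length inputs (the function name and arguments are assumptions, not an existing script in this repo): statistics are computed jointly over the batch and time dimensions, counting only positions inside each sequence's length, instead of keeping per-timestep statistics.

```python
import torch


def sequence_wise_normalize(x, lengths, eps=1e-5):
    # x: (batch, max_time, num_features); lengths: (batch,) valid lengths.
    max_time = x.size(1)
    # Mask out padded positions so they do not contribute to the statistics.
    mask = (torch.arange(max_time, device=x.device)[None, :]
            < lengths[:, None]).unsqueeze(-1).to(x.dtype)   # (batch, time, 1)
    count = mask.sum(dim=(0, 1))                            # valid positions
    mean = (x * mask).sum(dim=(0, 1)) / count
    var = ((x - mean) ** 2 * mask).sum(dim=(0, 1)) / count
    # Normalize and zero out the padded positions again.
    return (x - mean) / torch.sqrt(var + eps) * mask
```

A trainable version would add learnable affine parameters and running statistics for use at test time, which this sketch omits.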

sekigh avatar Dec 11 '18 10:12 sekigh