PhasedLSTM-Keras
Dropout during inference
Given that "A Theoretically Grounded Application of Dropout in Recurrent Neural Networks" is mentioned in the code, I expected this implementation to allow the use of dropout during inference. With the same Keras, TensorFlow, and Python versions and the same architecture, the default LSTM layer from Keras with training=True works, but the PhasedLSTM layer does not.
I am using: Keras==2.2.5, Tensorflow==1.15, Python==2.7, on Google Colab.
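For reference, this is the pattern I mean by "dropout during inference" (Monte Carlo dropout): passing training=True when calling the layer keeps the dropout masks active at predict time, so repeated forward passes give different outputs. A minimal sketch with the stock LSTM layer, written here against modern tf.keras rather than the exact Keras 2.2.5 / TF 1.15 stack from the issue:

```python
import numpy as np
import tensorflow as tf

inputs = tf.keras.Input(shape=(10, 4))
# dropout/recurrent_dropout follow the Gal & Ghahramani variational scheme;
# training=True at symbolic call time keeps them active during inference too.
x = tf.keras.layers.LSTM(8, dropout=0.5, recurrent_dropout=0.5)(
    inputs, training=True
)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)

data = np.random.rand(2, 10, 4).astype("float32")
# Two forward passes on the same input differ, because dropout stays on.
a = model(data).numpy()
b = model(data).numpy()
print((a != b).any())
```

With the default LSTM this prints True; swapping in PhasedLSTM the same way is where it fails for me.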