keras-layer-normalization-rnn
Training time cost per epoch
@Binbose, firstly, thank you for sharing this project. I tried GRU_LN with layer_to_normalize=("input_gate", "input_recurrent", "recurrent_gate", "recurrent_recurrent") and normalize_seperately=False, but I found that training now takes about twice as long per epoch as before. Is there any solution for this? Looking forward to your reply.
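
For reference, a minimal sketch of roughly how the layer is being used; only the normalization arguments come from the description above, while the import path, unit count, input shape, and dummy data are placeholders rather than the actual code:

```python
# Rough reproduction sketch; only the normalization-related arguments are from
# the issue text, everything else (import path, units, shapes) is a placeholder.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from LayerNormalizationRNN import GRU_LN  # assumed import path

model = Sequential()
model.add(GRU_LN(
    64,                      # placeholder number of units
    input_shape=(None, 32),  # placeholder (timesteps, features)
    layer_to_normalize=("input_gate", "input_recurrent",
                        "recurrent_gate", "recurrent_recurrent"),
    normalize_seperately=False,
))
model.add(Dense(1))
model.compile(optimizer="adam", loss="mse")

# Dummy data just to time a single epoch.
x = np.random.random((256, 20, 32))
y = np.random.random((256, 1))
model.fit(x, y, epochs=1, batch_size=32)
```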