Bug in WeightNormalization for Bidirectional RNN
Code:

tf.keras.layers.Bidirectional(
    tfa.layers.WeightNormalization(
        tf.keras.layers.GRU(units_gru, return_sequences=True)),
    merge_mode='concat')
Problem:

tfa.layers.WeightNormalization does not include a "go_backwards" entry in its config, but tf.keras.layers.Bidirectional expects one when it recreates the wrapped layer for the backward direction, so the line above fails with the following error:
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/layers/wrappers.py", line 422, in __init__
layer, go_backwards=True)
File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/layers/wrappers.py", line 481, in _recreate_layer_from_config
config['go_backwards'] = not config['go_backwards']
KeyError: 'go_backwards'
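
A possible workaround until WeightNormalization forwards go_backwards is to skip Bidirectional and wire the two directions by hand. The sketch below is only an illustration: units_gru, feature_dim, and the Input shape are placeholder values, and it assumes WeightNormalization can wrap a standalone GRU, as the report implies. It runs one weight-normalized GRU forward, runs a second one with go_backwards=True, reverses the backward output along the time axis, and concatenates the two, which mirrors what Bidirectional does for merge_mode='concat' (ignoring mask handling):

import tensorflow as tf
import tensorflow_addons as tfa

units_gru = 64     # placeholder size
feature_dim = 16   # placeholder input feature size

inputs = tf.keras.Input(shape=(None, feature_dim))

# Forward direction: a weight-normalized GRU over the sequence as-is.
fwd = tfa.layers.WeightNormalization(
    tf.keras.layers.GRU(units_gru, return_sequences=True))(inputs)

# Backward direction: go_backwards=True makes the GRU read the sequence in
# reverse, so its outputs also come out reversed in time; flip them back so
# both directions are aligned step-by-step before merging.
bwd = tfa.layers.WeightNormalization(
    tf.keras.layers.GRU(units_gru, return_sequences=True,
                        go_backwards=True))(inputs)
bwd = tf.keras.layers.Lambda(lambda t: tf.reverse(t, axis=[1]))(bwd)

# Feature-axis concatenation, equivalent to merge_mode='concat'.
outputs = tf.keras.layers.Concatenate()([fwd, bwd])
model = tf.keras.Model(inputs, outputs)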