
Bug in WeightNormalization for Bidirectional RNN

Open zzw922cn opened this issue 4 years ago • 0 comments

Code:
tf.keras.layers.Bidirectional(tfa.layers.WeightNormalization(tf.keras.layers.GRU(units_gru, return_sequences=True)), merge_mode='concat')

Problem: WeightNormalization's config contains no "go_backwards" key, but Bidirectional expects that key when it recreates the backward layer from the wrapped layer's config, so the line above fails with the following error:

 File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/layers/wrappers.py", line 422, in __init__
    layer, go_backwards=True)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/layers/wrappers.py", line 481, in _recreate_layer_from_config
    config['go_backwards'] = not config['go_backwards']
KeyError: 'go_backwards'
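The mechanism behind the error can be sketched without TensorFlow installed: Bidirectional rebuilds the backward layer by flipping `go_backwards` in the wrapped layer's config, but a wrapper like WeightNormalization nests the RNN's config under a "layer" key, so "go_backwards" is not top-level. The dicts below are a simplified stand-in for the real `get_config()` output, not the actual Keras structures.

```python
# Simplified stand-in for GRU.get_config(): "go_backwards" is top-level.
gru_config = {"units": 64, "go_backwards": False, "return_sequences": True}

# Simplified stand-in for WeightNormalization.get_config(): the wrapped
# layer's config is nested under "layer", so "go_backwards" is NOT top-level.
wn_config = {"layer": {"class_name": "GRU", "config": gru_config}}

def recreate_backward(config):
    # Mirrors what Bidirectional._recreate_layer_from_config does with
    # go_backwards=True: flip the flag in the layer's config.
    config["go_backwards"] = not config["go_backwards"]
    return config

# Works on the bare GRU config:
recreate_backward(dict(gru_config))

# Fails on the wrapper's config, reproducing the KeyError from the traceback:
try:
    recreate_backward(wn_config)
except KeyError as e:
    print("KeyError:", e)  # KeyError: 'go_backwards'
```

This is why the failure occurs only for the backward layer: the forward layer is recreated without flipping any flag, so the missing key is never touched.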

zzw922cn avatar Jul 29 '21 13:07 zzw922cn