TypeError when using AttentionLayer()
I am getting the following error when calling AttentionLayer() in my Keras model:
TypeError                                 Traceback (most recent call last)
5 frames
/usr/local/lib/python3.7/dist-packages/keras/engine/keras_tensor.py in __array__(self, dtype)
    253   def __array__(self, dtype=None):
    254     raise TypeError(
--> 255         f'You are passing {self}, an intermediate Keras symbolic input/output, '
    256         'to a TF API that does not allow registering custom dispatchers, such '
    257         'as tf.cond, tf.function, gradient tapes, or tf.map_fn. '

TypeError: Exception encountered when calling layer "tf.keras.backend.rnn_6" (type TFOpLambda).

You are passing KerasTensor(type_spec=TensorSpec(shape=(None, 13), dtype=tf.float32, name=None), name='tf.compat.v1.nn.softmax_13/Softmax:0', description="created by layer 'tf.compat.v1.nn.softmax_13'"), an intermediate Keras symbolic input/output, to a TF API that does not allow registering custom dispatchers, such as tf.cond, tf.function, gradient tapes, or tf.map_fn. Keras Functional model construction only supports TF API calls that do support dispatching, such as tf.math.add or tf.reshape. Other APIs cannot be called directly on symbolic Keras inputs/outputs. You can work around this limitation by putting the operation in a custom Keras layer call and calling that layer on this symbolic input/output.

Call arguments received by layer "tf.keras.backend.rnn_6" (type TFOpLambda):
  • step_function=<function AttentionLayer.call.
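For context, the workaround the error message suggests is to move the unsupported TF call (here, tf.keras.backend.rnn / tf.map_fn-style ops) inside the call() of a custom Keras layer, so it only ever runs on concrete graph tensors rather than on symbolic KerasTensor objects. Below is a minimal, hypothetical sketch of that pattern (SquareEach is an illustrative layer, not the actual AttentionLayer from the question):

```python
import tensorflow as tf

class SquareEach(tf.keras.layers.Layer):
    """Illustrative custom layer wrapping a TF API that cannot be
    called directly on symbolic Keras inputs/outputs."""

    def call(self, inputs):
        # Inside call(), `inputs` is a regular graph tensor, so
        # tf.map_fn is allowed. Calling tf.map_fn(tf.square, x)
        # directly on a tf.keras.Input tensor would raise the
        # TypeError shown in the traceback above.
        return tf.map_fn(tf.square, inputs)

inp = tf.keras.Input(shape=(4,))
out = SquareEach()(inp)          # OK: the op runs inside a layer
model = tf.keras.Model(inp, out)
```

Applied to the question, this would mean ensuring that every use of tf.keras.backend.rnn (and any softmax/attention math around it) happens inside AttentionLayer.call() itself, instead of on the layer's symbolic outputs at model-construction time.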