keras-attention

ValueError and TypeError in custom_recurrents.py

Open · 1922353531 opened this issue on Jul 21, 2019 · 0 comments

Traceback (most recent call last):
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\tensorflow\python\framework\op_def_library.py", line 455, in _apply_op_helper
    as_ref=input_arg.is_ref)
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 1211, in internal_convert_n_to_tensor
    ctx=ctx))
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 1146, in internal_convert_to_tensor
    ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\tensorflow\python\framework\constant_op.py", line 229, in _constant_tensor_conversion_function
    return constant(v, dtype=dtype, name=name)
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\tensorflow\python\framework\constant_op.py", line 208, in constant
    value, dtype=dtype, shape=shape, verify_shape=verify_shape))
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\tensorflow\python\framework\tensor_util.py", line 430, in make_tensor_proto
    raise ValueError("None values not supported.")
ValueError: None values not supported.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:/Users/Bruce Rogers/Artificial Intelligence/IMU Developer/conversation module/embedding+BidLstm+attentiondecoder/keras_FunctionAPI_model.py", line 18, in <module>
    attention_decoder_outputs = AttentionDecoder(settings.LSTM_neurons, data_setting.output_vocab_size)(encoder_bid_lstm_outputs)
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\keras\legacy\layers.py", line 513, in __call__
    return super(Recurrent, self).__call__(inputs, **kwargs)
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\keras\engine\base_layer.py", line 457, in __call__
    output = self.call(inputs, **kwargs)
  File "C:\Users\Bruce Rogers\Artificial Intelligence\IMU Developer\conversation module\embedding+BidLstm+attentiondecoder\Attention.py", line 422, in call
    return super(AttentionDecoder, self).call(x)
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\keras\legacy\layers.py", line 590, in call
    input_length=timesteps)
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py", line 2922, in rnn
    outputs, _ = step_function(inputs[0], initial_states + constants)
  File "C:\Users\Bruce Rogers\Artificial Intelligence\IMU Developer\conversation module\embedding+BidLstm+attentiondecoder\Attention.py", line 466, in step
    _stm = K.repeat(stm, self.timesteps)
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py", line 2137, in repeat
    pattern = tf.stack([1, n, 1])
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\tensorflow\python\ops\array_ops.py", line 874, in stack
    return gen_array_ops.pack(values, axis=axis, name=name)
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\tensorflow\python\ops\gen_array_ops.py", line 5856, in pack
    "Pack", values=values, axis=axis, name=name)
  File "C:\Users\Bruce Rogers\Anaconda3\lib\site-packages\tensorflow\python\framework\op_def_library.py", line 483, in _apply_op_helper
    raise TypeError("%s that don't all match." % prefix)
TypeError: Tensors in list passed to 'values' of 'Pack' Op have types [int32, <NOT CONVERTIBLE TO TENSOR>, int32] that don't all match.
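For anyone hitting the same thing: both errors appear to stem from self.timesteps being None when AttentionDecoder.step() runs K.repeat(stm, self.timesteps), since the layer seems to cache its timestep count from the static input shape in build(). If the encoder input is declared with an unspecified sequence length, that dimension is None, and tf.stack([1, None, 1]) inside K.repeat then fails exactly as in the second traceback. The sketch below is an assumed reconstruction of the model in keras_FunctionAPI_model.py, not the reporter's actual code; max_len, vocab_size, LSTM_neurons and the import path for AttentionDecoder are placeholders.

# Hypothetical sketch (assumed names), showing how an undefined time axis
# leads to self.timesteps == None inside the AttentionDecoder layer.
from keras.layers import Input, Embedding, Bidirectional, LSTM
from keras.models import Model
from custom_recurrents import AttentionDecoder  # import path assumed

max_len = 20        # fixed sequence length (placeholder value)
vocab_size = 5000   # placeholder for data_setting.output_vocab_size
LSTM_neurons = 128  # placeholder for settings.LSTM_neurons

# Failing variant (assumed): shape=(None,) leaves the time dimension undefined,
# so the layer records self.timesteps = None and K.repeat(stm, None) later breaks.
# inputs = Input(shape=(None,))

# Working variant: a fixed length gives the decoder a concrete timestep count.
inputs = Input(shape=(max_len,))
x = Embedding(vocab_size, 128)(inputs)
# return_sequences=True is needed so the decoder receives a 3-D tensor.
x = Bidirectional(LSTM(LSTM_neurons, return_sequences=True))(x)
outputs = AttentionDecoder(LSTM_neurons, vocab_size)(x)
model = Model(inputs, outputs)
model.summary()

If the sequence length genuinely has to stay variable, the layer itself would need to recover the timestep count dynamically (e.g. via K.shape(x)[1]) instead of the static value cached in build(); that would be a change to custom_recurrents.py rather than to the calling script.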
