
Bug in functional model

Surya2k1 opened this issue 11 months ago

I think there is a bug in the functional model.

Case 1: inputs == outputs at model construction, but training with different output shapes: training succeeds.

import keras
from keras import layers
import numpy as np

input_1 = layers.Input(shape=(3,))
input_2 = layers.Input(shape=(5,))

model_1 = keras.models.Model([input_1, input_2], [input_1, input_2])
model_1.summary()
model_1.compile(optimizer='adam', metrics=['accuracy', 'accuracy'], loss=['mse'])

# Notice: the targets passed here have different shapes than the model's outputs, yet training still succeeds.
model_1.fit([np.random.normal(size=(10, 3)), np.random.normal(size=(10, 5))],
            [np.random.normal(size=(10, 1)), np.random.normal(size=(10, 2))])

print('Training completed')
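(For reference, the model's declared output shapes can be printed; the snippet below is just an added illustration and is not part of the original report.)

# Added illustration: the model's declared output shapes are (None, 3) and
# (None, 5), while the targets above have trailing shapes (1,) and (2,).
print([tuple(o.shape) for o in model_1.outputs])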

Case 2: Same as Case 1, but with differently mismatched output shapes for training: an error is raised during loss calculation. I would expect a shape-mismatch error during graph execution itself.

# With output shapes different from the ones the model was constructed with,
# an error is raised while calculating the loss.
# Instead, a shape-mismatch error should have been raised during graph execution.
model_1.fit([np.random.normal(size=(10, 3)), np.random.normal(size=(10, 5))],
            [np.random.normal(size=(10, 2)), np.random.normal(size=(10, 4))])
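The kind of early validation I am asking for can be approximated on the user side. The helper below is only a hypothetical sketch (check_target_shapes is not a Keras API); it compares each target's trailing shape against the model's declared output shape and fails before fit() runs at all:

# Hypothetical helper (not part of Keras): fail fast if target shapes do not
# match the model's declared output shapes, instead of failing deep inside
# the loss computation.
import numpy as np

def check_target_shapes(model, targets):
    for i, (out, y) in enumerate(zip(model.outputs, targets)):
        expected = tuple(out.shape[1:])          # drop the batch dimension
        actual = tuple(np.asarray(y).shape[1:])
        if expected != actual:
            raise ValueError(
                f"Output {i}: model declares shape {expected}, "
                f"but the target has shape {actual}."
            )

# check_target_shapes(model_1, [np.random.normal(size=(10, 2)),
#                               np.random.normal(size=(10, 4))])  # raises ValueError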

Case 3: With unconnected inputs and outputs

input_1 = layers.Input(shape=(3,))
input_2 = layers.Input(shape=(5,))

input_3 = layers.Input(shape=(1,))
input_4 = layers.Input(shape=(2,))

model_2 = keras.models.Model([input_1, input_2], [input_3, input_4])
model_2.compile(optimizer='adam', metrics=['accuracy', 'accuracy'], loss=['mse'])

# Passing correctly shaped inputs and outputs still fails because the outputs
# are not connected to the inputs.
model_2.fit([np.random.normal(size=(10, 3)), np.random.normal(size=(10, 5))],
            [np.random.normal(size=(10, 1)), np.random.normal(size=(10, 2))])

The error below is technically correct, but it is not useful for end users. Instead, an error should have been raised during graph construction.

177         output_tensors = []
    178         for x in self.outputs:
--> 179             output_tensors.append(tensor_dict[id(x)])
    180 
    181         return tree.pack_sequence_as(self._outputs_struct, output_tensors)

KeyError: "Exception encountered when calling Functional.call().\n\n\x1b[1m139941182292272\x1b[0m\n\nArguments received by Functional.call():\n  • inputs=('tf.Tensor(shape=(None, 3), dtype=float32)', 'tf.Tensor(shape=(None, 5), dtype=float32)')\n  • training=True\n  • mask=('None', 'None')"

I tried to fix an issue similar to Case 3 by raising an error during graph build itself in PR #20705, where I noticed the Case 1 issue (from a failing test case). Please refer to the gist.

Surya2k1, Jan 03 '25 16:01