
[BUG] Chapter 17 - calculating latent loss generates ValueError: A KerasTensor cannot be used as input to a TensorFlow function.

Open NotFenir opened this issue 1 year ago • 2 comments

Hi everyone.

I've tried to run the variational autoencoder code, but it raises the error in the title:

ValueError: A KerasTensor cannot be used as input to a TensorFlow function.

Here is the code I'm trying to run (it's from the book):

class Sampling(tf.keras.layers.Layer):
    def call(self, inputs):
        mean, log_var = inputs
        return tf.random.normal(tf.shape(log_var)) * tf.exp(log_var / 2) + mean

tf.random.set_seed(42)  # extra code – ensures reproducibility on CPU

codings_size = 10

inputs = tf.keras.layers.Input(shape=[28, 28])
Z = tf.keras.layers.Flatten()(inputs)
Z = tf.keras.layers.Dense(150, activation="relu")(Z)
Z = tf.keras.layers.Dense(100, activation="relu")(Z)
codings_mean = tf.keras.layers.Dense(codings_size)(Z)  # μ
codings_log_var = tf.keras.layers.Dense(codings_size)(Z)  # γ
codings = Sampling()([codings_mean, codings_log_var])
variational_encoder = tf.keras.Model(
    inputs=[inputs], outputs=[codings_mean, codings_log_var, codings])

decoder_inputs = tf.keras.layers.Input(shape=[codings_size])
x = tf.keras.layers.Dense(100, activation="relu")(decoder_inputs)
x = tf.keras.layers.Dense(150, activation="relu")(x)
x = tf.keras.layers.Dense(28 * 28)(x)
outputs = tf.keras.layers.Reshape([28, 28])(x)
variational_decoder = tf.keras.Model(inputs=[decoder_inputs], outputs=[outputs])

_, _, codings = variational_encoder(inputs)
reconstructions = variational_decoder(codings)
variational_ae = tf.keras.Model(inputs=[inputs], outputs=[reconstructions])

latent_loss = -0.5 * tf.reduce_sum(
    1 + codings_log_var - tf.exp(codings_log_var) - tf.square(codings_mean),
    axis=-1)
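Side note on the formula itself: the latent loss above is the closed-form KL divergence between the encoder's Gaussian N(μ, exp(γ)) and the standard normal prior N(0, I). Here is a quick NumPy sanity check of that closed form against a Monte Carlo estimate — a standalone sketch, independent of Keras, with arbitrary example values:

```python
import numpy as np

def latent_loss(mean, log_var):
    # Closed-form KL divergence between N(mean, exp(log_var)) and N(0, I),
    # summed over the coding dimensions -- the same formula as in the code above.
    return -0.5 * np.sum(1 + log_var - np.exp(log_var) - np.square(mean), axis=-1)

def kl_monte_carlo(mean, log_var, n=200_000, seed=0):
    # Monte Carlo estimate of the same KL divergence, E_q[log q(z) - log p(z)].
    rng = np.random.default_rng(seed)
    std = np.exp(log_var / 2)
    z = rng.normal(mean, std, size=(n, mean.shape[-1]))
    log_q = -0.5 * (np.log(2 * np.pi) + log_var + ((z - mean) / std) ** 2)
    log_p = -0.5 * (np.log(2 * np.pi) + z ** 2)
    return np.mean(np.sum(log_q - log_p, axis=-1))

mean = np.array([0.5, -1.0, 0.0])
log_var = np.array([0.2, -0.3, 0.1])
print(latent_loss(mean, log_var))     # ~0.66
print(kl_monte_carlo(mean, log_var))  # close to the closed form
```

Both values agree, which confirms the sign conventions in the book's expression (note the loss is 0 when μ = 0 and γ = 0, i.e. when the codings already match the prior).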

My IDE suggests wrapping this in a layer, and I've tried that, but it still doesn't work. Here is my attempt at the wrapper:

class LatentLoss(tf.keras.layers.Layer):
    def call(self, inputs):
        codings_mean, codings_log_var = inputs
        return -0.5 * tf.reduce_sum(
            1 + codings_log_var - tf.exp(codings_log_var) - tf.square(codings_mean),
            axis=-1
        )

NotFenir avatar Nov 26 '24 14:11 NotFenir

Okay, I see now that the code from the book is for Keras 2.x, my bad. But I'm very interested in how it would work with Keras 3.x.

NotFenir avatar Nov 26 '24 15:11 NotFenir

Try this:

class LatentLossRegularizer(keras.layers.Layer):
    def call(self, X):
        codings_mean, codings_log_var = X
        latent_loss = -0.5 * tf.reduce_sum(
            1 + codings_log_var - tf.exp(codings_log_var) - tf.square(codings_mean),
            axis=-1)
        self.add_loss(tf.reduce_mean(latent_loss) / 784.)  # scaled by 28 * 28 pixels
        return X

then add this layer into the model:

#codings = Sampling()([codings_mean, codings_log_var])
m_v = LatentLossRegularizer()([codings_mean, codings_log_var])
codings = Sampling()(m_v)

If you also get this warning when running fit():

UserWarning: The structure of `inputs` doesn't match the expected structure.

... then you'll need to modify a few more lines so the dimensionality of the inputs is specified correctly. Check carefully whether inputs are given as `inputs=inputs` or `inputs=[inputs]`. In particular, if you don't correct the line:

variational_ae = tf.keras.Model(inputs=[inputs], outputs=[reconstructions])

then your model outputs will have an extra dimension and plot_reconstructions(variational_ae) won't work.

For Keras version 3.9.2, the following code runs without warnings:

inputs = tf.keras.layers.Input(shape=[28, 28])
Z = tf.keras.layers.Flatten()(inputs)
Z = tf.keras.layers.Dense(150, activation="relu")(Z)
Z = tf.keras.layers.Dense(100, activation="relu")(Z)
codings_mean = tf.keras.layers.Dense(codings_size, name="mean")(Z)  # μ
codings_log_var = tf.keras.layers.Dense(codings_size, name="var")(Z)  # γ
m_v = LatentLossRegularizer()([codings_mean, codings_log_var])
codings = Sampling()(m_v)
variational_encoder = tf.keras.Model(inputs=inputs, outputs=[codings_mean, codings_log_var, codings])

decoder_inputs = tf.keras.layers.Input(shape=[codings_size])
x = tf.keras.layers.Dense(100, activation="relu")(decoder_inputs)
x = tf.keras.layers.Dense(150, activation="relu")(x)
x = tf.keras.layers.Dense(28 * 28)(x)
outputs = tf.keras.layers.Reshape([28, 28])(x)
variational_decoder = tf.keras.Model(inputs=decoder_inputs, outputs=outputs)

_, _, codings = variational_encoder(inputs)
reconstructions = variational_decoder(codings)
variational_ae = tf.keras.Model(inputs=inputs, outputs=reconstructions)
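For completeness, here is the fix above assembled into one self-contained script that builds, compiles, and trains the model. This is a sketch: random arrays stand in for the Fashion MNIST data the book uses, and `loss="mse"` with `optimizer="nadam"` follows the book's compile step.

```python
import numpy as np
import tensorflow as tf

class Sampling(tf.keras.layers.Layer):
    def call(self, inputs):
        mean, log_var = inputs
        # Reparameterization trick: sample codings from N(mean, exp(log_var)).
        return tf.random.normal(tf.shape(log_var)) * tf.exp(log_var / 2) + mean

class LatentLossRegularizer(tf.keras.layers.Layer):
    def call(self, X):
        codings_mean, codings_log_var = X
        # KL divergence between the coding distribution and the N(0, I) prior.
        kl = -0.5 * tf.reduce_sum(
            1 + codings_log_var - tf.exp(codings_log_var)
            - tf.square(codings_mean), axis=-1)
        self.add_loss(tf.reduce_mean(kl) / 784.)  # scaled by 28 * 28 pixels
        return X

codings_size = 10

# Encoder
inputs = tf.keras.layers.Input(shape=[28, 28])
Z = tf.keras.layers.Flatten()(inputs)
Z = tf.keras.layers.Dense(150, activation="relu")(Z)
Z = tf.keras.layers.Dense(100, activation="relu")(Z)
codings_mean = tf.keras.layers.Dense(codings_size, name="mean")(Z)
codings_log_var = tf.keras.layers.Dense(codings_size, name="var")(Z)
m_v = LatentLossRegularizer()([codings_mean, codings_log_var])
codings = Sampling()(m_v)
variational_encoder = tf.keras.Model(
    inputs=inputs, outputs=[codings_mean, codings_log_var, codings])

# Decoder
decoder_inputs = tf.keras.layers.Input(shape=[codings_size])
x = tf.keras.layers.Dense(100, activation="relu")(decoder_inputs)
x = tf.keras.layers.Dense(150, activation="relu")(x)
x = tf.keras.layers.Dense(28 * 28)(x)
outputs = tf.keras.layers.Reshape([28, 28])(x)
variational_decoder = tf.keras.Model(inputs=decoder_inputs, outputs=outputs)

# Full VAE
_, _, codings = variational_encoder(inputs)
reconstructions = variational_decoder(codings)
variational_ae = tf.keras.Model(inputs=inputs, outputs=reconstructions)

variational_ae.compile(loss="mse", optimizer="nadam")
X_fake = np.random.rand(64, 28, 28).astype("float32")  # stand-in for Fashion MNIST
history = variational_ae.fit(X_fake, X_fake, epochs=1, batch_size=32, verbose=0)
```

With the real data you would pass `X_train, X_train` (and a validation set) to `fit()` instead of `X_fake`; the one-epoch run on random data is only there to show that the model builds and trains without the KerasTensor error.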

mausmaux avatar May 07 '25 23:05 mausmaux

Thanks for opening this issue. I've updated notebooks 16 to 19 to use Keras 2 instead of 3; the problem should be fixed. Please reopen if you still see an issue.

ageron avatar Oct 14 '25 01:10 ageron