remat: The custom_gradient decorator currently supports keywords arguments only when eager execution is enabled.
This error occurs while using remat (RematScope). Reproduction:
import keras
from keras import layers
import tensorflow as tf
import numpy as np
from keras import RematScope

def with_remat(mode):
    with RematScope(mode=mode):
        # build DenseNet121 backbone + classification head inside the remat scope
        base_model = keras.applications.DenseNet121(
            weights='imagenet',
            input_shape=(224, 224, 3),
            include_top=False
        )
        inputs = keras.Input(shape=(224, 224, 3))
        x = base_model(inputs)
        x = keras.layers.GlobalAveragePooling2D()(x)
        outputs = keras.layers.Dense(10, activation='softmax')(x)
        custom_model = keras.Model(inputs, outputs)

        # compile with optimizer, loss and metrics
        custom_model.compile(
            optimizer=keras.optimizers.Adam(),
            loss=keras.losses.CategoricalCrossentropy(),
            metrics=[
                keras.metrics.TopKCategoricalAccuracy(k=3, name='acc_top3'),
                keras.metrics.TopKCategoricalAccuracy(k=1, name='acc_top1')
            ]
        )

        # data: MNIST subset resized to 224x224 RGB
        (x_train, y_train), (_, _) = keras.datasets.mnist.load_data()
        x_train, y_train = x_train[:5000], y_train[:5000]
        x_train = np.expand_dims(x_train, axis=-1)
        x_train = np.repeat(x_train, 3, axis=-1)
        x_train = x_train.astype('float32') / 255
        x_train = tf.image.resize(x_train, [224, 224])
        y_train = tf.one_hot(y_train, depth=10)

        custom_model.fit(x_train, y_train, batch_size=6, epochs=10, verbose=1)
with_remat(mode='full')
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-10-839581e84b79> in <cell line: 0>()
----> 1 with_remat(mode='full')
4 frames
/usr/local/lib/python3.11/dist-packages/keras/src/utils/traceback_utils.py in error_handler(*args, **kwargs)
122 raise e.with_traceback(filtered_tb) from None
123 finally:
--> 124 del filtered_tb
125
126 return error_handler
ValueError: Exception encountered when calling Functional.call().
The custom_gradient decorator currently supports keywords arguments only when eager execution is enabled.
Arguments received by Functional.call():
• inputs=tf.Tensor(shape=(None, 224, 224, 3), dtype=float32)
• training=True
• mask=None
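For context, the error is raised by tf.custom_gradient, which the TensorFlow backend presumably uses to implement remat: keyword arguments such as training and mask are only accepted in eager mode, while fit traces Functional.call in graph mode. A minimal diagnostic sketch, assuming the custom_model, x_train and y_train from the reproduction above are available, is to force eager execution; this only confirms the graph-vs-eager distinction, it is not a recommended fix and has not been verified in this thread:

# Diagnostic sketch only (assumes custom_model, x_train, y_train from the
# reproduction above): run eagerly so custom_gradient accepts kwargs.
custom_model.compile(
    optimizer=keras.optimizers.Adam(),
    loss=keras.losses.CategoricalCrossentropy(),
    run_eagerly=True,  # eager execution; much slower, for debugging only
)
custom_model.fit(x_train, y_train, batch_size=6, epochs=1, verbose=1)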
- What is the future of the keras.applications API?
- Does RematScope also work in a data-parallel or model-parallel setup?
Hi @innat -
I have reproduced this issue with the latest version of Keras (3.9.2). I have tried the other modes as well, and it seems that the issue arises with the full and larger_than modes. Other modes like activations and list_of_layers work fine. I am attaching a gist for your reference. Thanks!
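For anyone skimming, here is a minimal sketch of the reported behaviour of the modes; the small Functional model and layer names below are placeholders, not taken from the gist:

# Minimal sketch (placeholder model, not the DenseNet121 reproduction).
import keras
from keras import RematScope

def build_model(mode, layer_names=None):
    with RematScope(mode=mode, layer_names=layer_names):
        inputs = keras.Input(shape=(32,))
        x = keras.layers.Dense(64, activation="relu", name="dense_hidden")(inputs)
        outputs = keras.layers.Dense(10, activation="softmax", name="dense_out")(x)
        model = keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="categorical_crossentropy")
    return model

build_model(mode="activations")                                   # reported to work
build_model(mode="list_of_layers", layer_names=["dense_hidden"])  # reported to work
# mode="full" and mode="larger_than" reportedly hit the custom_gradient
# error once the resulting model is called/trained (see traceback above).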
@sonali-kumari1 Thanks for your response.
"full": Apply rematerialization globally to all supported operations.
What are the supported operations here?
"list_of_layers": Apply rematerialization to a specific list of layer names.
In the gist you attached, you simply passed list_of_layers as the mode but never specified any target layers. Why and how does the code work?
Hi @innat -
"full": Apply rematerialization globally to all supported operations. what are the supported operations here?
The documentation and source code do not seem to mention the supported operations.
"list_of_layers": Apply rematerialization to a specific list of layer names. In the gist you attached, you passed simply list_of_layers as mode but never specified any target layers, why and how the code works?
In my previous gist, I had not specified any layers for the list_of_layers mode, so it was likely not applying rematerialization to any layers. I am attaching an updated gist where I use list_of_layers as the mode and pass the layer names via layer_names, like this:
with RematScope(mode=mode, layer_names=["dense_1"]):
@sonali-kumari1 Thanks for the update.
- For built-in models like keras.applications.DenseNet121, I think mode=full is more appropriate. mode=list_of_layers with layer_names=["dense_1"] is very specific, because the DenseNet model may not have any dense layer named dense_1.
- With remat, I think the training time would increase compared to vanilla training.
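If list_of_layers is used with a built-in model, the target names have to come from the model itself; a small sketch for inspecting candidate layer names (DenseNet121 as above, with weights=None here only to skip the ImageNet download):

# Sketch: list DenseNet121 layer names to pick valid layer_names targets.
import keras

base_model = keras.applications.DenseNet121(
    weights=None, input_shape=(224, 224, 3), include_top=False
)
conv_names = [layer.name for layer in base_model.layers if "conv" in layer.name]
print(conv_names[:10])  # e.g. rematerialize a few convolution blocks by name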
What is the future of the keras.applications API?
KerasHub, generally speaking, but keras.applications should continue to work indefinitely! Not sure about this bug in particular; assigning to @divyashreepathihalli, who wrote remat.
@divyashreepathihalli A gentle reminder.