pedrofrodenas
I think I found a solution: `GlobalAveragePooling` has a `keepdims` option, so you can avoid using Lambda layers. Here is the edited `convert_global_avg_pool` function: ``` def convert_global_avg_pool(node, params,...
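A minimal NumPy sketch of why `keepdims` removes the need for a Lambda/Reshape: by default global average pooling collapses the spatial axes, while `keepdims=True` keeps them as size-1 dimensions, matching the 4-D output of ONNX `GlobalAveragePool` (the shapes shown assume an NHWC layout):

```python
import numpy as np

# NHWC feature map: batch=2, 4x4 spatial grid, 3 channels
x = np.random.rand(2, 4, 4, 3)

# Default global average pooling: spatial axes are collapsed -> (2, 3)
flat = x.mean(axis=(1, 2))

# With keepdims the spatial axes survive as size-1 dims -> (2, 1, 1, 3),
# which matches the ONNX GlobalAveragePool output shape, so no extra
# Lambda/Reshape layer is needed after the pooling layer.
kept = x.mean(axis=(1, 2), keepdims=True)

print(flat.shape, kept.shape)  # (2, 3) (2, 1, 1, 3)
```

In Keras this corresponds to constructing the layer as `GlobalAveragePooling2D(keepdims=True)` instead of wrapping a reshape in a `Lambda`.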
In my case I had a problem with the Clip layer. I realized that clipping (with a minimum of 0) is equivalent to applying a ReLU activation, so I replaced in operation_layers.py, at line 37, the...
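A small NumPy check of that equivalence (note it only holds when Clip's lower bound is 0; an upper bound can be covered too, since Keras's `ReLU` layer accepts a `max_value` argument):

```python
import numpy as np

x = np.array([-3.0, -0.5, 0.0, 2.0, 7.5])

# ONNX Clip(min=0, max=6) ...
clipped = np.clip(x, 0.0, 6.0)

# ... computes the same thing as a capped ReLU, i.e. min(max(x, 0), 6),
# which is what keras.layers.ReLU(max_value=6.0) applies elementwise.
relu6 = np.minimum(np.maximum(x, 0.0), 6.0)

assert np.array_equal(clipped, relu6)
print(clipped)  # [0.  0.  0.  2.  6. ]
```

So an ONNX Clip with `min=0` can be mapped to `keras.layers.ReLU()`, and `min=0, max=m` to `keras.layers.ReLU(max_value=m)`; a Clip with a nonzero lower bound has no such direct replacement.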
Yes, indeed `keras.src.engine.keras_tensor` was moved, so the import becomes `from keras.src.backend import KerasTensor`. I already tried this, but other errors arise during model conversion because `keras.backend.placeholder` was also removed in the newer...
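One way to cope with private paths that move between Keras releases is a small fallback-import helper. This is a hypothetical sketch (`import_first` is not part of the converter), demonstrated here with stdlib modules so it runs anywhere; the commented-out usage shows how it could bridge the `KerasTensor` move:

```python
import importlib

def import_first(*candidates):
    """Return the first resolvable 'module:attr' path from the candidates.

    Hypothetical helper for code that must work across Keras versions
    where private module paths have moved between releases.
    """
    for path in candidates:
        mod_name, _, attr = path.partition(":")
        try:
            mod = importlib.import_module(mod_name)
            return getattr(mod, attr) if attr else mod
        except (ImportError, AttributeError):
            continue
    raise ImportError(f"none of {candidates} could be imported")

# Possible use for the move described above (newer path tried first):
# KerasTensor = import_first(
#     "keras.src.backend:KerasTensor",
#     "keras.src.engine.keras_tensor:KerasTensor",
# )

# Stdlib demonstration: the first path fails, the second resolves.
sqrt = import_first("no_such_module:x", "math:sqrt")
print(sqrt(9.0))  # 3.0
```

This only papers over moved symbols; APIs that were removed outright, like `keras.backend.placeholder`, still need a real replacement in the converter.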