
Incorrect input reordering inside `ModelTransformer._get_layers` during pattern matching

Open virtualphoton opened this issue 8 months ago • 2 comments

ModelTransformer._match_layer_with_inputs calls self._get_layers(input_layer_names). The order of input_layer_names is significant, i.e. the result of _get_layers must return tensors in the same order as input_layer_names. The current implementation is:

  def _get_layers(self, layer_names):
    return [
        layer for layer in self._config['layers']
        if layer['config']['name'] in layer_names
    ]
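The problem can be demonstrated without TensorFlow. A minimal sketch, assuming a simplified stand-in for the `_config['layers']` structure (only the `name` field is modeled here):

```python
# Stand-in for ModelTransformer's layer config list, in model
# declaration order: a, b, c.
config_layers = [
    {'config': {'name': 'a'}},
    {'config': {'name': 'b'}},
    {'config': {'name': 'c'}},
]

def get_layers_buggy(layer_names):
    # Mirrors the current implementation: a membership filter over
    # config_layers, so the result follows declaration order,
    # not the order of layer_names.
    return [
        layer for layer in config_layers
        if layer['config']['name'] in layer_names
    ]

# Requesting ['c', 'b'] still returns b before c.
print([l['config']['name'] for l in get_layers_buggy(['c', 'b'])])
# → ['b', 'c']
```

The membership filter discards the requested ordering entirely, which is harmless for commutative consumers but wrong for order-sensitive ones like concatenation.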

That is, when the first input is declared later than the second one, the result has the wrong order. A simple model to reproduce the bug:

import tf_keras as K
import tf_keras.layers as L

a = K.Input(10)
b = L.Dense(10)(a)  # declared before input `c`
c = K.Input(20)     # second model input, declared after `b`
m = K.Model([a, c], L.concatenate([c, b], -1))

Then quantize_model(m) yields an incorrect input order for the concatenation operation.

My suggestion would be to replace it with something like:

  def _get_layers(self, layer_names):
    name_to_layer = {layer['config']['name']: layer for layer in self._config['layers']}
    return [name_to_layer[name] for name in layer_names]

which preserves the order of layer_names.
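Using the same simplified config stand-in as above (only the `name` field is modeled), a sketch of the proposed fix shows that indexing through a name-to-layer dict returns layers in the requested order:

```python
# Stand-in for ModelTransformer's layer config list, in model
# declaration order: a, b, c.
config_layers = [
    {'config': {'name': 'a'}},
    {'config': {'name': 'b'}},
    {'config': {'name': 'c'}},
]

def get_layers_fixed(layer_names):
    # Build a name -> layer lookup once, then index in the
    # requested order, preserving the order of layer_names.
    name_to_layer = {layer['config']['name']: layer for layer in config_layers}
    return [name_to_layer[name] for name in layer_names]

print([l['config']['name'] for l in get_layers_fixed(['c', 'b'])])
# → ['c', 'b']
```

A side effect of the dict-based version is that a name missing from the config raises KeyError instead of being silently dropped, which arguably surfaces pattern-matching bugs earlier.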

This also seems to be the problem behind #1061

virtualphoton avatar Apr 22 '25 19:04 virtualphoton

Hey! Is this issue fairly beginner-friendly?

arvish avatar Apr 28 '25 20:04 arvish

> Hey! Is this issue fairly beginner-friendly?

Yes. I'm using the patch from "My suggestion would be to replace it with something like: ..." above, and it seems to work fine.

virtualphoton avatar May 12 '25 13:05 virtualphoton