Description
ModelTransformer._match_layer_with_inputs calls self._get_layers(input_layer_names). The order of input_layer_names is significant, i.e. _get_layers must return the layers in the same order as the names in input_layer_names.
Current implementation is:
def _get_layers(self, layer_names):
  return [
      layer for layer in self._config['layers']
      if layer['config']['name'] in layer_names
  ]
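To see the ordering issue in isolation, here is a minimal sketch with a made-up config (the layer names below are hypothetical, not the real transformer state): the comprehension filters self._config['layers'] in declaration order and ignores the order of layer_names.
layers = [
    {'config': {'name': 'input_a'}},
    {'config': {'name': 'dense'}},
    {'config': {'name': 'input_c'}},
]

def get_layers(layer_names):
  # Mimics the current _get_layers: keeps declaration order, not request order.
  return [layer for layer in layers if layer['config']['name'] in layer_names]

print(get_layers(['input_c', 'dense']))
# [{'config': {'name': 'dense'}}, {'config': {'name': 'input_c'}}]
# 'dense' comes first even though 'input_c' was requested first.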
That is, when the first input layer is declared later than the second one, the result comes back in the wrong order. A simple model that reproduces the bug:
import tf_keras as K
import tf_keras.layers as L
a = K.Input(10)
b = L.Dense(10)(a)
c = K.Input(20)
m = K.Model([a, c], L.concatenate([c, b], -1))
Then quantize_model(m) yields an incorrect input order for the concatenation operation.
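One rough way to observe the reordering (a hypothetical inspection; the exact layer names in the quantized model depend on the tfmot version) is to look at the inbound nodes of the concatenation layer in the quantized model's config:
import tensorflow_model_optimization as tfmot

q = tfmot.quantization.keras.quantize_model(m)
for layer in q.get_config()['layers']:
  if 'concatenate' in layer['name']:
    # With the bug, the inbound nodes list the Dense output before input c,
    # i.e. declaration order instead of the original call order [c, b].
    print(layer['name'], layer['inbound_nodes'])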
My suggestion would be to replace it with something like:
def _get_layers(self, layer_names):
  name_to_layer = {layer['config']['name']: layer for layer in self._config['layers']}
  return [name_to_layer[name] for name in layer_names]
which preserves the order of layer_names.
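On the same made-up config as in the sketch above, the dict-based lookup returns the layers in the requested order:
name_to_layer = {layer['config']['name']: layer for layer in layers}
print([name_to_layer[name] for name in ['input_c', 'dense']])
# [{'config': {'name': 'input_c'}}, {'config': {'name': 'dense'}}]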
This also seems to be the problem behind #1061.