Keras-inference-time-optimizer
Error with retinanet model
Hi, @ZFTurbo! I tried to convert a RetinaNet model with a ResNet50 backbone and it failed. I used https://github.com/fizyr/keras-retinanet.
File "/usr/local/lib/python3.6/dist-packages/kito/__init__.py", line 330, in reduce_keras_model
new_layer = clone_model(layer)
File "/usr/local/lib/python3.6/dist-packages/keras/models.py", line 251, in clone_model
return _clone_functional_model(model, input_tensors=input_tensors)
File "/usr/local/lib/python3.6/dist-packages/keras/models.py", line 106, in _clone_functional_model
new_layer = layer.__class__.from_config(layer.get_config())
File "/usr/local/lib/python3.6/dist-packages/keras/engine/base_layer.py", line 1109, in from_config
return cls(**config)
File "/usr/local/lib/python3.6/dist-packages/keras/legacy/interfaces.py", line 91, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.6/dist-packages/keras/layers/convolutional.py", line 490, in __init__
**kwargs)
File "/usr/local/lib/python3.6/dist-packages/keras/layers/convolutional.py", line 118, in __init__
self.bias_initializer = initializers.get(bias_initializer)
File "/usr/local/lib/python3.6/dist-packages/keras/initializers.py", line 508, in get
return deserialize(identifier)
File "/usr/local/lib/python3.6/dist-packages/keras/initializers.py", line 503, in deserialize
printable_module_name='initializer')
File "/usr/local/lib/python3.6/dist-packages/keras/utils/generic_utils.py", line 138, in deserialize_keras_object
': ' + class_name)
ValueError: Unknown initializer: PriorProbability
I would be grateful for any help.
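For context, a minimal self-contained sketch (not from the original report) of this failure mode, assuming Keras 2.x and a hypothetical stand-in initializer named MyPrior: reduce_keras_model clones layers via clone_model, which rebuilds each layer from its config, so any custom object such as PriorProbability must be registered (e.g. with custom_object_scope), otherwise deserialization fails with "Unknown initializer".

import numpy as np
from keras.models import Sequential, clone_model
from keras.layers import Dense
from keras.initializers import Initializer
from keras.utils import custom_object_scope

class MyPrior(Initializer):
    """Stand-in for a custom initializer such as keras_retinanet's PriorProbability."""
    def __call__(self, shape, dtype=None):
        return np.full(shape, 0.01)

    def get_config(self):
        return {}

model = Sequential([Dense(4, input_shape=(8,), bias_initializer=MyPrior())])

try:
    clone_model(model)  # fails: Keras does not know how to deserialize MyPrior
except ValueError as e:
    print(e)            # "Unknown initializer: MyPrior"

with custom_object_scope({'MyPrior': MyPrior}):
    cloned = clone_model(model)  # works once the custom object is registered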
Hi @ZFTurbo, great work here. I can see you have contributed to fizyr's keras-retinanet repo as well and have tried to support it here, but I am still facing the same issue as above. Any workaround for this?
@tonmoyborah
The current code supports RetinaNet. Here is an example; I will add it to test_bench later:
def get_RetinaNet_model():
    from keras.models import load_model
    from keras.utils import custom_object_scope
    from keras_resnet.layers import BatchNormalization
    from keras_retinanet.layers import UpsampleLike, Anchors, RegressBoxes, ClipBoxes, FilterDetections
    from keras_retinanet.initializers import PriorProbability

    custom_objects = {
        'BatchNormalization': BatchNormalization,
        'UpsampleLike': UpsampleLike,
        'Anchors': Anchors,
        'RegressBoxes': RegressBoxes,
        'PriorProbability': PriorProbability,
        'ClipBoxes': ClipBoxes,
        'FilterDetections': FilterDetections,
    }
    with custom_object_scope(custom_objects):
        model = load_model("../retinanet_resnet50_500_classes_0.4594_converted.h5")
    return model, custom_objects
from keras.utils import custom_object_scope
from kito import reduce_keras_model

model, custom_objects = get_RetinaNet_model()
with custom_object_scope(custom_objects):
    model_reduced = reduce_keras_model(model)
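For background, the speedup from reduce_keras_model comes mainly from folding each BatchNormalization layer into the preceding convolution, so the BN op disappears at inference time. A simplified NumPy sketch of that folding (an illustration of the idea, not kito's actual code):

import numpy as np

def fold_bn_into_conv(kernel, bias, gamma, beta, mean, var, eps=1e-3):
    """Fold BatchNormalization statistics into a Conv2D kernel/bias.

    kernel has shape (kh, kw, c_in, c_out); gamma, beta, mean, var and bias
    are vectors of length c_out; eps matches Keras' default BN epsilon.
    """
    scale = gamma / np.sqrt(var + eps)
    fused_kernel = kernel * scale             # broadcasts over the output-channel axis
    fused_bias = (bias - mean) * scale + beta
    return fused_kernel, fused_bias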
I'll try this. I made it work by changing the deserialize function in the Keras code, but the resulting model didn't provide any speedup. Models other than RetinaNet are showing huge improvements.
I observed the same behaviour for RetinaNet: although many BN layers were removed, inference speed stays the same.
P.S. Added RetinaNet to test_bench.
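A quick (hypothetical) way to check both observations, the number of removed BN layers and the inference time, assuming `model` and `model_reduced` from the snippet above and a 500x500 RGB input:

import time
import numpy as np

def count_bn(m):
    # Covers both keras.layers and keras_resnet.layers BatchNormalization.
    return sum(1 for l in m.layers if l.__class__.__name__.endswith('BatchNormalization'))

print('BN layers before/after:', count_bn(model), count_bn(model_reduced))

x = np.random.rand(1, 500, 500, 3).astype(np.float32)  # assumed input size
for m in (model, model_reduced):
    m.predict(x)  # warm-up
    start = time.time()
    for _ in range(10):
        m.predict(x)
    print(m.name, '%.3f sec per image' % ((time.time() - start) / 10))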