
Error with retinanet model

Maxfashko opened this issue · 4 comments

Hi, @ZFTurbo! I tried to convert a retinanet model with a resnet50 backbone and it failed. I used https://github.com/fizyr/keras-retinanet.

  File "/usr/local/lib/python3.6/dist-packages/kito/__init__.py", line 330, in reduce_keras_model
    new_layer = clone_model(layer)
  File "/usr/local/lib/python3.6/dist-packages/keras/models.py", line 251, in clone_model
    return _clone_functional_model(model, input_tensors=input_tensors)
  File "/usr/local/lib/python3.6/dist-packages/keras/models.py", line 106, in _clone_functional_model
    new_layer = layer.__class__.from_config(layer.get_config())
  File "/usr/local/lib/python3.6/dist-packages/keras/engine/base_layer.py", line 1109, in from_config
    return cls(**config)
  File "/usr/local/lib/python3.6/dist-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/keras/layers/convolutional.py", line 490, in __init__
    **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/keras/layers/convolutional.py", line 118, in __init__
    self.bias_initializer = initializers.get(bias_initializer)
  File "/usr/local/lib/python3.6/dist-packages/keras/initializers.py", line 508, in get
    return deserialize(identifier)
  File "/usr/local/lib/python3.6/dist-packages/keras/initializers.py", line 503, in deserialize
    printable_module_name='initializer')
  File "/usr/local/lib/python3.6/dist-packages/keras/utils/generic_utils.py", line 138, in deserialize_keras_object
    ': ' + class_name)
ValueError: Unknown initializer: PriorProbability

I would be grateful for any help.
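(For context: `clone_model` rebuilds each layer from its config, and Keras resolves class names through a registry of known objects, so an unregistered custom class such as `PriorProbability` raises exactly this ValueError. Below is a toy, Keras-free model of that lookup mechanism, not Keras's actual code; the stand-in classes are illustrative only:)

```python
# Minimal analogue of Keras's deserialize_keras_object: layers and
# initializers are rebuilt from config by looking their class name up
# in a registry. An unregistered custom class fails with the same kind
# of error as in the traceback above.

_GLOBAL_OBJECTS = {}  # stand-in for Keras's built-in registry

def deserialize(class_name, custom_objects=None):
    registry = dict(_GLOBAL_OBJECTS)
    registry.update(custom_objects or {})
    if class_name not in registry:
        raise ValueError('Unknown initializer: ' + class_name)
    return registry[class_name]

class PriorProbability:  # stand-in for keras_retinanet.initializers.PriorProbability
    pass

# Fails without registration, succeeds once the custom object is supplied:
# deserialize('PriorProbability')                                     -> ValueError
# deserialize('PriorProbability', {'PriorProbability': PriorProbability}) -> the class
```

This is why wrapping the call in `custom_object_scope` (as in the answer below for real Keras) resolves the error: it temporarily adds the custom classes to the registry.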

Maxfashko avatar Mar 03 '19 03:03 Maxfashko

Hi @ZFTurbo, great work here. I can see you have contributed to fizyr's keras-retinanet repo too and have also tried to support it here. I am still facing the same issue as above. Is there any workaround for this?

tonmoyborah avatar Dec 13 '19 10:12 tonmoyborah

@tonmoyborah

The current code supports RetinaNet. Here is an example; I will add it to test_bench later:

def get_RetinaNet_model():
    from keras.models import load_model
    from keras.utils import custom_object_scope
    from keras_resnet.layers import BatchNormalization
    from keras_retinanet.layers import UpsampleLike, Anchors, RegressBoxes, ClipBoxes, FilterDetections
    from keras_retinanet.initializers import PriorProbability

    custom_objects = {
        'BatchNormalization': BatchNormalization,
        'UpsampleLike': UpsampleLike,
        'Anchors': Anchors,
        'RegressBoxes': RegressBoxes,
        'PriorProbability': PriorProbability,
        'ClipBoxes': ClipBoxes,
        'FilterDetections': FilterDetections,
    }

    with custom_object_scope(custom_objects):
        model = load_model("../retinanet_resnet50_500_classes_0.4594_converted.h5")
    return model, custom_objects

from keras.utils import custom_object_scope
from kito import reduce_keras_model

model, custom_objects = get_RetinaNet_model()
with custom_object_scope(custom_objects):
    model_reduced = reduce_keras_model(model)

ZFTurbo avatar Dec 14 '19 14:12 ZFTurbo

I'll try this. I made it work by changing the deserialize function in the Keras code, but the resulting model didn't provide any speedup. Models other than RetinaNet are showing huge improvements.

tonmoyborah avatar Dec 14 '19 15:12 tonmoyborah

I observed the same behaviour for RetinaNet. While many BN layers were removed, the inference speed stays the same.
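(For context: the reduction removes BatchNormalization layers by folding each one into the preceding convolution's weights. A toy single-scalar sketch of that algebra, not kito's actual code:)

```python
import math

def fold_bn(w, b, gamma, beta, mean, var, eps=1e-3):
    """Fold a BatchNormalization layer into the preceding layer's
    weight w and bias b (per-channel scalars here for clarity).

    y = gamma * (w*x + b - mean) / sqrt(var + eps) + beta
      = w' * x + b'
    """
    scale = gamma / math.sqrt(var + eps)
    return w * scale, (b - mean) * scale + beta

# The folded layer produces the same output as conv followed by BN:
w2, b2 = fold_bn(2.0, 1.0, 0.5, 0.1, 0.3, 4.0)
x = 3.0
bn_out = 0.5 * (2.0 * x + 1.0 - 0.3) / math.sqrt(4.0 + 1e-3) + 0.1
assert abs((w2 * x + b2) - bn_out) < 1e-9
```

Since the math is exact, removing BN layers changes only layer count, not outputs; whether it changes wall-clock time depends on how much of the total cost those layers were, which may explain why a detection model dominated by large convolutions and NMS shows little gain.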

P.S. Added RetinaNet to test_bench.
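(A hedged sketch of how one might time the original and reduced models to confirm this; the `time_inference` helper and the variable names are illustrative, not part of kito:)

```python
import time

def time_inference(predict_fn, batch, n_runs=20, warmup=3):
    """Average wall-clock seconds per forward pass.

    predict_fn: any callable taking a batch (e.g. model.predict).
    The first `warmup` calls are discarded to exclude one-time
    graph-building overhead.
    """
    for _ in range(warmup):
        predict_fn(batch)
    start = time.perf_counter()
    for _ in range(n_runs):
        predict_fn(batch)
    return (time.perf_counter() - start) / n_runs

# Usage against the two models from the snippet above (names assumed):
# t_orig = time_inference(model.predict, batch)
# t_reduced = time_inference(model_reduced.predict, batch)
# print('speedup: %.2fx' % (t_orig / t_reduced))
```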

ZFTurbo avatar Dec 15 '19 10:12 ZFTurbo