hls4ml
TypeError after converting YOLOv3 Model.
Hello,
While trying to convert YOLOv3 with hls4ml, I added UpSampling2D layers to the config file.
After running hls4ml.converters.keras_to_hls(cfg), I get the following error:
```
TypeError                                 Traceback (most recent call last)
~/hls4ml/hls4ml/converters/keras_to_hls.py in keras_to_hls(config)
    307                 input_names = None
    308
--> 309             layer, output_shape = layer_handlers[keras_class](keras_layer, input_names, input_shapes, reader, config)
    310
    311             print('Layer name: {}, layer type: {}, input shapes: {}, output shape: {}'.format(layer['name'], layer['class_name'], input_shapes, output_shape))

~/hls4ml/hls4ml/converters/keras/convolution.py in parse_conv2d_layer(keras_layer, input_names, input_shapes, data_reader, config)
     75         layer['stride_width'],
     76         layer['filt_height'],
---> 77         layer['filt_width']
     78     )
     79

~/hls4ml/hls4ml/converters/utils.py in compute_padding_2d(pad_type, in_height, in_width, stride_height, stride_width, filt_height, filt_width)
     41     if pad_type.lower() == 'same':
     42         #Height
---> 43         out_height = int(math.ceil(float(in_height) / float(stride_height)))
     44         if (in_height % stride_height == 0):
     45             pad_along_height = max(filt_height - stride_height, 0)

TypeError: float() argument must be a string or a number, not 'NoneType'
```
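For context, the failing expression receives in_height = None: the model summary further down shows every spatial dimension as None (the network was built with a fully dynamic input shape), and float(None) raises exactly this TypeError. A minimal stdlib-only sketch of the failing computation (the function name here is illustrative; it mirrors the line flagged in converters/utils.py):

```python
import math

def out_height_same(in_height, stride_height):
    # Mirrors the flagged line in hls4ml/converters/utils.py:
    # out_height = int(math.ceil(float(in_height) / float(stride_height)))
    return int(math.ceil(float(in_height) / float(stride_height)))

# With a concrete height this works fine:
print(out_height_same(416, 1))  # -> 416

# But a Keras model built with dynamic spatial dims hands the
# converter None for in_height, reproducing the error above:
try:
    out_height_same(None, 1)
except TypeError as err:
    print(err)  # e.g. "float() argument must be a string or a number, not 'NoneType'"
```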
Note that I am running it as:
```python
import hls4ml
import plotting

hls4ml.model.optimizer.OutputRoundingSaturationMode.layers = ['Activation']
hls4ml.model.optimizer.OutputRoundingSaturationMode.rounding_mode = 'AP_RND'
hls4ml.model.optimizer.OutputRoundingSaturationMode.saturation_mode = 'AP_SAT'

hls_config = hls4ml.utils.config_from_keras_model(model, granularity='name')

hls_config['Model']['Precision'] = 'ap_fixed<16,6>'
hls_config['Model']['ReuseFactor'] = 1

for Layer in hls_config['LayerName'].keys():
    hls_config['LayerName'][Layer]['Strategy'] = 'Resource'
    hls_config['LayerName'][Layer]['ReuseFactor'] = 1
hls_config['LayerName']['conv_105']['Strategy'] = 'Stable'
plotting.print_dict(hls_config)

cfg = hls4ml.converters.create_config(backend='Vivado')
cfg['IOType'] = 'io_stream'  # Must set this if using CNNs!
cfg['HLSConfig'] = hls_config
cfg['KerasModel'] = model
cfg['OutputDir'] = 'pruned_cnn/'
cfg['XilinxPart'] = 'xcu250-figd2104-2L-e'

hls_model = hls4ml.converters.keras_to_hls(cfg)
hls_model.compile()
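One thing to note before the summary: the model's input shape is fully dynamic (None height and width), and the converter needs concrete dimensions. A sketch of re-wrapping a dynamically-shaped functional Keras model with a fixed input size (this uses a toy two-layer stand-in, not the actual YOLOv3, and 416x416 is an assumed resolution):

```python
import tensorflow as tf

# Toy stand-in for the backbone: built with dynamic spatial dims,
# as in the summary below (height/width show up as None).
inp = tf.keras.Input(shape=(None, None, 3))
out = tf.keras.layers.Conv2D(32, 3, padding='same', name='conv_0')(inp)
model = tf.keras.Model(inp, out)
print(model.output_shape)  # (None, None, None, 32) -> converter sees None

# Re-wrap the same layers with a concrete input size so that every
# height/width the converter reads is a real number.
fixed_inp = tf.keras.Input(shape=(416, 416, 3))
fixed_model = tf.keras.Model(fixed_inp, model(fixed_inp))
print(fixed_model.output_shape)  # (None, 416, 416, 32)
```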
And my model.summary() looks like this:
Model: "functional_1"
Layer (type) Output Shape Param # Connected to
input_1 (InputLayer) [(None, None, None, 0
conv_0 (Conv2D) (None, None, None, 3 864 input_1[0][0]
bnorm_0 (BatchNormalization) (None, None, None, 3 128 conv_0[0][0]
leaky_0 (LeakyReLU) (None, None, None, 3 0 bnorm_0[0][0]
zero_padding2d (ZeroPadding2D) (None, None, None, 3 0 leaky_0[0][0]
conv_1 (Conv2D) (None, None, None, 6 18432 zero_padding2d[0][0]
bnorm_1 (BatchNormalization) (None, None, None, 6 256 conv_1[0][0]
leaky_1 (LeakyReLU) (None, None, None, 6 0 bnorm_1[0][0]
conv_2 (Conv2D) (None, None, None, 3 2048 leaky_1[0][0]
bnorm_2 (BatchNormalization) (None, None, None, 3 128 conv_2[0][0]
leaky_2 (LeakyReLU) (None, None, None, 3 0 bnorm_2[0][0]
conv_3 (Conv2D) (None, None, None, 6 18432 leaky_2[0][0]
bnorm_3 (BatchNormalization) (None, None, None, 6 256 conv_3[0][0]
leaky_3 (LeakyReLU) (None, None, None, 6 0 bnorm_3[0][0]
add (Add) (None, None, None, 6 0 leaky_1[0][0]
leaky_3[0][0]
zero_padding2d_1 (ZeroPadding2D (None, None, None, 6 0 add[0][0]
conv_5 (Conv2D) (None, None, None, 1 73728 zero_padding2d_1[0][0]
bnorm_5 (BatchNormalization) (None, None, None, 1 512 conv_5[0][0]
leaky_5 (LeakyReLU) (None, None, None, 1 0 bnorm_5[0][0]
conv_6 (Conv2D) (None, None, None, 6 8192 leaky_5[0][0]
bnorm_6 (BatchNormalization) (None, None, None, 6 256 conv_6[0][0]
leaky_6 (LeakyReLU) (None, None, None, 6 0 bnorm_6[0][0]
conv_7 (Conv2D) (None, None, None, 1 73728 leaky_6[0][0]
bnorm_7 (BatchNormalization) (None, None, None, 1 512 conv_7[0][0]
leaky_7 (LeakyReLU) (None, None, None, 1 0 bnorm_7[0][0]
add_1 (Add) (None, None, None, 1 0 leaky_5[0][0]
leaky_7[0][0]
conv_9 (Conv2D) (None, None, None, 6 8192 add_1[0][0]
bnorm_9 (BatchNormalization) (None, None, None, 6 256 conv_9[0][0]
leaky_9 (LeakyReLU) (None, None, None, 6 0 bnorm_9[0][0]
conv_10 (Conv2D) (None, None, None, 1 73728 leaky_9[0][0]
bnorm_10 (BatchNormalization) (None, None, None, 1 512 conv_10[0][0]
leaky_10 (LeakyReLU) (None, None, None, 1 0 bnorm_10[0][0]
add_2 (Add) (None, None, None, 1 0 add_1[0][0]
leaky_10[0][0]
zero_padding2d_2 (ZeroPadding2D (None, None, None, 1 0 add_2[0][0]
conv_12 (Conv2D) (None, None, None, 2 294912 zero_padding2d_2[0][0]
bnorm_12 (BatchNormalization) (None, None, None, 2 1024 conv_12[0][0]
leaky_12 (LeakyReLU) (None, None, None, 2 0 bnorm_12[0][0]
conv_13 (Conv2D) (None, None, None, 1 32768 leaky_12[0][0]
bnorm_13 (BatchNormalization) (None, None, None, 1 512 conv_13[0][0]
leaky_13 (LeakyReLU) (None, None, None, 1 0 bnorm_13[0][0]
conv_14 (Conv2D) (None, None, None, 2 294912 leaky_13[0][0]
bnorm_14 (BatchNormalization) (None, None, None, 2 1024 conv_14[0][0]
leaky_14 (LeakyReLU) (None, None, None, 2 0 bnorm_14[0][0]
add_3 (Add) (None, None, None, 2 0 leaky_12[0][0]
leaky_14[0][0]
conv_16 (Conv2D) (None, None, None, 1 32768 add_3[0][0]
bnorm_16 (BatchNormalization) (None, None, None, 1 512 conv_16[0][0]
leaky_16 (LeakyReLU) (None, None, None, 1 0 bnorm_16[0][0]
conv_17 (Conv2D) (None, None, None, 2 294912 leaky_16[0][0]
bnorm_17 (BatchNormalization) (None, None, None, 2 1024 conv_17[0][0]
leaky_17 (LeakyReLU) (None, None, None, 2 0 bnorm_17[0][0]
add_4 (Add) (None, None, None, 2 0 add_3[0][0]
leaky_17[0][0]
conv_19 (Conv2D) (None, None, None, 1 32768 add_4[0][0]
bnorm_19 (BatchNormalization) (None, None, None, 1 512 conv_19[0][0]
leaky_19 (LeakyReLU) (None, None, None, 1 0 bnorm_19[0][0]
conv_20 (Conv2D) (None, None, None, 2 294912 leaky_19[0][0]
bnorm_20 (BatchNormalization) (None, None, None, 2 1024 conv_20[0][0]
leaky_20 (LeakyReLU) (None, None, None, 2 0 bnorm_20[0][0]
add_5 (Add) (None, None, None, 2 0 add_4[0][0]
leaky_20[0][0]
conv_22 (Conv2D) (None, None, None, 1 32768 add_5[0][0]
bnorm_22 (BatchNormalization) (None, None, None, 1 512 conv_22[0][0]
leaky_22 (LeakyReLU) (None, None, None, 1 0 bnorm_22[0][0]
conv_23 (Conv2D) (None, None, None, 2 294912 leaky_22[0][0]
bnorm_23 (BatchNormalization) (None, None, None, 2 1024 conv_23[0][0]
leaky_23 (LeakyReLU) (None, None, None, 2 0 bnorm_23[0][0]
add_6 (Add) (None, None, None, 2 0 add_5[0][0]
leaky_23[0][0]
conv_25 (Conv2D) (None, None, None, 1 32768 add_6[0][0]
bnorm_25 (BatchNormalization) (None, None, None, 1 512 conv_25[0][0]
leaky_25 (LeakyReLU) (None, None, None, 1 0 bnorm_25[0][0]
conv_26 (Conv2D) (None, None, None, 2 294912 leaky_25[0][0]
bnorm_26 (BatchNormalization) (None, None, None, 2 1024 conv_26[0][0]
leaky_26 (LeakyReLU) (None, None, None, 2 0 bnorm_26[0][0]
add_7 (Add) (None, None, None, 2 0 add_6[0][0]
leaky_26[0][0]
conv_28 (Conv2D) (None, None, None, 1 32768 add_7[0][0]
bnorm_28 (BatchNormalization) (None, None, None, 1 512 conv_28[0][0]
leaky_28 (LeakyReLU) (None, None, None, 1 0 bnorm_28[0][0]
conv_29 (Conv2D) (None, None, None, 2 294912 leaky_28[0][0]
bnorm_29 (BatchNormalization) (None, None, None, 2 1024 conv_29[0][0]
leaky_29 (LeakyReLU) (None, None, None, 2 0 bnorm_29[0][0]
add_8 (Add) (None, None, None, 2 0 add_7[0][0]
leaky_29[0][0]
conv_31 (Conv2D) (None, None, None, 1 32768 add_8[0][0]
bnorm_31 (BatchNormalization) (None, None, None, 1 512 conv_31[0][0]
leaky_31 (LeakyReLU) (None, None, None, 1 0 bnorm_31[0][0]
conv_32 (Conv2D) (None, None, None, 2 294912 leaky_31[0][0]
bnorm_32 (BatchNormalization) (None, None, None, 2 1024 conv_32[0][0]
leaky_32 (LeakyReLU) (None, None, None, 2 0 bnorm_32[0][0]
add_9 (Add) (None, None, None, 2 0 add_8[0][0]
leaky_32[0][0]
conv_34 (Conv2D) (None, None, None, 1 32768 add_9[0][0]
bnorm_34 (BatchNormalization) (None, None, None, 1 512 conv_34[0][0]
leaky_34 (LeakyReLU) (None, None, None, 1 0 bnorm_34[0][0]
conv_35 (Conv2D) (None, None, None, 2 294912 leaky_34[0][0]
bnorm_35 (BatchNormalization) (None, None, None, 2 1024 conv_35[0][0]
leaky_35 (LeakyReLU) (None, None, None, 2 0 bnorm_35[0][0]
add_10 (Add) (None, None, None, 2 0 add_9[0][0]
leaky_35[0][0]
zero_padding2d_3 (ZeroPadding2D (None, None, None, 2 0 add_10[0][0]
conv_37 (Conv2D) (None, None, None, 5 1179648 zero_padding2d_3[0][0]
bnorm_37 (BatchNormalization) (None, None, None, 5 2048 conv_37[0][0]
leaky_37 (LeakyReLU) (None, None, None, 5 0 bnorm_37[0][0]
conv_38 (Conv2D) (None, None, None, 2 131072 leaky_37[0][0]
bnorm_38 (BatchNormalization) (None, None, None, 2 1024 conv_38[0][0]
leaky_38 (LeakyReLU) (None, None, None, 2 0 bnorm_38[0][0]
conv_39 (Conv2D) (None, None, None, 5 1179648 leaky_38[0][0]
bnorm_39 (BatchNormalization) (None, None, None, 5 2048 conv_39[0][0]
leaky_39 (LeakyReLU) (None, None, None, 5 0 bnorm_39[0][0]
add_11 (Add) (None, None, None, 5 0 leaky_37[0][0]
leaky_39[0][0]
conv_41 (Conv2D) (None, None, None, 2 131072 add_11[0][0]
bnorm_41 (BatchNormalization) (None, None, None, 2 1024 conv_41[0][0]
leaky_41 (LeakyReLU) (None, None, None, 2 0 bnorm_41[0][0]
conv_42 (Conv2D) (None, None, None, 5 1179648 leaky_41[0][0]
bnorm_42 (BatchNormalization) (None, None, None, 5 2048 conv_42[0][0]
leaky_42 (LeakyReLU) (None, None, None, 5 0 bnorm_42[0][0]
add_12 (Add) (None, None, None, 5 0 add_11[0][0]
leaky_42[0][0]
conv_44 (Conv2D) (None, None, None, 2 131072 add_12[0][0]
bnorm_44 (BatchNormalization) (None, None, None, 2 1024 conv_44[0][0]
leaky_44 (LeakyReLU) (None, None, None, 2 0 bnorm_44[0][0]
conv_45 (Conv2D) (None, None, None, 5 1179648 leaky_44[0][0]
bnorm_45 (BatchNormalization) (None, None, None, 5 2048 conv_45[0][0]
leaky_45 (LeakyReLU) (None, None, None, 5 0 bnorm_45[0][0]
add_13 (Add) (None, None, None, 5 0 add_12[0][0]
leaky_45[0][0]
conv_47 (Conv2D) (None, None, None, 2 131072 add_13[0][0]
bnorm_47 (BatchNormalization) (None, None, None, 2 1024 conv_47[0][0]
leaky_47 (LeakyReLU) (None, None, None, 2 0 bnorm_47[0][0]
conv_48 (Conv2D) (None, None, None, 5 1179648 leaky_47[0][0]
bnorm_48 (BatchNormalization) (None, None, None, 5 2048 conv_48[0][0]
leaky_48 (LeakyReLU) (None, None, None, 5 0 bnorm_48[0][0]
add_14 (Add) (None, None, None, 5 0 add_13[0][0]
leaky_48[0][0]
conv_50 (Conv2D) (None, None, None, 2 131072 add_14[0][0]
bnorm_50 (BatchNormalization) (None, None, None, 2 1024 conv_50[0][0]
leaky_50 (LeakyReLU) (None, None, None, 2 0 bnorm_50[0][0]
conv_51 (Conv2D) (None, None, None, 5 1179648 leaky_50[0][0]
bnorm_51 (BatchNormalization) (None, None, None, 5 2048 conv_51[0][0]
leaky_51 (LeakyReLU) (None, None, None, 5 0 bnorm_51[0][0]
add_15 (Add) (None, None, None, 5 0 add_14[0][0]
leaky_51[0][0]
conv_53 (Conv2D) (None, None, None, 2 131072 add_15[0][0]
bnorm_53 (BatchNormalization) (None, None, None, 2 1024 conv_53[0][0]
leaky_53 (LeakyReLU) (None, None, None, 2 0 bnorm_53[0][0]
conv_54 (Conv2D) (None, None, None, 5 1179648 leaky_53[0][0]
bnorm_54 (BatchNormalization) (None, None, None, 5 2048 conv_54[0][0]
leaky_54 (LeakyReLU) (None, None, None, 5 0 bnorm_54[0][0]
add_16 (Add) (None, None, None, 5 0 add_15[0][0]
leaky_54[0][0]
conv_56 (Conv2D) (None, None, None, 2 131072 add_16[0][0]
bnorm_56 (BatchNormalization) (None, None, None, 2 1024 conv_56[0][0]
leaky_56 (LeakyReLU) (None, None, None, 2 0 bnorm_56[0][0]
conv_57 (Conv2D) (None, None, None, 5 1179648 leaky_56[0][0]
bnorm_57 (BatchNormalization) (None, None, None, 5 2048 conv_57[0][0]
leaky_57 (LeakyReLU) (None, None, None, 5 0 bnorm_57[0][0]
add_17 (Add) (None, None, None, 5 0 add_16[0][0]
leaky_57[0][0]
conv_59 (Conv2D) (None, None, None, 2 131072 add_17[0][0]
bnorm_59 (BatchNormalization) (None, None, None, 2 1024 conv_59[0][0]
leaky_59 (LeakyReLU) (None, None, None, 2 0 bnorm_59[0][0]
conv_60 (Conv2D) (None, None, None, 5 1179648 leaky_59[0][0]
bnorm_60 (BatchNormalization) (None, None, None, 5 2048 conv_60[0][0]
leaky_60 (LeakyReLU) (None, None, None, 5 0 bnorm_60[0][0]
add_18 (Add) (None, None, None, 5 0 add_17[0][0]
leaky_60[0][0]
zero_padding2d_4 (ZeroPadding2D (None, None, None, 5 0 add_18[0][0]
conv_62 (Conv2D) (None, None, None, 1 4718592 zero_padding2d_4[0][0]
bnorm_62 (BatchNormalization) (None, None, None, 1 4096 conv_62[0][0]
leaky_62 (LeakyReLU) (None, None, None, 1 0 bnorm_62[0][0]
conv_63 (Conv2D) (None, None, None, 5 524288 leaky_62[0][0]
bnorm_63 (BatchNormalization) (None, None, None, 5 2048 conv_63[0][0]
leaky_63 (LeakyReLU) (None, None, None, 5 0 bnorm_63[0][0]
conv_64 (Conv2D) (None, None, None, 1 4718592 leaky_63[0][0]
bnorm_64 (BatchNormalization) (None, None, None, 1 4096 conv_64[0][0]
leaky_64 (LeakyReLU) (None, None, None, 1 0 bnorm_64[0][0]
add_19 (Add) (None, None, None, 1 0 leaky_62[0][0]
leaky_64[0][0]
conv_66 (Conv2D) (None, None, None, 5 524288 add_19[0][0]
bnorm_66 (BatchNormalization) (None, None, None, 5 2048 conv_66[0][0]
leaky_66 (LeakyReLU) (None, None, None, 5 0 bnorm_66[0][0]
conv_67 (Conv2D) (None, None, None, 1 4718592 leaky_66[0][0]
bnorm_67 (BatchNormalization) (None, None, None, 1 4096 conv_67[0][0]
leaky_67 (LeakyReLU) (None, None, None, 1 0 bnorm_67[0][0]
add_20 (Add) (None, None, None, 1 0 add_19[0][0]
leaky_67[0][0]
conv_69 (Conv2D) (None, None, None, 5 524288 add_20[0][0]
bnorm_69 (BatchNormalization) (None, None, None, 5 2048 conv_69[0][0]
leaky_69 (LeakyReLU) (None, None, None, 5 0 bnorm_69[0][0]
conv_70 (Conv2D) (None, None, None, 1 4718592 leaky_69[0][0]
bnorm_70 (BatchNormalization) (None, None, None, 1 4096 conv_70[0][0]
leaky_70 (LeakyReLU) (None, None, None, 1 0 bnorm_70[0][0]
add_21 (Add) (None, None, None, 1 0 add_20[0][0]
leaky_70[0][0]
conv_72 (Conv2D) (None, None, None, 5 524288 add_21[0][0]
bnorm_72 (BatchNormalization) (None, None, None, 5 2048 conv_72[0][0]
leaky_72 (LeakyReLU) (None, None, None, 5 0 bnorm_72[0][0]
conv_73 (Conv2D) (None, None, None, 1 4718592 leaky_72[0][0]
bnorm_73 (BatchNormalization) (None, None, None, 1 4096 conv_73[0][0]
leaky_73 (LeakyReLU) (None, None, None, 1 0 bnorm_73[0][0]
add_22 (Add) (None, None, None, 1 0 add_21[0][0]
leaky_73[0][0]
conv_75 (Conv2D) (None, None, None, 5 524288 add_22[0][0]
bnorm_75 (BatchNormalization) (None, None, None, 5 2048 conv_75[0][0]
leaky_75 (LeakyReLU) (None, None, None, 5 0 bnorm_75[0][0]
conv_76 (Conv2D) (None, None, None, 1 4718592 leaky_75[0][0]
bnorm_76 (BatchNormalization) (None, None, None, 1 4096 conv_76[0][0]
leaky_76 (LeakyReLU) (None, None, None, 1 0 bnorm_76[0][0]
conv_77 (Conv2D) (None, None, None, 5 524288 leaky_76[0][0]
bnorm_77 (BatchNormalization) (None, None, None, 5 2048 conv_77[0][0]
leaky_77 (LeakyReLU) (None, None, None, 5 0 bnorm_77[0][0]
conv_78 (Conv2D) (None, None, None, 1 4718592 leaky_77[0][0]
bnorm_78 (BatchNormalization) (None, None, None, 1 4096 conv_78[0][0]
leaky_78 (LeakyReLU) (None, None, None, 1 0 bnorm_78[0][0]
conv_79 (Conv2D) (None, None, None, 5 524288 leaky_78[0][0]
bnorm_79 (BatchNormalization) (None, None, None, 5 2048 conv_79[0][0]
leaky_79 (LeakyReLU) (None, None, None, 5 0 bnorm_79[0][0]
conv_84 (Conv2D) (None, None, None, 2 131072 leaky_79[0][0]
bnorm_84 (BatchNormalization) (None, None, None, 2 1024 conv_84[0][0]
leaky_84 (LeakyReLU) (None, None, None, 2 0 bnorm_84[0][0]
up_sampling2d (UpSampling2D) (None, None, None, 2 0 leaky_84[0][0]
concatenate (Concatenate) (None, None, None, 7 0 up_sampling2d[0][0]
add_18[0][0]
conv_87 (Conv2D) (None, None, None, 2 196608 concatenate[0][0]
bnorm_87 (BatchNormalization) (None, None, None, 2 1024 conv_87[0][0]
leaky_87 (LeakyReLU) (None, None, None, 2 0 bnorm_87[0][0]
conv_88 (Conv2D) (None, None, None, 5 1179648 leaky_87[0][0]
bnorm_88 (BatchNormalization) (None, None, None, 5 2048 conv_88[0][0]
leaky_88 (LeakyReLU) (None, None, None, 5 0 bnorm_88[0][0]
conv_89 (Conv2D) (None, None, None, 2 131072 leaky_88[0][0]
bnorm_89 (BatchNormalization) (None, None, None, 2 1024 conv_89[0][0]
leaky_89 (LeakyReLU) (None, None, None, 2 0 bnorm_89[0][0]
conv_90 (Conv2D) (None, None, None, 5 1179648 leaky_89[0][0]
bnorm_90 (BatchNormalization) (None, None, None, 5 2048 conv_90[0][0]
leaky_90 (LeakyReLU) (None, None, None, 5 0 bnorm_90[0][0]
conv_91 (Conv2D) (None, None, None, 2 131072 leaky_90[0][0]
bnorm_91 (BatchNormalization) (None, None, None, 2 1024 conv_91[0][0]
leaky_91 (LeakyReLU) (None, None, None, 2 0 bnorm_91[0][0]
conv_96 (Conv2D) (None, None, None, 1 32768 leaky_91[0][0]
bnorm_96 (BatchNormalization) (None, None, None, 1 512 conv_96[0][0]
leaky_96 (LeakyReLU) (None, None, None, 1 0 bnorm_96[0][0]
up_sampling2d_1 (UpSampling2D) (None, None, None, 1 0 leaky_96[0][0]
concatenate_1 (Concatenate) (None, None, None, 3 0 up_sampling2d_1[0][0]
add_10[0][0]
conv_99 (Conv2D) (None, None, None, 1 49152 concatenate_1[0][0]
bnorm_99 (BatchNormalization) (None, None, None, 1 512 conv_99[0][0]
leaky_99 (LeakyReLU) (None, None, None, 1 0 bnorm_99[0][0]
conv_100 (Conv2D) (None, None, None, 2 294912 leaky_99[0][0]
bnorm_100 (BatchNormalization) (None, None, None, 2 1024 conv_100[0][0]
leaky_100 (LeakyReLU) (None, None, None, 2 0 bnorm_100[0][0]
conv_101 (Conv2D) (None, None, None, 1 32768 leaky_100[0][0]
bnorm_101 (BatchNormalization) (None, None, None, 1 512 conv_101[0][0]
leaky_101 (LeakyReLU) (None, None, None, 1 0 bnorm_101[0][0]
conv_102 (Conv2D) (None, None, None, 2 294912 leaky_101[0][0]
bnorm_102 (BatchNormalization) (None, None, None, 2 1024 conv_102[0][0]
leaky_102 (LeakyReLU) (None, None, None, 2 0 bnorm_102[0][0]
conv_103 (Conv2D) (None, None, None, 1 32768 leaky_102[0][0]
bnorm_103 (BatchNormalization) (None, None, None, 1 512 conv_103[0][0]
leaky_103 (LeakyReLU) (None, None, None, 1 0 bnorm_103[0][0]
conv_80 (Conv2D) (None, None, None, 1 4718592 leaky_79[0][0]
conv_92 (Conv2D) (None, None, None, 5 1179648 leaky_91[0][0]
conv_104 (Conv2D) (None, None, None, 2 294912 leaky_103[0][0]
bnorm_80 (BatchNormalization) (None, None, None, 1 4096 conv_80[0][0]
bnorm_92 (BatchNormalization) (None, None, None, 5 2048 conv_92[0][0]
bnorm_104 (BatchNormalization) (None, None, None, 2 1024 conv_104[0][0]
leaky_80 (LeakyReLU) (None, None, None, 1 0 bnorm_80[0][0]
leaky_92 (LeakyReLU) (None, None, None, 5 0 bnorm_92[0][0]
leaky_104 (LeakyReLU) (None, None, None, 2 0 bnorm_104[0][0]
conv_81 (Conv2D) (None, None, None, 2 261375 leaky_80[0][0]
conv_93 (Conv2D) (None, None, None, 2 130815 leaky_92[0][0]
conv_105 (Conv2D) (None, None, None, 2 65535 leaky_104[0][0]
Total params: 62,001,757
Trainable params: 61,949,149
Non-trainable params: 52,608
Thank you very much for your help!
That looks like an issue in the handling of the UpSampling2D layer; we'll investigate. Unrelated to that, a model with ~60 million parameters won't work in hls4ml. You'll have to reduce your model size and quantize it.
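A quick back-of-envelope check supports that last point. At the ap_fixed<16,6> precision set in the config above, the weights alone take far more storage than any FPGA has on chip (parameter count taken from the summary; the on-chip capacity figure is an order-of-magnitude assumption for a large Xilinx part like the xcu250):

```python
params = 62_001_757      # total params from model.summary()
bits_per_weight = 16     # ap_fixed<16,6> as set in hls_config
weight_bytes = params * bits_per_weight // 8

print(f"{weight_bytes / 1e6:.0f} MB of weights")  # ~124 MB

# Even a large device has only on the order of tens of MB of
# on-chip BRAM/URAM, so the weights alone exceed it several
# times over -- hence the advice to prune and quantize.
```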