hls4ml
HLS model incompatible shapes with QKeras model
Hi there!
I am writing because I am running into a problem when converting my QKeras model to HLS.
Here is my model:
The QKeras model works fine, and I have compared it against the reference model. However, when I convert it to HLS, the converted model reports a different output shape, as follows:
Interpreting Model
Topology:
Layer name: in_layer, layer type: InputLayer, current shape: [[None, 240, 64]]
Layer name: reshape, layer type: Reshape, current shape: [[None, 240, 64]]
Layer name: conv2d_1, layer type: QConv2D, current shape: [[None, 240, 64, 1]]
Name: quantized_relu
Layer name: act_1, layer type: Activation, current shape: [[None, 240, 64, 16]]
Layer name: max_pooling2d, layer type: MaxPooling2D, current shape: [[None, 240, 64, 16]]
Layer name: conv2d_2, layer type: QConv2D, current shape: [[None, 240, 16, 16]]
Name: quantized_relu
Layer name: act_2, layer type: Activation, current shape: [[None, 240, 16, 16]]
Layer name: max_pooling2d_1, layer type: MaxPooling2D, current shape: [[None, 240, 16, 16]]
Layer name: conv2d_3, layer type: QConv2D, current shape: [[None, 240, 4, 16]]
Name: quantized_relu
Layer name: act_3, layer type: Activation, current shape: [[None, 240, 4, 16]]
Layer name: max_pooling2d_2, layer type: MaxPooling2D, current shape: [[None, 240, 4, 16]]
Layer name: reshape_1, layer type: Reshape, current shape: [[None, 240, 1, 16]]
Layer name: dense, layer type: QDense, current shape: [[None, 240, 16]]
Layer name: act_4, layer type: Activation, current shape: [[None, 1]]
As you can see, the output shape of the converted model is [[None, 1]], whereas the QKeras model's is [None, 240, 1]. If I compare the output of each layer against my test data set using the hls4ml profiling function, I get this error:
tensorflow.python.framework.errors_impl.InvalidArgumentError: Incompatible shapes: [30,1] vs. [30,240,1] [Op:Sub]
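The error is a plain broadcasting failure: the HLS model emits one value per sample while the QKeras model emits one value per time step. A minimal NumPy sketch (shapes taken directly from the error message, dummy zero data) reproduces the incompatibility:

```python
import numpy as np

hls_out = np.zeros((30, 1))         # HLS model output: one value per sample
keras_out = np.zeros((30, 240, 1))  # QKeras output: one value per time step

# Broadcasting aligns trailing axes: the last axes (1 vs 1) match, but the
# next pair (240 vs 30) does not, so the subtraction fails just like the
# TF Op:Sub in the traceback above.
try:
    _ = keras_out - hls_out
except ValueError as e:
    print("incompatible:", e)
```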
As a result, when I run the C simulation using the testbench in the project folder, I get different results.
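For context, a (Q)Dense layer in Keras applied to a 3D input acts on the last axis only, producing one output per time step. A small NumPy sketch of that semantics (the (16, 1) kernel shape is an assumption matching the layer summary above):

```python
import numpy as np

x = np.random.rand(30, 240, 16)  # batch of 30, 240 time steps, 16 features
W = np.random.rand(16, 1)        # Dense kernel mapping 16 features -> 1
b = np.zeros(1)

# Keras Dense on a 3D input is a matmul over the last axis; the time
# dimension is carried through unchanged.
y = x @ W + b
print(y.shape)  # (30, 240, 1) -- one output per time step, as in QKeras
```

This is the shape the HLS model would need to preserve; instead it collapses the output to [[None, 1]].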
My config is the following:
Backend: Vivado
ClockPeriod: 5
HLSConfig:
  LayerName:
    act_1:
      Precision:
        result: ap_fixed<9,1>
      ReuseFactor: 1
      Trace: true
    act_2:
      Precision:
        result: ap_fixed<9,1>
      ReuseFactor: 1
      Trace: true
    act_3:
      Precision:
        result: ap_fixed<9,1>
      ReuseFactor: 1
      Trace: true
    act_4:
      Precision: ap_fixed<16,6>
      ReuseFactor: 8
      Trace: true
      table_size: 1024
      table_t: ap_fixed<18,8>
    conv2d_1:
      Precision:
        bias: ap_fixed<9,1>
        weight: ap_fixed<9,1>
      ReuseFactor: 144
      Trace: true
    conv2d_2:
      Precision:
        bias: ap_fixed<9,1>
        weight: ap_fixed<9,1>
      ReuseFactor: 2304
      Trace: true
    conv2d_3:
      Precision:
        bias: ap_fixed<9,1>
        weight: ap_fixed<9,1>
      ReuseFactor: 2304
      Trace: true
    dense:
      Precision:
        bias: ap_fixed<9,1>
        weight: ap_fixed<9,1>
      ReuseFactor: 16
      Trace: true
    in_layer:
      Precision:
        result: ap_fixed<16,6>
      Trace: true
    max_pooling2d:
      Precision: ap_fixed<16,6>
      Trace: true
    max_pooling2d_1:
      Precision: ap_fixed<16,6>
      Trace: true
    max_pooling2d_2:
      Precision: ap_fixed<16,6>
      Trace: true
    reshape:
      Precision: ap_fixed<16,6>
      Trace: true
    reshape_1:
      Precision: ap_fixed<16,6>
      Trace: true
  Model:
    Precision: ap_fixed<16,6>
    ReuseFactor: 1
    Strategy: Resource
IOType: io_stream
KerasModel: !keras_model 'hls4ml_prj/keras_model.h5'
OutputDir: hls4ml_prj
ProjectName: myproject
Stamp: FbA5d5E4
XilinxPart: xcu250-figd2104-2L-e
I also managed to synthesize the HLS model, but I want to validate it to be sure that it works.
Furthermore, if I try to change the output shape manually in the C code, I get a segmentation fault (core dumped).
Has anyone faced a similar issue? I suspect that hls4ml is interpreting the model incorrectly, but I am not sure.
Thank you in advance; I hope someone can help me.