Bidirectional RNN layer support for Keras frontend and Vitis backend
Description
This PR adds support for Bidirectional RNN layers from Keras with the Vitis backend in io_parallel mode. The forward and backward layers can each be either an LSTM or a GRU, and their architectures can be configured independently of one another; a minimal conversion sketch is shown below.
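The following is a minimal, hypothetical sketch of how a model with such a layer could be converted; the layer sizes, input shape, and `output_dir` are illustrative only and are not taken from this PR.

```python
import numpy as np
import keras
import hls4ml

# Toy model: a forward LSTM and a backward GRU wrapped in one Bidirectional layer
inputs = keras.Input(shape=(8, 4))
x = keras.layers.Bidirectional(
    keras.layers.LSTM(16),
    backward_layer=keras.layers.GRU(16, go_backwards=True),
)(inputs)
outputs = keras.layers.Dense(2)(x)
model = keras.Model(inputs, outputs)

# Convert with the Vitis backend in io_parallel mode
config = hls4ml.utils.config_from_keras_model(model, granularity='name')
hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    backend='Vitis',
    io_type='io_parallel',
    output_dir='bidirectional_prj',  # hypothetical output directory
)
hls_model.compile()

# Compare the hls4ml emulation against Keras on random data
x_test = np.random.rand(10, 8, 4).astype('float32')
y_keras = model.predict(x_test)
y_hls = hls_model.predict(x_test)
```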
Type of change
- [x] New feature
Tests
The unit test in test/pytest/test_rnn.py was updated to also check parsing and accuracy for a Bidirectional layer.
Test Configuration:
The new tests are run only with the Vivado and Vitis backends in io_parallel mode; a sketch of such a parametrized test follows.
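This is a hedged sketch of what a backend-parametrized pytest for the new layer might look like; it is not the actual contents of test/pytest/test_rnn.py, and the model, tolerances, and output directory are assumptions for illustration.

```python
import numpy as np
import pytest
import keras
import hls4ml


@pytest.mark.parametrize('backend', ['Vivado', 'Vitis'])
def test_bidirectional(backend):
    # Small Bidirectional model, shapes chosen only for illustration
    model = keras.Sequential([
        keras.Input(shape=(8, 4)),
        keras.layers.Bidirectional(keras.layers.LSTM(8)),
        keras.layers.Dense(1),
    ])

    config = hls4ml.utils.config_from_keras_model(model, granularity='name')
    hls_model = hls4ml.converters.convert_from_keras_model(
        model,
        hls_config=config,
        backend=backend,
        io_type='io_parallel',
        output_dir=f'bidirectional_{backend}_prj',
    )
    hls_model.compile()

    # Loose tolerance: fixed-point arithmetic will not match float32 exactly
    x = np.random.rand(20, 8, 4).astype('float32')
    np.testing.assert_allclose(hls_model.predict(x), model.predict(x), rtol=0, atol=0.1)
```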
Checklist
- [x] I have read the guidelines for contributing.
- [x] I have commented my code, particularly in hard-to-understand areas.
- [ ] I have made corresponding changes to the documentation.
- [x] My changes generate no new warnings.
- [x] I have installed and run pre-commit on the files I edited or added.
- [x] I have added tests that prove my fix is effective or that my feature works.
Generally this looks good to me; comments are minor. I'll wait until some things are merged that should fix some test failures, and then run the CI.
Hi, thank you for implementing this. Have you tried this with Keras v3? The mentioned unit test only uses Keras v2. It seems to fall back to the Keras v2 handler, but I get the following error:
v2 handler used for layer bidirectional
Traceback (most recent call last):
File "/work/NGT/ngt2.2-toy-simulation/./convert/test_convert.py", line 180, in <module>
hls_model = converttools.conv_to_hls(models[mod_id], model,REWRITE_CONF=args.rewriteconf, verbose=True)
File "/work/NGT/ngt2.2-toy-simulation/convert/../convert/converttools.py", line 211, in conv_to_hls
hls_model = hls4ml.converters.convert_from_keras_model(
File "/work/NGT/hls4ml_enlupi/hls4ml/hls4ml/utils/dependency.py", line 46, in inner
return f(*args, **kwargs)
File "/work/NGT/hls4ml_enlupi/hls4ml/hls4ml/converters/__init__.py", line 223, in convert_from_keras_model
return keras_v3_to_hls(config)
File "/work/NGT/hls4ml_enlupi/hls4ml/hls4ml/converters/keras_v3_to_hls.py", line 294, in keras_v3_to_hls
return ModelGraph.from_layer_list(config, layer_list, input_layers, output_layers)
File "/work/NGT/hls4ml_enlupi/hls4ml/hls4ml/model/graph.py", line 443, in from_layer_list
model._make_graph(layer_list)
File "/work/NGT/hls4ml_enlupi/hls4ml/hls4ml/model/graph.py", line 477, in _make_graph
self.graph[name] = self.make_node(kind, name, layer, inputs, outputs)
File "/work/NGT/hls4ml_enlupi/hls4ml/hls4ml/model/graph.py", line 566, in make_node
node = layer_cls(self, name, attributes, inputs, outputs, initialize)
File "/work/NGT/hls4ml_enlupi/hls4ml/hls4ml/model/layers.py", line 122, in __init__
self.initialize()
File "/work/NGT/hls4ml_enlupi/hls4ml/hls4ml/model/layers.py", line 1530, in initialize
self.add_weights_variable(name=f'{dir}_weight', var_name=(f'w_{dir[0]}_' + '{index}'))
File "/work/NGT/hls4ml_enlupi/hls4ml/hls4ml/model/layers.py", line 337, in add_weights_variable
var = WeightVariable(
File "/work/NGT/hls4ml_enlupi/hls4ml/hls4ml/model/types.py", line 562, in __init__
self.shape = list(self.data.shape)
AttributeError: 'NoneType' object has no attribute 'shape'
I have now added support for Keras v3, creating a custom parser for the Bidirectional layer and fixing some unintended behavior when calling the v2 handlers for the LSTM and GRU layers. The unit test now works for me with both Keras v2 and v3. Please let me know if you still experience any issues.
Test failures unrelated, this is ready for merge.