
Leaky Quantized ReLU fix

Open DrWatt opened this pull request 1 year ago • 1 comment

Description

The quantized_relu activation layer in the QKeras library lets the user set a negative_slope option, which effectively changes the activation from a standard ReLU to a LeakyReLU. hls4ml did not recognize this option, so the information was lost when the HLS model was created by the library. To fix this, I added QLeakyReLU to the list of supported layers in model/layers.py using the ParametrizedActivation implementation, mimicking the existing LeakyReLU and thus following an implementation path similar to the "non-leaky" quantized_relu. Corresponding changes were made to model/profiling.py and utils/config.py to support the new layer.
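To illustrate what the converter was losing, here is a minimal, purely illustrative sketch (not the QKeras or hls4ml implementation) of a leaky quantized ReLU: the negative branch is scaled by negative_slope before snapping to a fixed-point grid. The bits/integer parameterization below is an assumption for the sketch.

```python
def leaky_quantized_relu(x, bits=8, integer=0, negative_slope=0.125):
    """Illustrative sketch of a quantized ReLU with a non-zero
    negative_slope (hypothetical, not the QKeras implementation)."""
    # LeakyReLU: pass positives through, scale negatives
    y = x if x >= 0 else negative_slope * x
    # Snap to a fixed-point grid with the given word/integer widths
    step = 2.0 ** (integer - bits)
    y = round(y / step) * step
    # Saturate to the representable range
    lo, hi = -(2.0 ** integer), 2.0 ** integer - step
    return min(max(y, lo), hi)
```

With negative_slope=0 this reduces to an ordinary quantized ReLU, which is exactly why dropping the option silently changes the model's behavior only on negative inputs.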

A couple of other fixes were made: in backends/vivado_accelerator/vivado_accelerator_config.py, a missing cast when comparing complex objects with strings; and in model/types.py, adding the ap_ prefix to the HLS fixed-point precision type, which the Vivado compiler requires.
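The types.py fix amounts to emitting the Vivado type name with its ap_ prefix (ap_fixed/ap_ufixed) rather than a bare fixed. A hypothetical helper sketching the corrected rendering (the function name and signature are illustrative, not hls4ml's API):

```python
def hls_type_name(width, integer, signed=True):
    """Hypothetical sketch: render a Vivado HLS fixed-point type name.
    Vivado's compiler rejects a bare 'fixed<W,I>', so the 'ap_'
    prefix must be part of the emitted type string."""
    base = 'fixed' if signed else 'ufixed'
    return f'ap_{base}<{width},{integer}>'
```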

Type of change

  • [x] Bug fix (non-breaking change that fixes an issue)

Tests

A quantized_relu with a non-default negative_slope was added to the pytest routine covering QKeras layers, and a dedicated test script was also added to the pytest directory. The outputs of the new implementation are asserted to match the QKeras layer within a relative tolerance of 1e-5.
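The tolerance check described above can be sketched as follows; the function name and the sample outputs are hypothetical, but the comparison with a relative tolerance of 1e-5 mirrors the assertion in the test.

```python
import math

def outputs_match(hls_out, qkeras_out, rtol=1e-5):
    """Sketch of the test's assertion (names hypothetical): every
    pair of outputs must agree within the relative tolerance."""
    return all(math.isclose(a, b, rel_tol=rtol)
               for a, b in zip(hls_out, qkeras_out))
```

A deviation well inside the tolerance passes, while a mismatch like a dropped negative_slope on a negative input would fail the assertion.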

Checklist

  • [x] I have read the guidelines for contributing.
  • [x] I have commented my code, particularly in hard-to-understand areas.
  • [x] I have made corresponding changes to the documentation.
  • [x] My changes generate no new warnings.
  • [x] I have installed and run pre-commit on the files I edited or added.
  • [x] I have added tests that prove my fix is effective or that my feature works.

DrWatt avatar Jan 29 '24 12:01 DrWatt

Sorry for the delay, but what is the status of this? It looks to me like maybe I can approve and merge it.

jmitrevs avatar Mar 08 '24 23:03 jmitrevs

I was testing this and the changes work; however, the PR includes unnecessary changes, and the test doesn't do anything (the data is all positive, so it passes even without the changes in the converter). I distilled the changes and made a proper test in https://github.com/vloncar/hls4ml/tree/qrelu_negative_slope
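The point about the vacuous test can be seen with a few lines of pure Python: on non-negative inputs, ReLU and LeakyReLU are identical, so an all-positive dataset cannot detect a silently dropped negative_slope.

```python
def relu(x):
    return max(x, 0.0)

def leaky_relu(x, negative_slope=0.25):
    return x if x >= 0 else negative_slope * x

# On non-negative data the two activations agree exactly, so a test
# fed only positive inputs passes even if negative_slope is ignored.
positive = [0.1, 0.5, 2.0]
vacuous = all(relu(x) == leaky_relu(x) for x in positive)

# A single negative input exposes the difference.
detects_bug = relu(-1.0) != leaky_relu(-1.0)
```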

vloncar avatar Mar 28 '24 08:03 vloncar

I made #987 as a continuation of this PR

vloncar avatar Mar 28 '24 20:03 vloncar