hls4ml
LeakyReLU converted from PyTorch to HLS code just returns an identity operation
Hello; I wanted to report that the HLS version of torch.nn.LeakyReLU() doesn't actually multiply negative inputs by 0.01, as PyTorch does by default. The resulting output is just an identity operation.
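For reference, here is a plain-Python sketch of what torch.nn.LeakyReLU computes with its default negative_slope=0.01, versus what the generated HLS code appears to return (the function name and values are just for illustration):

```python
def leaky_relu(x, negative_slope=0.01):
    # Expected behavior: negatives are scaled by the slope, positives pass through
    return x if x >= 0 else negative_slope * x

print(leaky_relu(-2.0))  # -0.02, but the generated HLS code returns -2.0 (identity)
print(leaky_relu(3.0))   # 3.0 (positives are unchanged either way)
```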
Since I use LeakyReLU far more often than ReLU, and I imagine many others do too, I thought it was important to bring this up.
Thanks in advance.