DeepShift
Round to Fixed to Deal with Unsigned Tensors
We need to handle unsigned tensors, which have no sign bit. This is the case for activations fed into a convolution, since they are usually the output of a ReLU layer and therefore non-negative. Dropping the sign bit saves us one bit when quantizing activations to low bitwidths.
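As a rough illustration of the idea (a minimal sketch, not DeepShift's actual implementation — the function name, bit widths, and fractional split below are assumptions), quantizing a non-negative tensor to unsigned fixed point lets all `bits` bits encode magnitude, so the representable range is twice that of a signed format with the same total width:

```python
def round_to_fixed_unsigned(values, bits=8, frac_bits=4):
    """Quantize non-negative (post-ReLU) values to unsigned fixed point.

    With no sign bit, all `bits` bits carry magnitude, so the top of the
    representable range is (2**bits - 1) / 2**frac_bits, versus
    (2**(bits - 1) - 1) / 2**frac_bits for a signed format of equal width.
    """
    scale = 2 ** frac_bits          # resolution of the fractional part
    max_q = 2 ** bits - 1           # largest unsigned code word
    out = []
    for v in values:
        q = min(max(round(v * scale), 0), max_q)  # round, then clamp to range
        out.append(q / scale)       # dequantize back to float for comparison
    return out

# With 4 bits total and 2 fractional bits, values quantize onto a
# grid of step 0.25 and clamp at 15 / 4 = 3.75:
print(round_to_fixed_unsigned([0.0, 1.3, 20.0], bits=4, frac_bits=2))
# → [0.0, 1.25, 3.75]
```

A signed 4-bit format with the same fractional split would clamp at 7 / 4 = 1.75, illustrating the one bit of range recovered by treating ReLU outputs as unsigned.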