
Bug in UniformQuantize class

Open · michaelklachko opened this issue on Sep 30, 2018 · 0 comments

Hi, thank you for posting your code!

I think there's a mismatch in the argument order between the `forward()` definition and the `UniformQuantize().apply(...)` call site:

```python
# forward() signature (ctx is bound automatically by .apply()):
def forward(cls, ctx, input, num_bits=8, min_value=None, max_value=None,
            stochastic=False, inplace=False, enforce_true_zero=False,
            num_chunks=None, out_half=False)

# Call site -- after the first four arguments the positional order diverges:
# num_chunks -> stochastic, stochastic -> inplace, inplace -> enforce_true_zero
UniformQuantize().apply(x, num_bits, min_value, max_value,
                        num_chunks, stochastic, inplace)
```

Among other potential issues, this causes the `stochastic` argument to receive the value of `num_chunks`; whenever `num_chunks` is a non-`None` number it is truthy, silently enabling "stochastic" rounding.
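For illustration, here's a minimal, self-contained sketch of the misbinding (hypothetical standalone function, not the repo's actual code) and one possible fix using keyword arguments:

```python
# Standalone stand-in mimicking the forward() signature above
# (no ctx/cls, just the argument order that matters here).
def forward(input, num_bits=8, min_value=None, max_value=None,
            stochastic=False, inplace=False, enforce_true_zero=False,
            num_chunks=None, out_half=False):
    return {"stochastic": stochastic, "inplace": inplace,
            "enforce_true_zero": enforce_true_zero, "num_chunks": num_chunks}

# Buggy positional call, mirroring the apply() call site:
# num_chunks=16 is passed fifth, so it binds to `stochastic`.
print(forward("x", 8, 0.0, 1.0, 16, False, False))
# {'stochastic': 16, 'inplace': False,
#  'enforce_true_zero': False, 'num_chunks': None}

# Keyword arguments make the binding explicit and immune to reordering:
print(forward("x", num_bits=8, min_value=0.0, max_value=1.0,
              num_chunks=16, stochastic=False, inplace=False))
# {'stochastic': False, 'inplace': False,
#  'enforce_true_zero': False, 'num_chunks': 16}
```

One caveat: `torch.autograd.Function.apply` forwards only positional arguments to `forward()`, so keywords can't be used at the `apply()` call site itself; the practical fix there would presumably be reordering the positional arguments to match the signature.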

michaelklachko · Sep 30, 2018