
Probabilistic rounding when encoding fixed-points

Open mortendahl opened this issue 3 years ago • 2 comments

Using probabilistic (stochastic) rounding instead of flooring when encoding fixed-point values appears to improve accuracy from an ML perspective. This is noted in, for instance, KS'21.

This issue is about investigating these claims and potentially updating the code base to use probabilistic rounding.
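For concreteness, a minimal sketch of what stochastic rounding during encoding could look like. This is not Moose's actual API; the function names and the default fractional precision are illustrative:

```python
import numpy as np

def encode_fixedpoint_stochastic(x, frac_bits=24, rng=None):
    """Encode floats as fixed-point integers with stochastic rounding.

    Instead of flooring x * 2**frac_bits, round up with probability
    equal to the fractional remainder, so the rounding error is zero
    in expectation (unbiased).
    """
    rng = np.random.default_rng() if rng is None else rng
    scaled = np.asarray(x, dtype=np.float64) * (1 << frac_bits)
    floor = np.floor(scaled)
    frac = scaled - floor  # fractional part, in [0, 1)
    round_up = rng.random(size=frac.shape) < frac
    return (floor + round_up).astype(np.int64)

def decode_fixedpoint(q, frac_bits=24):
    """Decode fixed-point integers back to floats."""
    return np.asarray(q, dtype=np.float64) / (1 << frac_bits)
```

Because the rounding error is zero in expectation, errors tend to cancel across a tensor instead of accumulating the systematic downward bias that flooring introduces, which is the kind of accuracy benefit the claims refer to.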

mortendahl avatar Sep 07 '21 11:09 mortendahl

@yanndupis @jvmncs @ekloberdanz maybe something for you to look into during the current focus on accuracy?

mortendahl avatar Mar 04 '22 11:03 mortendahl

@mortendahl @yanndupis @jvmncs That is a good idea. Stochastic rounding has been leveraged for post-training quantization, where neural network weights and/or activations are quantized from f32 to int8 or lower to decrease memory usage and latency during inference. One paper proposes adaptive rounding: AdaRound. There is also a comprehensive pre-print on stochastic rounding, which is valuable not only for fixed-point computation but also for low-bit-width floating-point computation: Stochastic Rounding: Implementation, Error Analysis, and Applications. It was recommended to me by one of the co-authors, Nick Higham, who is one of the top experts in numerical analysis.
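To illustrate, the same idea applied to int8 post-training quantization might look like this sketch. The symmetric max-abs scale choice and clipping range are assumptions for the example, not a prescribed scheme:

```python
import numpy as np

def quantize_int8_stochastic(w, rng=None):
    """Quantize f32 weights to int8 with stochastic rounding.

    The scale maps the largest magnitude in w to 127; stochastic
    rounding keeps the quantization unbiased in expectation.
    Assumes w is not all zeros (no guard against scale == 0).
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = np.max(np.abs(w)) / 127.0
    scaled = w / scale
    floor = np.floor(scaled)
    round_up = rng.random(size=scaled.shape) < (scaled - floor)
    q = np.clip(floor + round_up, -128, 127).astype(np.int8)
    return q, scale
```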

ekloberdanz avatar Mar 04 '22 14:03 ekloberdanz