How to set the precision of the backpropagation gradients during quantized training with the brevitas library?
I want to use the brevitas library for quantization-aware training, and I would like to set the quantization precision of the gradients. How can I do that? So far I have only constructed the forward network, where I can set the quantization precision of every layer (see the sketch below). For backpropagation I just call loss.backward(), so I can't see or control the backward-pass mechanism. Please help me. Thanks.
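
For reference, here is a minimal sketch of the setup I mean (the model, layer names, and bit widths are illustrative assumptions, not my actual network): forward layers get an explicit bit width through brevitas, but the backward pass is just a plain loss.backward() with nothing controlling gradient precision.

```python
import torch
import torch.nn as nn
import brevitas.nn as qnn

# Illustrative model: forward-pass quantization precision is set per layer,
# but nothing here controls the precision of the backward-pass gradients.
class QuantMLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = qnn.QuantLinear(784, 256, bias=True, weight_bit_width=4)
        self.relu = qnn.QuantReLU(bit_width=4)
        self.fc2 = qnn.QuantLinear(256, 10, bias=True, weight_bit_width=4)

    def forward(self, x):
        return self.fc2(self.relu(self.fc1(x)))

model = QuantMLP()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

x = torch.randn(32, 784)          # dummy input batch
y = torch.randint(0, 10, (32,))   # dummy labels

loss = criterion(model(x), y)
optimizer.zero_grad()
loss.backward()   # the gradients here are plain float tensors; no bit width is set
optimizer.step()
```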