
Recommendation for the 1.58-bit algorithm implementation

Open princepride opened this issue 1 year ago • 1 comments

https://github.com/kyegomez/BitNet/blob/914bad9ba188dfc32e34a0a0a9ee042d7962e604/bitnet/bitbnet_b158.py#L52

I noticed that you're attempting to implement 1.58-bit quantization, but it seems you only quantize the values during the forward pass and then run inference, while using the original values for the backward pass. In 4-bit quantization, two quantized values are stored in a single byte, and the computation and gradients for the new data type are implemented in CUDA. You should consider this approach as well. Keep it up, I'm rooting for you.
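To illustrate the storage trick mentioned above, here is a minimal sketch (not code from this repo) of packing two 4-bit values into one byte, which is how 4-bit quantization schemes halve memory use; the function names are hypothetical:

```python
def pack_nibbles(a: int, b: int) -> int:
    """Pack two 4-bit values (each in 0..15) into a single byte."""
    assert 0 <= a < 16 and 0 <= b < 16
    return (a << 4) | b  # high nibble = a, low nibble = b

def unpack_nibbles(byte: int) -> tuple[int, int]:
    """Recover the two 4-bit values from one packed byte."""
    return (byte >> 4) & 0xF, byte & 0xF
```

In real 4-bit kernels this packing lives on the GPU side, with custom CUDA kernels doing the unpack-and-multiply, but the byte layout is the same idea.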

princepride avatar Mar 15 '24 07:03 princepride

I think what you're saying is that they should use the 1.58-bit quantization for the backward pass as well? It's not really discussed in the paper for whatever reason, but the 1.58-bit quantization destroys the gradient, making backprop through it impossible, so they keep the original full-precision weights for backprop while using the quantized weights for the forward pass.
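The scheme described above (quantized forward pass, full-precision backward pass) is the straight-through estimator. A minimal PyTorch sketch, assuming the absmean ternary quantization from the BitNet b1.58 paper (this is an illustration, not the repo's actual code):

```python
import torch

def weight_quant_ste(w: torch.Tensor) -> torch.Tensor:
    # Absmean scaling: scale by the mean absolute weight,
    # round to the ternary set {-1, 0, +1}, then rescale.
    scale = w.abs().mean().clamp(min=1e-5)
    q = (w / scale).round().clamp(-1, 1) * scale
    # Straight-through estimator: the forward value is q, but
    # detach() hides the non-differentiable round/clamp from autograd,
    # so the gradient flows to w as if this were the identity.
    return w + (q - w).detach()

w = torch.randn(4, 4, requires_grad=True)
wq = weight_quant_ste(w)   # forward pass sees ternary weights
wq.sum().backward()        # backward pass updates full-precision w
```

This is why the full-precision weights must be kept around during training: `round()` has zero gradient almost everywhere, so without the detach trick no learning signal would reach the weights.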

AwaitFuture avatar Mar 22 '24 07:03 AwaitFuture

Stale issue message

github-actions[bot] avatar May 21 '24 12:05 github-actions[bot]