Results: 23 comments by andrewor14

@pytorchbot rebase

@pytorchbot rebase

@pytorchbot merge

> @pytorchbot revert -m "broke ROCm, PR signal was clean but trunk was not, the merge should have been blocked but wasn't" -c weird

Hi @jeffdaily, could you point me...

@pytorchbot rebase

Hi @alexsamardzic, thanks for working on this. Just wanted to clarify: will this kernel support int4 groupwise per-channel weight quantization + int8 per-token dynamic activation quantization?
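To make the scheme in that question concrete, here is a minimal sketch of what the two quantization steps compute, assuming symmetric quantization with one int8 scale per activation token and one int4 scale per weight group within each output channel. The function names, group size, and int8 container for int4 values are illustrative assumptions, not the kernel under discussion or any torchao/PyTorch API:

```python
import torch

def quantize_activation_per_token_int8(x: torch.Tensor):
    # x: (num_tokens, hidden_dim); one dynamic scale per token (row)
    scale = x.abs().amax(dim=-1, keepdim=True).clamp(min=1e-5) / 127.0
    x_q = torch.clamp(torch.round(x / scale), -128, 127).to(torch.int8)
    return x_q, scale

def quantize_weight_per_group_int4(w: torch.Tensor, group_size: int = 32):
    # w: (out_features, in_features); one scale per (output channel, column group)
    out_f, in_f = w.shape
    assert in_f % group_size == 0
    w_grouped = w.reshape(out_f, in_f // group_size, group_size)
    scale = w_grouped.abs().amax(dim=-1, keepdim=True).clamp(min=1e-5) / 7.0
    # int4 values held in an int8 container here; real kernels pack two per byte
    w_q = torch.clamp(torch.round(w_grouped / scale), -8, 7).to(torch.int8)
    return w_q.reshape(out_f, in_f), scale.squeeze(-1)

# Example shapes: activations (tokens, hidden), weight (out_features, hidden)
x_q, x_scale = quantize_activation_per_token_int8(torch.randn(4, 64))
w_q, w_scale = quantize_weight_per_group_int4(torch.randn(128, 64), group_size=32)
```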

Hi @alugorey, thanks for fixing this. It seems there are two separate issues: one is forward AD and the other is the contiguous tensor assertion. Is it possible to separate...

> @andrewor14 Actually, on closer inspection, the reason I wrapped both of those issues into one is that they both fell under the umbrella of the same skip decorator in...

My concern is that the forward AD changes here will just become dead code in the future, since they are not aligned with our long-term plans of consolidating batch norm...