Rajeev Goel
Use "torch.bfloat16" instead of "torch.float16" in AMP.
@neurosynapse @byronyi This issue is caused by AMP using the torch.float16 dtype by default. Switch to torch.bfloat16 instead: it has the same dynamic range as float32, so the overflows that float16 runs into are avoided.
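A minimal sketch of the suggested fix, assuming a generic model under autocast (the model and input here are illustrative, not from the original issue):

```python
import torch

# Illustrative model and input; substitute your own.
model = torch.nn.Linear(16, 4)
x = torch.randn(8, 16)

# autocast defaults to float16 on CUDA; pass dtype=torch.bfloat16 explicitly.
# device_type="cpu" is used here only so the snippet runs without a GPU.
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    out = model(x)

print(out.dtype)  # the linear layer runs in bfloat16 under autocast
```

Note that with bfloat16 a `torch.cuda.amp.GradScaler` is generally unnecessary, since loss scaling exists to compensate for float16's narrow dynamic range.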