
Add float16 support to types

Open Markus-Goetz opened this issue 6 years ago • 5 comments

Markus-Goetz · Dec 05 '18 10:12

PyTorch currently implements fp16 support for GPUs only. This issue will have to be resolved after HeAT has device support.

A first implementation idea: check which device the HeAT tensor is to be allocated on and disallow the CPU backend for float16.
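
A minimal sketch of that guard, assuming a hypothetical helper called from Heat's tensor factories; the way a Heat `Device` stringifies (e.g. `"cpu:0"`) is an assumption here, as is the helper name:

```python
import heat as ht


def _reject_cpu_float16(device: ht.Device) -> None:
    # Hypothetical guard: assumes str(device) starts with the device type,
    # e.g. "cpu:0" or "gpu:0", which may differ across Heat versions.
    if str(device).startswith("cpu"):
        raise NotImplementedError(
            "float16 is currently supported on GPU devices only; "
            "allocate this tensor on a GPU instead"
        )


# usage sketch inside a factory function such as ht.zeros:
# _reject_cpu_float16(device if device is not None else ht.get_device())
```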

Markus-Goetz · Feb 21 '19 10:02

PyTorch has basic operations for float16 and bfloat16 by now. The function support, however, has as many holes as Swiss cheese.

mtar · Feb 24 '21 11:02

> The function support, however, has as many holes as Swiss cheese.

Yes, I tried to add float16, but many tests failed afterwards because torch.ceil, torch.floor, and several other functions were not implemented for float16.
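
One way to keep track of those gaps is a small probe that runs a list of ops against a float16 CPU tensor and collects the ones that raise; the op list below is just a sample, not Heat's actual coverage matrix:

```python
import torch


def probe_float16_support(ops, device="cpu"):
    """Return the names of ops that raise on a float16 tensor."""
    # Build the tensor in float32 first, since some creation ops have
    # historically lacked Half kernels themselves.
    x = (torch.arange(8, dtype=torch.float32) / 3).to(torch.float16).to(device)
    unsupported = []
    for op in ops:
        try:
            op(x)
        except (RuntimeError, NotImplementedError):
            unsupported.append(op.__name__)
    return unsupported


# Sample of the ops mentioned above plus a few others; extend as needed.
print(probe_float16_support([torch.ceil, torch.floor, torch.exp, torch.erf]))
```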

shahpratham · Apr 12 '22 12:04

This issue is still open, and many torch functions still don't support the Half data type.

Proposal:

  • introduce an ht.float16 datatype
  • introduce a @SkipWithHalf (or similar) decorator for those tests that would otherwise fail (a sketch follows this list)
  • whenever a new PyTorch version is out, test without the decorator and update.
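
A minimal sketch of such a decorator, assuming Heat's unittest-based test suite; the name skip_with_half and the reason text are illustrative stand-ins for the @SkipWithHalf idea above:

```python
import unittest


def skip_with_half(reason="not yet implemented for float16 in PyTorch"):
    """Mark a test as skipped under float16; grepping for this decorator
    yields the list of tests to re-check on each new PyTorch release."""
    return unittest.skip("float16: " + reason)


# usage:
# class TestRounding(unittest.TestCase):
#     @skip_with_half("torch.ceil raises on Half CPU tensors")
#     def test_ceil(self):
#         ...
```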

Reviewed within #1109

ClaudiaComito · Jul 31 '23 09:07