s4
Does the current codebase support fp16/bf16?
I tried to run the code with bf16 enabled but encountered some error messages. I also searched this repository for the keywords "bf16" and "fp16" and found no results. Does this mean the codebase does not currently support low-precision training?
I've used mixed precision (fp16) in the past through PyTorch Lightning's automatic features. It's as simple as passing a flag to the Trainer: https://lightning.ai/docs/pytorch/stable/common/precision_intermediate.html
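For reference, here's a minimal sketch of what that looks like (assuming the Lightning 2.x precision strings; `MyLightningModule` and `my_dataloader` are hypothetical placeholders for your own model and data):

```python
# Minimal sketch of enabling mixed precision via PyTorch Lightning's Trainer.
# MyLightningModule and my_dataloader are hypothetical placeholders.
import pytorch_lightning as pl

model = MyLightningModule()

# fp16 mixed precision (older Lightning versions accept precision=16 instead)
trainer = pl.Trainer(precision="16-mixed", max_epochs=10)

# bf16 mixed precision, e.g. on Ampere+ GPUs or TPUs:
# trainer = pl.Trainer(precision="bf16-mixed", max_epochs=10)

trainer.fit(model, my_dataloader)
```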