cryodrgn
torch.cuda.amp vs apex.amp
Describe the bug
It looks like we're using a mix of torch.cuda.amp and apex.amp ... can we rationalize this?

Expected behavior
Maybe use torch.cuda.amp everywhere.
Use of apex.amp is a historic relic from the days before PyTorch natively supported amp (version 1.6+ iirc). I kept it in after adding torch.cuda.amp support to maintain backwards compatibility.

Maybe now is a good time to switch to torch.cuda.amp everywhere (i.e. before our next release, which is shaping up to be a major one).
@vineetbansal
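
For reference, here's roughly what a training step looks like with native torch.cuda.amp (autocast plus GradScaler). The model, optimizer, and data below are just placeholders, not cryodrgn's actual training loop, and it assumes a CUDA device is available:

import torch
from torch.cuda.amp import autocast, GradScaler

# Placeholder model/optimizer/data, not cryodrgn's training code.
model = torch.nn.Linear(10, 1).cuda()
optimizer = torch.optim.Adam(model.parameters())
scaler = GradScaler()

x, y = torch.randn(8, 10).cuda(), torch.randn(8, 1).cuda()
optimizer.zero_grad()
with autocast():  # forward pass runs in mixed precision
    loss = torch.nn.functional.mse_loss(model(x), y)
scaler.scale(loss).backward()  # scale the loss to avoid fp16 gradient underflow
scaler.step(optimizer)         # unscale gradients, then step the optimizer
scaler.update()                # adjust the scale factor for the next iteration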
@zhonge I think that keeping backwards compatibility would be cool, but I don't think that currently works, because you explicitly use torch.cuda.amp in some places. If you wanted to do that, you could do something like:
try:
    from torch.cuda import amp  # native AMP, available in PyTorch 1.6+
except ImportError:
    try:
        from apex import amp  # fall back to NVIDIA apex on older PyTorch
    except ImportError:
        amp = None
        log("Couldn't find amp. Will not run in half precision")
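
Even with that fallback, though, the call sites aren't interchangeable: apex.amp wraps the model and optimizer up front with amp.initialize and scales the loss through amp.scale_loss, while torch.cuda.amp uses autocast plus a GradScaler, so the training loop would still need to branch on which backend it got. For contrast with the torch.cuda.amp sketch above, here's roughly what the apex.amp call pattern looks like (again with a placeholder model and data, assuming apex is installed and a GPU is available):

import torch
from apex import amp  # assumes NVIDIA apex is installed

# Placeholder model/optimizer/data, not cryodrgn's training code.
model = torch.nn.Linear(10, 1).cuda()
optimizer = torch.optim.Adam(model.parameters())

# apex wraps the model/optimizer once, up front, instead of using autocast
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

x, y = torch.randn(8, 10).cuda(), torch.randn(8, 1).cuda()
optimizer.zero_grad()
loss = torch.nn.functional.mse_loss(model(x), y)
with amp.scale_loss(loss, optimizer) as scaled_loss:  # loss scaling via context manager
    scaled_loss.backward()
optimizer.step()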