automatic-mixed-precision-tutorials-pytorch

Memory usage was not decreased when using amp

Open · joshhu opened this issue 4 years ago · 1 comment

Hi,

I tried your code a couple of days ago. It turned out that when I used the '--amp' flag, memory usage did not decrease (around 10 GB on my 2080 Ti with 11 GB). Training time was also almost the same as without AMP.
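For reference, a small helper like the one below can be used to compare peak GPU memory with and without AMP. This is a hypothetical illustration, not code from this repo; `peak_memory_mb` is a made-up name, and it assumes a CUDA device is available.

```python
import torch

def peak_memory_mb(step_fn):
    """Run one training step and report peak CUDA memory in MiB.

    step_fn: any zero-argument callable that performs a forward/backward
    pass (hypothetical; supply your own training step).
    Assumes torch.cuda.is_available() is True.
    """
    torch.cuda.reset_peak_memory_stats()
    step_fn()
    torch.cuda.synchronize()  # make sure all kernels have finished
    return torch.cuda.max_memory_allocated() / 2**20
```

Calling this once with an FP32 step and once with an AMP step makes the comparison explicit instead of relying on `nvidia-smi`, which also counts cached (not just allocated) memory.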

How did you get the results?

Thank you.

joshhu avatar Sep 11 '20 16:09 joshhu

Hi, thanks for opening the issue. I have had a similar experience with video recognition. Memory usage may decrease when the operations in the model are ones that AMP can actually run in reduced precision, but otherwise the effect may not be visible.

For a detailed cause analysis, we would need to look at the code and architecture used in the experiment. Thanks!

hoya012 avatar Sep 14 '20 01:09 hoya012