Expand AOTI tutorial to cover AMP
📚 The doc issue
I think it would be useful to expand the AOTI tutorial to cover AMP-trained models.
https://pytorch.org/tutorials/recipes/torch_export_aoti_python.html
Suggest a potential alternative/fix
No response
cc @svekars @brycebortree @sekyondaMeta @ezyang @chauhang @penguinwu @avikchaudhuri @gmagogsfm @zhxchen17 @tugsbayasgalan @angelayi @suo @ydwu4 @desertfire @chenyang78
/cc @agunapal @angelayi @svekars
@bhack Thanks for the suggestion. Will check
Let's move this to the tutorials repo.
What is the status of this with 2.6.0?
@bhack what sort of model are you looking at? Could you provide a pointer? I would think that an AMP-trained model behaves the same as any other model you pass to export?
Nothing specific, just a general guide on how to export/AOTI-compile an AMP model. Mainly not only models specifically trained with AMP, but also the case where we want to use AMP at inference.
E.g. do we need to wrap the model in autocast before export/compile? Etc.
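For context, here is a minimal sketch of the kind of recipe I mean, assuming the approach is to wrap the forward pass in an autocast region before calling `torch.export` and then package it as in the linked tutorial. The `AmpWrapper` class and the `resnet18` example model are just illustrations, and whether export is expected to trace through the autocast region (and whether this is the recommended pattern at all) is exactly what the tutorial could clarify:

```python
# Minimal sketch, not a verified recommendation: wrap the forward pass in an
# autocast region, export, then AOTI-compile/package as in the AOTI tutorial.
import torch
import torchvision


class AmpWrapper(torch.nn.Module):
    # Hypothetical wrapper: runs the wrapped model under autocast so the
    # mixed-precision casts (hopefully) end up in the exported graph.
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x):
        # Assumption: torch.export can trace through this autocast region.
        with torch.autocast(device_type="cuda", dtype=torch.float16):
            return self.model(x)


model = AmpWrapper(torchvision.models.resnet18().eval()).cuda()
example_inputs = (torch.randn(1, 3, 224, 224, device="cuda"),)

with torch.no_grad():
    exported = torch.export.export(model, example_inputs)

# Same packaging call as in the linked tutorial; the package_path is made up.
package_path = torch._inductor.aoti_compile_and_package(
    exported, package_path="resnet18_amp.pt2"
)
```

The open questions would then be whether this wrapping is needed at all, whether the same recipe applies to a model whose weights were trained with AMP, and how it interacts with `aoti_compile_and_package` options.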