TorchSharp
Autocast
Soon I will try to implement AMP (Automatic Mixed Precision) with a GradScaler.
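For context, this targets the standard PyTorch AMP training pattern. Below is a rough sketch of what that might look like in TorchSharp; note that `GradScaler` and `NewAutocast` are exactly the APIs this PR sets out to add, so every call marked hypothetical does not exist in the library yet, and `model`, `optimizer`, and `dataLoader` stand in for whatever your training setup provides.

```csharp
// Hypothetical sketch mirroring PyTorch's torch.cuda.amp training step.
// GradScaler and NewAutocast are the proposed APIs; they do not exist yet.
var scaler = new GradScaler();                    // hypothetical type
foreach (var (input, target) in dataLoader)       // stand-in data source
{
    optimizer.zero_grad();
    torch.Tensor loss;
    using (var ac = torch.NewAutocast())          // proposed scope API
    {
        // the forward pass runs in reduced precision inside the scope
        var output = model.forward(input);
        loss = nn.functional.cross_entropy(output, target);
    }
    // backward and the optimizer step run outside the scope, in float32
    scaler.scale(loss).backward();                // hypothetical
    scaler.step(optimizer);                       // hypothetical
    scaler.update();                              // hypothetical
}
```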
@dotnet-policy-service agree
@haytham2597 -- thank you for your first PR! Much appreciated. Please see the comment I made in the review.
Do not merge; I still have some issues.
Lots of errors in the build on everything except the .NET FX builds (which don't have System.Range):
https://dev.azure.com/dotnet/TorchSharp/_build/results?buildId=103093&view=logs&j=80b813b5-9a08-5859-11a8-dc0e5b556e52&t=d3977768-5d05-5555-eccf-169680cb7093
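As background for anyone hitting the same thing: System.Range is the type behind C#'s `a..b` index syntax, and the .NET Framework targets do not ship it. The sketch below shows range-free spellings of a slice using TorchSharp calls that, to my knowledge, exist on all targets; the commented line is the range-based form that needs System.Range.

```csharp
var t = torch.arange(10);
// var slice = t[0..5];                  // range syntax: requires System.Range
var slice = t[TensorIndex.Slice(0, 5)];  // explicit index, works on .NET FX
var same  = t.narrow(0, 0, 5);           // dim 0, start 0, length 5
```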
I am very happy to see this proposal.
@haytham2597 -- just a gentle ping! I think this PR would be very valuable, but it's still a draft, and thus I will not merge it. I also had some comments in my review.
Yeah, but sorry, I am very busy with my studies and work. I need to manage my time well to make progress on this pull request; it would be very useful for me, too. But I can share the idea behind it if you want to continue.
- While inside an autocast scope, tensors should automatically be converted to the autocast dtype. For example:
```csharp
torch.Tensor a = torch.rand(10);       // float32, created outside the scope
using (var ac = torch.NewAutocast())
{
    torch.Tensor b = a;                // should become float16 inside the scope
    torch.Tensor c = torch.arange(10); // factory calls inside the scope as well
}
```
Both b and c should be automatically converted to float16 (if that is the mixed-precision dtype that float32 maps to), including all weights/biases of any modules found inside the scope; for example, a ResNet module should be moved to mixed precision.
The idea is very similar to what you do with `using (var d = torch.NewDisposeScope())`. In the outer scope, tensors need to go back to their original dtype, because the backward pass should run in the original dtype (as I understand it). With my external THS_Autocast you can set the dtype that should be used and whether autocast is enabled or disabled. I don't know if I explained myself correctly, but feel free to ask.
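To make the scope idea concrete, here is a minimal managed-side sketch of how such a scope could behave, in the style of NewDisposeScope. It is illustrative only: the AutocastScope class and its thread-static state are invented for this sketch, whereas the PR itself tracks the state natively through THS_Autocast.

```csharp
using System;
using static TorchSharp.torch;

// Illustrative stand-in for the proposed autocast scope; the real PR
// routes this state through native THS_Autocast calls instead.
public sealed class AutocastScope : IDisposable
{
    [ThreadStatic] private static ScalarType? _current;

    private readonly ScalarType? _previous;

    public AutocastScope(ScalarType dtype)
    {
        _previous = _current;   // remember the enclosing scope's dtype (if any)
        _current = dtype;       // tensors created from here on cast to this dtype
    }

    // Tensor factories and ops would consult this to decide whether to down-cast.
    public static ScalarType? CurrentDtype => _current;

    // Leaving the scope restores the outer dtype, so later work
    // (e.g. the backward pass) runs in the original precision again.
    public void Dispose() => _current = _previous;
}
```

With that shape, `using (var ac = new AutocastScope(ScalarType.Float16)) { ... }` nests and unwinds exactly like a dispose scope, which matches the behavior described above.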
Yeah, no pressure!
We all have other things to do, so I understand completely. Just wanted to let you know we haven't forgotten about your work, and that it will be appreciated, if and when you find time.