nncf
How to export a torch.fx-converted int8 model to ONNX?
Hi, PyTorch has officially released an easy and controllable quantization workflow based on torch.fx. However, there still seems to be a gap between what torch supports and what users want. I'm wondering whether NNCF would support exporting an fx-converted int8 model to ONNX, so that we can make full use of PyTorch and NNCF for deployment.
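For context, here is a minimal sketch of the workflow being asked about: FX graph-mode post-training quantization followed by a plain ONNX export attempt. The model, input shape, and export call are illustrative placeholders, not taken from the original post.

```python
import torch
from torch.ao.quantization import get_default_qconfig_mapping
from torch.ao.quantization.quantize_fx import prepare_fx, convert_fx

# Placeholder model and example input
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, 3),
    torch.nn.ReLU(),
).eval()
example_inputs = (torch.randn(1, 3, 32, 32),)

# FX graph-mode post-training quantization
qconfig_mapping = get_default_qconfig_mapping("fbgemm")
prepared = prepare_fx(model, qconfig_mapping, example_inputs)
prepared(*example_inputs)          # calibration pass
quantized = convert_fx(prepared)   # int8 GraphModule

# The step in question: exporting the int8 GraphModule to ONNX.
# A direct export like this may fail or emit backend-specific quantized ops,
# which is the gap described above.
torch.onnx.export(quantized, example_inputs, "quantized_model.onnx", opset_version=13)
```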
Hi, sorry for the late response. There is currently no roadmap for torch.fx support. We will let you know if there are any changes. Thank you!
@vshampor, FYI, this is related to Torch.FX.
Ref. 138686