Can't reproduce the results from the article
Hi! I'm a student who is very interested in your framework. Recently I tried to reproduce the results described in the article "Up or Down? Adaptive Rounding for Post-Training Quantization" and got only 63.21% accuracy for MobileNet_V2. Could you please help me with that? Which parameters did you use to achieve 69.78% accuracy on the 4/32 config? I only changed the code in your PyTorch example, replacing the ResNet18 model with MobileNet_V2 from torchvision.
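For reference, the model swap was the only change I made to the example; a minimal sketch of it (assuming the ImageNet-pretrained torchvision weights) is:

```python
# The only change to the AIMET PyTorch example: swap ResNet18 for MobileNet_V2.
# Uses the ImageNet-pretrained weights from torchvision.
from torchvision.models import mobilenet_v2

model = mobilenet_v2(pretrained=True)
model = model.eval()
```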
Hi @Centient - Apologies for the delayed response, and thank you for your interest. Could you please share the config you used to achieve 63.21% accuracy? Perhaps we could review that information and provide inputs to improve it further.
Hi @quic-ssiddego, thank you for your answer. Actually, I just used your PyTorch script from the examples folder and changed the model from resnet18 to mobilenet_v2.
How did you get this number? My case is even worse: I used TensorFlow, and the FP (floating-point) model accuracy is only 0.22.
Hi! To be honest, I don't quite remember right now. Are you sure that you've installed everything exactly as they asked? There were moments when I had to make some changes in the code, but that was during my own experiments, and the authors have probably fixed everything by now.
@Centient @PaulZhangIsing Models are made available on the AIMET Model Zoo: https://github.com/quic/aimet-model-zoo. Please take a look at the PyTorch (https://github.com/quic/aimet-model-zoo/tree/develop/zoo_torch/) and TensorFlow (https://github.com/quic/aimet-model-zoo/tree/develop/zoo_tensorflow/) pages for more information, and let me know if you have further queries. (Also note: the model used is not MobileNet_V2 from torchvision.)
Hi,
I want to reproduce the results for torchvision-resnet18 from the paper "Up or Down? Adaptive Rounding for Post-Training Quantization". The configuration is 4-bit weights and 32-bit activations.
I am using the AdaRound algorithm in the same manner as the provided example (https://github.com/quic/aimet/blob/develop/Examples/torch/quantization/adaround.ipynb). I modified it to use the ImageNet train set for calibration, with batch size 32, 2048 samples, and 20K iterations (as in the paper).
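Roughly, my setup looks like the sketch below. It follows the aimet_torch AdaRound example; exact signatures may differ between AIMET versions, and `calib_loader` is assumed to be my ImageNet train-set DataLoader with batch size 32.

```python
# Sketch of my AdaRound setup, following the aimet_torch example notebook.
# Assumptions: `calib_loader` is an ImageNet train-set DataLoader (batch size 32);
# API names follow the example and may differ between AIMET versions.
import torch
from torchvision.models import resnet18
from aimet_common.defs import QuantScheme
from aimet_torch.adaround.adaround_weight import Adaround, AdaroundParameters

model = resnet18(pretrained=True).eval()

num_batches = 2048 // 32  # 2048 calibration samples at batch size 32
params = AdaroundParameters(data_loader=calib_loader,
                            num_batches=num_batches,
                            default_num_iterations=20000)  # 20K iterations, as in the paper

dummy_input = torch.rand(1, 3, 224, 224)
adarounded_model = Adaround.apply_adaround(
    model, dummy_input, params,
    path='./', filename_prefix='resnet18',
    default_param_bw=4,  # 4-bit weights, activations left at 32-bit
    default_quant_scheme=QuantScheme.post_training_tf_enhanced)
```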
However, the best accuracy I achieved is 66%, while you report 68.71% in the paper.
Could you please help me reproduce the results?