quantized_distillation
Implements quantized distillation. Code for our paper "Model compression via distillation and quantization"
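To illustrate the two ingredients the paper's title refers to, here is a minimal, self-contained sketch (not the repository's actual implementation): a uniform weight quantizer and a temperature-softened distillation loss. The function names `quantize_uniform` and `distillation_loss` are illustrative, and the quantization grid and KL direction are common conventions assumed here, not taken from the repo.

```python
import math

def quantize_uniform(w, num_bits=4):
    """Illustrative uniform quantizer: snap each value in w to one of
    2**num_bits evenly spaced levels spanning [min(w), max(w)]."""
    lo, hi = min(w), max(w)
    levels = 2 ** num_bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return [lo + round((x - lo) / scale) * scale for x in w]

def softmax(logits, temperature=1.0):
    """Numerically stable softmax over temperature-scaled logits."""
    z = [l / temperature for l in logits]
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    the standard distillation objective (assumed here for illustration)."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

In quantized distillation, the student's weights are kept on such a quantization grid while the student is trained against the teacher's softened outputs via a loss of this form.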
Hi! I ran cifar10_test.py. The teacher network trains successfully, but it then fails with the following error. I have set TRAIN_DISTILLED_MODEL = True, but this does not fix the issue....
Bumps [torch](https://github.com/pytorch/pytorch) from 0.3.1 to 2.2.0.