knowledge-distillation-pytorch
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
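For readers new to the repo, the core technique it explores is Hinton-style soft-target distillation. A minimal sketch of that loss follows; it is not taken verbatim from this repo, and the temperature `T` and weight `alpha` are hyperparameters you would tune:

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between temperature-softened
    # distributions, scaled by T^2 to keep gradient magnitudes
    # comparable across temperatures (Hinton et al., 2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```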
I have downloaded the .zip file from the Box folder, but it can't be unzipped successfully: after unzipping, the .tar file becomes a .tar.cpgz file. I have also tried to...
Did you use FitNets to distill the model? (FitNets: Hints for Thin Deep Nets)
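For anyone with the same question: FitNets adds a hint loss on intermediate feature maps, which is distinct from logit-based KD. A rough sketch of such a hint loss, assuming you can hook matching feature maps from both networks (the channel counts below are placeholders, not the repo's values):

```python
import torch.nn as nn

class HintLoss(nn.Module):
    """FitNets-style hint loss: regress the student's intermediate
    feature map into the teacher's feature space, then penalize
    the L2 distance between them."""
    def __init__(self, student_channels=64, teacher_channels=256):
        super().__init__()
        # 1x1 conv regressor maps student features to the teacher's width.
        self.regressor = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)
        self.mse = nn.MSELoss()

    def forward(self, student_feat, teacher_feat):
        return self.mse(self.regressor(student_feat), teacher_feat)
```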
This is my situation. I trained base_cnn in advance on the CIFAR-10 dataset to compare the performance of base_cnn and cnn_distill. I also trained base_resnet18 as a teacher on the same dataset. Lastly,...
How can I download the data in the Box folder onto a server using Linux commands?
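One hedged suggestion, not a confirmed answer: Box's "Share" dialog can produce a direct-download link, which a headless server can then fetch with `wget` or from Python. The URL below is a placeholder, not the repo's actual data link:

```python
import urllib.request

# Placeholder address: substitute the direct-download URL taken
# from Box's "Share" dialog for the actual data file.
url = "https://app.box.com/shared/static/<file-id>.tar"
urllib.request.urlretrieve(url, "data.tar")
```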
In the code, the dataloader's 'shuffle' switch is set to True, so the precomputed teacher outputs cannot actually line up with the student's batches.
I printed the first 32 labels of the train dataloader for the teacher net and got: `14, 8, 29, 67, 59, 49, 73, 25, 4, 76, 11, 25, 82, 6, 11, 47, ...`
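If the problem is that teacher outputs were saved in one pass and then consumed in a differently shuffled order, one workaround is to run the teacher's forward pass on each batch inside the training loop, so the pairing survives shuffle=True. A sketch under that assumption; all names here are illustrative, not the repo's actual API:

```python
import torch

def distill_step(student, teacher, criterion, optimizer, inputs, labels):
    # Running the teacher on the very batch the student sees keeps
    # teacher and student outputs aligned regardless of shuffling.
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(inputs)
    student_logits = student(inputs)
    loss = criterion(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```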
https://github.com/peterliht/knowledge-distillation-pytorch/blob/master/experiments/base_cnn/train.log

```
2018-03-09 20:46:06,587:INFO: Loading the datasets...
2018-03-09 20:46:10,074:INFO: - done.
2018-03-09 20:46:10,078:INFO: Starting training for 30 epoch(s)
2018-03-09 20:51:27,485:INFO: Loading the datasets...
2018-03-09 20:51:30,918:INFO: - done.
2018-03-09 20:51:30,922:INFO: Starting...
```
I modified the code and now get an error; does anybody have any idea why? I am using the CPU. The error is on this line: `--> 10 output_teacher_batch =...`
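Without the full traceback this is only a guess, but an error on that line on a CPU-only machine is often caused by a hard-coded `.cuda()` call. A device-agnostic sketch; the `nn.Linear` teacher and random batch are stand-ins for the real ones:

```python
import torch
import torch.nn as nn

# Selecting the device at runtime lets the same code run on
# CPU-only machines, avoiding errors from hard-coded .cuda() calls.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

teacher = nn.Linear(10, 2).to(device)   # stand-in for the real teacher net
data_batch = torch.randn(32, 10).to(device)

with torch.no_grad():                   # no gradients needed for the teacher
    output_teacher_batch = teacher(data_batch)
```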
Why does `import pytorch_lightning as pl` show `No module named 'torch._dynamo'`? How can I solve it? Thanks.
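A likely cause, offered as an assumption rather than a confirmed diagnosis: `torch._dynamo` only ships with PyTorch 2.0 and later, while recent pytorch_lightning releases import it, so an older torch paired with a newer lightning fails at import time. Checking the installed torch is a quick first step:

```python
import importlib.util
import torch

# torch._dynamo first shipped with PyTorch 2.0; an older torch
# paired with a newer pytorch-lightning raises this exact error.
# Upgrading torch, or pinning pytorch-lightning to a release that
# matches the installed torch, typically resolves the mismatch.
print(torch.__version__)
print("torch._dynamo present:", importlib.util.find_spec("torch._dynamo") is not None)
```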