Knowledge-Distillation-Toolkit
:no_entry: [DEPRECATED] A knowledge distillation toolkit based on PyTorch and PyTorch Lightning.
Hello guys, I'm starting to learn this approach. I tried several ways to install wav2letter (e.g. installing flashlight using vcpkg), but unfortunately, once the installation is done, this command `from wav2letter.decoder...
Hi there. I want to quantize a wav2vec model with your script, but I get a lot of errors when I try to run it. How can I run it? I...
```
KD_resnet.start_kd_training()
LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]
  | Name          | Type         | Params
-----------------------------------------------
0 | student_model | StudentModel | 11.2 M
1 | teacher_model | TeacherModel | 21.3 M
-----------------------------------------------
...
```
I have observed that the DataLoader is slower than a normal for loop. I think it has something to do with the preprocessing in the collate function. Could you please check?
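One common cause of this slowdown, sketched below under assumptions (the dataset shape and the normalization step are hypothetical stand-ins, not the toolkit's actual preprocessing): when heavy per-sample work lives in `collate_fn` and `num_workers=0`, it all runs in the main process. Moving per-sample preprocessing into `__getitem__` and keeping `collate_fn` to a cheap stack/pad lets DataLoader workers prefetch batches in parallel.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class AudioDataset(Dataset):
    """Toy dataset standing in for the wav2vec-style inputs in the issue."""
    def __init__(self, n=8, length=16000):
        self.data = [torch.randn(length) for _ in range(n)]

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        # Per-sample preprocessing (here a hypothetical normalization)
        # belongs in __getitem__: with num_workers > 0 it runs in worker
        # processes instead of blocking the training loop.
        x = self.data[idx]
        return (x - x.mean()) / (x.std() + 1e-8)

def collate(batch):
    # Keep collate_fn cheap: only stack (or pad), no feature extraction.
    return torch.stack(batch)

loader = DataLoader(AudioDataset(), batch_size=4, collate_fn=collate,
                    num_workers=2)  # workers prefetch batches in the background

for batch in loader:
    print(batch.shape)  # torch.Size([4, 16000])
```

Whether this helps depends on where the time actually goes; profiling one batch with and without workers is the quickest way to confirm.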
Hi Georgian, could you please share details on how you installed the wav2letter package? I tried following the instructions specified in the original repository, but I'm still facing an...
I want to directly load the quantized model. How do I save the wav2vec model properly?
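A minimal sketch of one way to do this, assuming dynamic quantization was used (the tiny `nn.Sequential` here is a hypothetical stand-in for the wav2vec model, not the toolkit's actual architecture): save the quantized model's `state_dict`, then at load time rebuild the float skeleton, re-apply the same quantization, and load the saved weights into that quantized skeleton.

```python
import io
import torch
import torch.nn as nn

# Hypothetical stand-in for the wav2vec model from the issue.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
model.eval()

# Dynamic quantization swaps Linear layers for quantized equivalents.
qmodel = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8)

# Save only the quantized state dict (to a buffer here; a file path works too).
buf = io.BytesIO()
torch.save(qmodel.state_dict(), buf)
buf.seek(0)

# To load directly: rebuild the float model, quantize it the same way,
# then load the quantized state dict into the quantized skeleton.
reloaded = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
reloaded.eval()
reloaded = torch.quantization.quantize_dynamic(
    reloaded, {nn.Linear}, dtype=torch.qint8)
reloaded.load_state_dict(torch.load(buf, weights_only=False))

x = torch.randn(1, 16)
assert torch.equal(qmodel(x), reloaded(x))
```

The key point is that the state dict of a quantized model only loads into a model that has already been quantized with the same config; loading it into the plain float model fails with key/type mismatches. (`weights_only=False` is needed on recent PyTorch versions because quantized state dicts contain packed parameters.)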