Repositories tagged with the patient-knowledge-distillation topic
PKD-for-BERT-Model-Compression
Stars: 194 · Forks: 45
PyTorch implementation of "Patient Knowledge Distillation for BERT Model Compression" (Sun et al., EMNLP 2019), in which a small student BERT learns not only from the teacher's output logits but also from its intermediate-layer hidden states.
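The "patient" part of PKD is a loss term that matches the student's intermediate-layer representations to a subset of the teacher's. A minimal sketch of that term in PyTorch is below; the function name, the use of per-layer hidden states, and the assumption that the caller has already selected the teacher layers to match (e.g. the PKD-Skip strategy of taking every k-th layer) are illustrative, not lifted from the repository's code.

```python
import torch
import torch.nn.functional as F

def patient_loss(student_hidden, teacher_hidden):
    """Patient distillation term: mean squared error between
    L2-normalized hidden states of paired student/teacher layers.

    student_hidden, teacher_hidden: lists of tensors of shape
    (batch, hidden_dim), one per matched layer. The caller is
    assumed to have already chosen which teacher layers to pair
    with the student's layers (layer selection is not shown here).
    """
    assert len(student_hidden) == len(teacher_hidden)
    loss = torch.zeros(())
    for s, t in zip(student_hidden, teacher_hidden):
        # Normalize so the loss compares directions of the
        # representations, not their raw magnitudes.
        s_norm = F.normalize(s, p=2, dim=-1)
        t_norm = F.normalize(t, p=2, dim=-1)
        loss = loss + F.mse_loss(s_norm, t_norm)
    return loss / len(student_hidden)

# Toy usage: 3 matched layers, batch of 4, hidden size 8.
student = [torch.randn(4, 8) for _ in range(3)]
teacher = [torch.randn(4, 8) for _ in range(3)]
pl = patient_loss(student, teacher)
```

In the paper this term is added to the usual distillation objective (soft-label cross-entropy against the teacher plus the task loss), weighted by a hyperparameter.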