Yoshitomo Matsubara
torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available.
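torchdistill drives its distillation methods from configuration files rather than code. The core idea most of those methods build on, the temperature-softened distillation loss of Hinton et al. (2015), can be sketched as follows. This is an illustrative NumPy sketch of the math, not torchdistill's actual API; the function and variable names are made up for this example.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T flattens the distribution.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across T.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# A student whose logits roughly track the teacher's incurs a small loss.
teacher = [5.0, 2.0, 0.5]
student = [4.0, 2.5, 1.0]
print(distillation_loss(student, teacher))
```

In practice this soft-target term is combined with the usual cross-entropy on ground-truth labels, weighted by a hyperparameter; in torchdistill that combination is declared in a YAML configuration instead of written by hand.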
head-network-distillation
[IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Resource-constrained Object Detection in Edge Computing"
hnd-ghnd-object-detectors
[ICPR 2020] "Neural Compression and Filtering for Edge-assisted Real-time Object Detection in Challenged Networks" and [ACM MobiCom EMDL 2020] "Split Computing for Complex Object Detectors: Challenges and Preliminary Results"
supervised-compression
[WACV 2022] "Supervised Compression for Resource-Constrained Edge Computing Systems"
sc2-benchmark
[TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"