Yoshitomo Matsubara

5 repositories owned by Yoshitomo Matsubara

torchdistill

1.3k stars · 124 forks

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Traine...
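
The knowledge distillation mentioned in the description is the training scheme in which a small student network is trained to match a larger teacher's softened outputs in addition to the ground-truth labels. A minimal PyTorch sketch of that idea follows; it uses plain torchvision models and assumed hyperparameters and is not torchdistill's config-driven API.

# Illustrative sketch of vanilla knowledge distillation; model choices and
# hyperparameters (temperature, alpha) are assumptions, not torchdistill code.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18, resnet50

teacher = resnet50(weights="IMAGENET1K_V1").eval()  # frozen pretrained teacher
student = resnet18()                                # student to be trained
optimizer = torch.optim.SGD(student.parameters(), lr=0.1, momentum=0.9)
temperature, alpha = 4.0, 0.5                       # assumed hyperparameters

def distillation_step(images, labels):
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)
    # Soft-target loss: KL divergence between temperature-softened distributions
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard-target loss: standard cross-entropy against ground-truth labels
    ce_loss = F.cross_entropy(student_logits, labels)
    loss = alpha * kd_loss + (1.0 - alpha) * ce_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy batch just to show the call; real training would loop over a DataLoader.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 1000, (8,))
print(distillation_step(images, labels))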

head-network-distillation

30 stars · 5 forks

[IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Net...
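
Head network distillation, as the paper titles indicate, splits a deep network into a head that runs on a resource-constrained edge device and a tail that runs on a server. The sketch below only illustrates the splitting step on a plain torchvision ResNet-50 at an arbitrary, assumed split point; it is not this repository's code.

# Conceptual sketch of splitting a CNN into a head (edge device) and a tail
# (server). The split index is an assumption chosen for illustration.
import torch
from torch import nn
from torchvision.models import resnet50

backbone = resnet50(weights=None).eval()
children = list(backbone.children())
split_index = 5                                   # assumed split point (after layer1)
head = nn.Sequential(*children[:split_index])     # deployed on the edge device
tail = nn.Sequential(*children[split_index:-1],   # deployed on the server
                     nn.Flatten(), children[-1])

x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    intermediate = head(x)       # tensor that would be sent over the network
    logits = tail(intermediate)
print(intermediate.shape, logits.shape)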

hnd-ghnd-object-detectors

25 stars · 4 forks

[ICPR 2020] "Neural Compression and Filtering for Edge-assisted Real-time Object Detection in Challenged Networks" and [ACM MobiCom EMDL 2020] "Split Computing for Complex Object Detectors: Challenges...

supervised-compression

28 stars · 2 forks

[WACV 2022] "Supervised Compression for Resource-Constrained Edge Computing Systems"

sc2-benchmark

23 stars · 7 forks

[TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"