whyu

Results: 2 comments of whyu

> i changed it in lines 120-121:
>
> ```
> # Compute loss at each stage
> loss_char = torch.sum(torch.stack([criterion_char(restored[j], target) for j in range(len(restored))]))
> loss_edge =...
> ```
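
The quoted snippet is cut off in this listing; below is a minimal, self-contained sketch of what that per-stage summation could look like. The `criterion_char` / `criterion_edge` modules and the `edge_weight` value are stand-ins, not taken from the original code — only the `torch.sum(torch.stack(...))` pattern comes from the quote.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the real loss modules referenced in the quote;
# any per-sample loss that returns a scalar tensor works the same way here.
criterion_char = nn.L1Loss()
criterion_edge = nn.L1Loss()

def multi_stage_loss(restored, target, edge_weight=0.05):
    """Sum a per-stage loss over every stage output in `restored`.

    `restored` is a list of tensors (one per stage), `target` a single tensor.
    torch.stack + torch.sum keeps the result on the autograd graph, unlike np.sum.
    `edge_weight` is an assumed weighting, not taken from the original comment.
    """
    loss_char = torch.sum(torch.stack(
        [criterion_char(restored[j], target) for j in range(len(restored))]))
    loss_edge = torch.sum(torch.stack(
        [criterion_edge(restored[j], target) for j in range(len(restored))]))
    return loss_char + edge_weight * loss_edge

# Example: three stage outputs against one target
target = torch.rand(2, 3, 64, 64)
restored = [torch.rand(2, 3, 64, 64, requires_grad=True) for _ in range(3)]
loss = multi_stage_loss(restored, target)
loss.backward()
```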

In the benchmark's main.py, I don't know why, but after pruning my convolutional model the FLOPs barely change. Also, even though the model only has about 1.5M parameters, its reported FLOPs are even larger than VGG's.
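
For context on why that can happen: FLOPs counters typically derive the count from each layer's shape and the input resolution, so mask-based pruning (weights zeroed but layers kept at their original shape) leaves the reported FLOPs unchanged, and a small parameter count does not by itself mean a small FLOPs number. A rough sketch illustrating both effects, assuming the `thop` package as the counter (the three-layer model and the resolutions are made up for illustration):

```python
import torch
import torch.nn as nn
from thop import profile  # assumed FLOPs counter; ptflops or fvcore behave similarly

# Hypothetical small conv model: few parameters, but FLOPs grow with the
# input's spatial size because every conv is applied at every pixel.
model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True),
    nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
    nn.Conv2d(64, 3, 3, padding=1),
)

# FLOPs depend on the resolution of the dummy input; the parameter count does not.
for size in (224, 512):
    dummy = torch.randn(1, 3, size, size)
    macs, params = profile(model, inputs=(dummy,), verbose=False)
    print(f"{size}x{size}: params={params / 1e6:.2f}M  MACs={macs / 1e9:.2f}G")
```

If the pruning is mask-based, the pruned channels usually have to be physically removed (by rebuilding or exporting a slimmed model) before the counted FLOPs drop. And since VGG's parameters sit mostly in its fully-connected layers, a small fully-convolutional model evaluated at a large input resolution can still report more FLOPs than VGG.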