
Is it really faster than ResNet? I am very confused

StonepageVan opened this issue 3 years ago · 2 comments

Hello, my friend, thanks for your great work! I tested your code by replacing the ResNet in my model with your ParNet, but the actual runtime is much slower than the paper claims. My block sizes are [64, 128, 256, 512, 2048], and `forward()` takes more than 5 s on average, while ResNet takes 0.02 s on my device. I timed every line in `forward()` and found that the encoder is the main cost. Adding `time.perf_counter()` calls inside the encoder shows that `self.stream2_fusion` and `self.stream3_fusion` are the biggest time consumers. Do you know why?
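A minimal sketch of the kind of per-call timing described above (the `run_forward` workload here is a stand-in, not ParNet's actual code). One caveat worth noting: on a GPU, kernel launches are asynchronous, so `time.perf_counter()` readings taken without calling `torch.cuda.synchronize()` first can attribute time to the wrong line:

```python
import time

def average_time(fn, repeats=10):
    """Time a callable with one warm-up pass; returns mean seconds per call.

    Caveat: for CUDA models, call torch.cuda.synchronize() before each
    clock read, otherwise async kernel launches skew the measurements.
    """
    fn()  # warm-up (lazy init, caches, cudnn autotuning, ...)
    t0 = time.perf_counter()
    for _ in range(repeats):
        fn()
    return (time.perf_counter() - t0) / repeats

# Stand-in workload instead of model.forward(x)
def run_forward():
    return sum(i * i for i in range(10_000))

elapsed = average_time(run_forward)
```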

StonepageVan · Dec 19 '21

Hello, I tried with a custom dataset and the accuracy was not good, but I didn't see any performance issues while training. I will check and share here.

Pritam-N · Dec 20 '21

This is due to mistakes in the implementation. The paper mentions that the authors borrowed much of the design from RepVGG. At train time I think it is even heavier than vanilla ResNet, but after layer fusion (folding the 1x1 conv and all the BNs into one 3x3 conv) this network can achieve better performance. The Fusion and Downsample blocks will also be fused RepVGG-style; check their repo for similar (Inception-like) blocks and how they are fused.
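A minimal NumPy sketch of the RepVGG-style folding described above, with toy shapes and made-up BN statistics (not ParNet's actual code): fold each branch's BatchNorm into its conv weights, pad the 1x1 kernel to 3x3, then sum the branches into a single 3x3 conv that gives the same output at inference:

```python
import numpy as np

def fuse_conv_bn(W, gamma, beta, mean, var, eps=1e-5):
    """Fold a BatchNorm that follows a bias-free conv into conv weight + bias."""
    scale = gamma / np.sqrt(var + eps)          # per-output-channel factor
    return W * scale.reshape(-1, 1, 1, 1), beta - mean * scale

def pad_1x1_to_3x3(W1):
    """Embed a 1x1 kernel at the centre of an all-zero 3x3 kernel."""
    W3 = np.zeros((W1.shape[0], W1.shape[1], 3, 3))
    W3[:, :, 1, 1] = W1[:, :, 0, 0]
    return W3

def conv2d(x, W, b, pad=1):
    """Naive same-size convolution (cross-correlation); x: (C, H, W)."""
    oc, _, k, _ = W.shape
    _, H, Wd = x.shape
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad)))
    out = np.empty((oc, H, Wd))
    for o in range(oc):
        for i in range(H):
            for j in range(Wd):
                out[o, i, j] = np.sum(xp[:, i:i+k, j:j+k] * W[o]) + b[o]
    return out

rng = np.random.default_rng(0)
cin, cout = 2, 2
x = rng.standard_normal((cin, 4, 4))

# Two parallel branches, each a conv followed by BN (toy BN statistics)
W3 = rng.standard_normal((cout, cin, 3, 3))
W1 = rng.standard_normal((cout, cin, 1, 1))
bn3 = (rng.standard_normal(cout), rng.standard_normal(cout),
       rng.standard_normal(cout), rng.random(cout) + 0.1)  # gamma, beta, mean, var
bn1 = (rng.standard_normal(cout), rng.standard_normal(cout),
       rng.standard_normal(cout), rng.random(cout) + 0.1)

W3f, b3f = fuse_conv_bn(W3, *bn3)
W1f, b1f = fuse_conv_bn(pad_1x1_to_3x3(W1), *bn1)

y_branches = conv2d(x, W3f, b3f) + conv2d(x, W1f, b1f)  # train-time structure
y_fused = conv2d(x, W3f + W1f, b3f + b1f)               # one 3x3 conv at inference
```

Because convolution is linear in the kernel, the summed 3x3 conv reproduces the two-branch output exactly, which is why the fused network runs so much faster.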

cszer · Dec 22 '21