
23 VanillaNet issues

Hi, thanks for the great work. I have tried to apply vanilla-9 to object detection, however, when transferring the model to TensorRT, it seems much slower than ResNet-34. Is there...

Hello, when I load the model for testing with net.load_state_dict(torch.load(model_weight_path, map_location='cpu')), I get RuntimeError: PytorchStreamReader failed reading zip archive: failed finding central directory. My path is correct, and all twenty-odd weight files produce this same error. What could be the cause? I am a third-year master's student and ran into this while applying your model; I would greatly appreciate it if you could take time out of your busy schedule to answer. Many thanks!
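This particular error usually means the .pth file is not a valid zip archive at all, most often because the download was truncated or the file is actually a Git LFS pointer. A minimal stdlib check (illustrative only; the function name and the path variable are placeholders, not part of the repo):

```python
import os
import zipfile

def check_checkpoint(path):
    """Report whether a .pth file looks like a valid PyTorch zip archive."""
    size = os.path.getsize(path)
    # files written by modern torch.save are zip archives with a
    # central directory, which is exactly what the error says is missing
    ok = zipfile.is_zipfile(path)
    print(f"{path}: {size} bytes, valid zip archive: {ok}")
    return ok

# A truncated download or a Git LFS pointer (a few hundred bytes of text)
# will report False here; re-downloading the weights usually fixes it.
```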

Regarding the line x = torch.nn.functional.leaky_relu(x, self.act_learn) (commented "# We use leakyrelu to implement the deep training technique"): when self.act_learn = 1, this is equivalent to applying no activation at all. Is its only purpose to make the network deeper during training?
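For context, leaky_relu with negative_slope == 0 is ordinary ReLU, while negative_slope == 1 is the identity; the deep-training technique anneals this slope over training so the surrounding layers can later be merged. A minimal numpy sketch of that behavior (not the authors' code):

```python
import numpy as np

def leaky_relu(x, negative_slope):
    # same semantics as torch.nn.functional.leaky_relu
    return np.where(x >= 0, x, negative_slope * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])

# negative_slope = 0 -> plain ReLU, i.e. a real nonlinearity
relu_out = leaky_relu(x, 0.0)

# negative_slope = 1 -> identity: the "activation" does nothing, so the
# two linear layers around it collapse into one at inference time
identity_out = leaky_relu(x, 1.0)
```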

Hello, I want to know how to draw the Accuracy vs. depth diagram in your paper. Looking forward to your reply!

Hi, I evaluated the training speed of VanillaNet against ResNet. VanillaNet has slightly fewer params and FLOPs than ResNet; however, VanillaNet is slower than ResNet and...

Hi, that's mind-blowing work, but I believe there is a slight miscommunication between the paper and the codebase regarding the number of Conv layers and their influence on the inference...

norm_cfg = dict(type='BN', requires_grad=True)
model = dict(
    type='EncoderDecoder',
    backbone=dict(
        _delete_=True,
        type='Vanillanet',
        act_num=3,
        dims=[512, 512, 1024, 2048, 2048, 2048, 2048, 2048, 2048, 4096, 4096],
        strides=[1, 2, 2, 1, 1, 1, 1,...

Is there any comparison with, or comment on, MobileOne? https://github.com/apple/ml-mobileone

Hi, thanks for the great work. Can I understand the class 'activation(nn.ReLU)' as a combination of ReLU -> depthwise Conv -> BN? I don't seem to see concurrently stacked activation functions in the...
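If that reading is right, the block could be sketched roughly as follows in numpy (a hand-rolled illustration of the ReLU -> depthwise conv -> BN pattern, not the repo's activation class; shapes, 'same' padding, and the absence of affine BN parameters are all assumptions):

```python
import numpy as np

def act_block(x, w, eps=1e-5):
    """ReLU -> depthwise conv ('same' padding) -> per-channel norm.

    x: (C, H, W) feature map; w: (C, k, k), one kernel per channel.
    """
    c, h, width = x.shape
    k = w.shape[1]
    p = k // 2
    x = np.maximum(x, 0)                      # ReLU
    xp = np.pad(x, ((0, 0), (p, p), (p, p)))  # 'same' padding
    out = np.zeros_like(x)
    for ch in range(c):                       # depthwise: each channel is
        for i in range(h):                    # convolved only with its own
            for j in range(width):            # kernel, no cross-channel mix
                out[ch, i, j] = np.sum(xp[ch, i:i + k, j:j + k] * w[ch])
    # single-sample, per-channel batch-norm-style normalization
    mean = out.mean(axis=(1, 2), keepdims=True)
    var = out.var(axis=(1, 2), keepdims=True)
    return (out - mean) / np.sqrt(var + eps)
```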

Thanks for publishing this quite interesting work. I believe it is customary to use either bias or BatchNorm, but not both at once. The nn.Conv2d layers do not have bias=False added...
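The redundancy is easy to see: batch norm subtracts the per-channel mean, so any constant bias added by the preceding conv cancels out exactly. A small numpy check (illustrative only; a normalization without affine parameters is assumed):

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # per-channel normalization over batch and spatial axes, no affine
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 3, 5, 5))     # fake conv output, NCHW
bias = rng.standard_normal((1, 3, 1, 1))  # a per-channel conv bias

# BN(x + bias) == BN(x): the bias is absorbed by the mean subtraction,
# so bias=False on a conv followed by BN saves parameters for free
print(np.allclose(batch_norm(x + bias), batch_norm(x)))  # prints True
```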