
Implementation of Dynamic ReLU in PyTorch

7 DynamicReLU issues

Hello, may I ask how this should work with conv_type='3d'? The line `theta = torch.mean(theta, axis=-1)` covers the conv_type='2d' case — does anybody know how the 3d case is handled? Thanks a lot.
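For context, a minimal sketch of the pooling this question is about, assuming NCL / NCHW / NCDHW input layouts (the `conv_type` strings follow the repo; the helper name `spatial_mean` is hypothetical). Averaging the last dimension once per spatial axis collapses the input to one value per (batch, channel), which is what feeds the coefficient branch:

```python
import torch

def spatial_mean(x, conv_type):
    # Collapse every spatial dim so each (batch, channel) pair keeps one value.
    # '1d' input: (N, C, L); '2d': (N, C, H, W); '3d': (N, C, D, H, W).
    n_spatial = {'1d': 1, '2d': 2, '3d': 3}[conv_type]
    for _ in range(n_spatial):
        x = torch.mean(x, dim=-1)  # average away the current last dim
    return x

x2d = torch.randn(4, 16, 8, 8)
print(spatial_mean(x2d, '2d').shape)  # torch.Size([4, 16])

x3d = torch.randn(4, 16, 2, 8, 8)
print(spatial_mean(x3d, '3d').shape)  # torch.Size([4, 16])
```

So, under this reading, a '3d' path would simply need one more `torch.mean(theta, dim=-1)` than the '2d' path.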

Excuse me, could you please share the code of DynamicReluC, which is mentioned in the paper?

```python
def get_relu_coefs(self, x):
    print(x.shape)
    # axis?
    theta = torch.mean(x, dim=-1)
    if self.conv_type == '2d':
        # axis?
        theta = torch.mean(theta, dim=-1)
    theta = self.fc1(theta)
    theta = self.relu(theta)
    theta = self.fc2(theta)
    theta...
```

Thanks for your work! I have some questions: 1. The paper reports that the reduction ratio R=8 is the best trade-off, but the code defaults to 4 — why? 2. ...
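For reference, a hedged sketch of where the reduction ratio R enters the two-layer coefficient head (the class name `CoefHead` and the per-channel 2·k·C output width are my assumptions here, loosely following the DY-ReLU-B style; `fc1`/`fc2` names follow the snippet above). A larger R shrinks the hidden layer and therefore the parameter count, which is the trade-off the question refers to:

```python
import torch
import torch.nn as nn

class CoefHead(nn.Module):
    # Hypothetical sketch: pooled (N, C) features -> 2*k coefficients per channel.
    def __init__(self, channels, R=4, k=2):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // R)      # squeeze by ratio R
        self.relu = nn.ReLU()
        self.fc2 = nn.Linear(channels // R, 2 * k * channels)  # k slopes + k intercepts per channel

    def forward(self, pooled):  # pooled: (N, C)
        return self.fc2(self.relu(self.fc1(pooled)))

# Larger R -> smaller hidden width -> fewer parameters.
for R in (4, 8):
    head = CoefHead(channels=64, R=R)
    print(R, sum(p.numel() for p in head.parameters()))
```

So R=8 roughly halves the head's parameters relative to R=4; whether the default of 4 was chosen for accuracy on small models or is simply a leftover is exactly what the issue asks.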

Hi, have you reproduced the results reported in the paper, e.g. for MobileNet or ResNet?