resuneta
resunet-a block
```python
def hybrid_forward(self, F, _input_layer):
    x = self.BN1(_input_layer)  # <-- the first BN ...
    x = F.relu(x)               # <-- ... and ReLU in question
    x = self.conv1(x)
    x = self.BN2(x)
    x = F.relu(x)
    x = self.conv2(x)
    return x
```
According to the network, it seems that this part (the first BN and ReLU layer) is calculated repeatedly? I mean, it should only need to be calculated once at the head of every resunet-a block.
Hi @XavierMFC, thank you for your interest in our work. I am not sure I understand what you mean. Unless I am missing something, the calculation is necessary: it is not repeated in the subsequent lines (x is overwritten with new values), nor is it repeated even if you stack multiple blocks together.
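For illustration, here is a minimal sketch (not the exact class from this repository) of a pre-activation residual unit in MXNet Gluon. The layer names `BN1`/`BN2`/`conv1`/`conv2` mirror the snippet above, while the channel/dilation arguments, the residual addition, and the class name `PreActResUnit` are assumptions made for the example. It shows that each unit's leading BN + ReLU runs on that unit's own input, so stacking units does not recompute anything.

```python
# Minimal sketch of a pre-activation residual unit in MXNet Gluon.
# Assumptions: input and output channel counts match so the residual
# addition is valid; kernel size 3 with padding == dilation keeps the
# spatial size unchanged.
from mxnet.gluon import nn


class PreActResUnit(nn.HybridBlock):
    def __init__(self, channels, dilation=1, **kwargs):
        super().__init__(**kwargs)
        with self.name_scope():
            self.BN1 = nn.BatchNorm()
            self.conv1 = nn.Conv2D(channels, kernel_size=3, padding=dilation,
                                   dilation=dilation, use_bias=False)
            self.BN2 = nn.BatchNorm()
            self.conv2 = nn.Conv2D(channels, kernel_size=3, padding=dilation,
                                   dilation=dilation, use_bias=False)

    def hybrid_forward(self, F, x):
        out = F.relu(self.BN1(x))    # first BN + ReLU: applied to THIS unit's input
        out = self.conv1(out)
        out = F.relu(self.BN2(out))  # second BN + ReLU: applied to the new features
        out = self.conv2(out)
        return x + out               # residual connection

# Stacking two units: the second unit's BN1/ReLU sees the output of the
# first unit, not the original input, so nothing is computed twice.
# unit1, unit2 = PreActResUnit(32), PreActResUnit(32)
# x1 = unit1(x0)
# x2 = unit2(x1)
```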