brats17
Activation Layer Before Convolution layer in ResBlock
In file MSNet.py
while reviewing how the model graph is defined, I noticed that in the function that creates a residual block, the activation layer is applied before the convolution. Is that a mistake, or am I missing something, considering we don't usually apply activation before convolution?
I've been scratching my head on this for some time as well. A "pre-activation ResNet" might be the intention, as explained here:
https://towardsdatascience.com/resnet-with-identity-mapping-over-1000-layers-reached-image-classification-bb50a42af03e
If that's not the case, I'd like to ask the same question.
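To make the distinction concrete, here is a minimal NumPy sketch (not the repository's actual MSNet code, and using a 1-D convolution as a stand-in for the real 3-D conv layers) contrasting the pre-activation ordering with the classic post-activation ordering:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def conv1d_same(x, kernel):
    # Simple 'same'-padded 1-D convolution, standing in for a conv layer.
    return np.convolve(x, kernel, mode="same")

def preact_res_block(x, k1, k2):
    # Pre-activation ordering: the activation comes BEFORE each
    # convolution, and the identity shortcut bypasses both, so the
    # residual path never blocks the gradient of the shortcut.
    out = conv1d_same(relu(x), k1)
    out = conv1d_same(relu(out), k2)
    return x + out  # identity shortcut, no activation afterwards

def postact_res_block(x, k1, k2):
    # Classic (original ResNet) ordering for comparison:
    # conv -> relu -> conv, then relu applied after the addition.
    out = conv1d_same(x, k1)
    out = conv1d_same(relu(out), k2)
    return relu(x + out)
```

One observable difference: the pre-activation block's output can be negative (there is no final ReLU after the addition), while the post-activation block's output is always non-negative. So seeing activation-before-conv in a residual block is a deliberate design choice, not necessarily a bug.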