vnet.pytorch
ContBatchNorm3d
Hi, I can't run your ContBatchNorm3d code. It gives me the error "AttributeError: 'super' object has no attribute '_check_input_dim'". I'd like to know why the code is written this way. What if I use the standard nn.BatchNorm3d directly? Thank you very much.
I also encountered this problem today. Have you solved it yet?
You can substitute the following in:
import torch.nn as nn
import torch.nn.functional as F

class ContBatchNorm3d(nn.modules.batchnorm._BatchNorm):
    def _check_input_dim(self, input):
        # A 3D batch norm expects 5D input of shape (N, C, D, H, W).
        if input.dim() != 5:
            raise ValueError('expected 5D input (got {}D input)'
                             .format(input.dim()))
        # _BatchNorm has no _check_input_dim to delegate to, so the
        # original super() call is dropped:
        # super(ContBatchNorm3d, self)._check_input_dim(input)

    def forward(self, input):
        self._check_input_dim(input)
        # Always pass training=True: normalize with the current batch statistics.
        return F.batch_norm(
            input, self.running_mean, self.running_var, self.weight, self.bias,
            True, self.momentum, self.eps)
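For reference, a minimal usage sketch of the patched class (the shapes are illustrative; a 3D batch norm expects 5D input of shape (N, C, D, H, W)):

import torch

bn = ContBatchNorm3d(16)                # 16 feature channels
x = torch.randn(2, 16, 32, 32, 32)      # (N, C, D, H, W), illustrative sizes
y = bn(x)
print(y.shape)                          # torch.Size([2, 16, 32, 32, 32])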
@jphdotam Thank you for the fix. I can confirm this works. It's strange that this issue has been open for 2.5 years and the fix still hasn't been committed to the repo.
Dear @rayryeng, I went on to find that this network performed very poorly and could barely fit the data. I ended up rolling my own 3D U-Net, which worked MUCH better with fewer parameters: https://github.com/jphdotam/Unet3D
@jphdotam thanks! I'm training something right now, and I'll take a look at yours as well once I finish those experiments.
Edit: oh wow fresh off the press! Thanks for sharing it with the community!
@jphdotam Why is batch_norm always run in training mode in ContBatchNorm3d?
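For context: the sixth positional argument of F.batch_norm is the training flag, so passing True makes ContBatchNorm3d always normalize with the statistics of the current mini-batch, even at eval time, while still updating running_mean and running_var. A small sketch of the difference (shapes and seed are illustrative):

import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(4, 3, 8, 8, 8)          # (N, C, D, H, W)
bn = nn.BatchNorm3d(3)

# training=True: normalize with the current batch statistics
# (updates running_mean / running_var as a side effect).
y_batch = F.batch_norm(
    x, bn.running_mean, bn.running_var, bn.weight, bn.bias,
    True, bn.momentum, bn.eps)

# training=False: normalize with the stored running statistics.
y_running = F.batch_norm(
    x, bn.running_mean, bn.running_var, bn.weight, bn.bias,
    False, bn.momentum, bn.eps)

print(torch.allclose(y_batch, y_running))  # generally False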
Hi,
I had the same issue. The problem is that the superclass of ContBatchNorm3d is _BatchNorm, and _BatchNorm has no _check_input_dim method. I don't think commenting out the call is the right fix. IMHO the inheritance should be changed to BatchNorm3d and the now-redundant check commented out instead, like this:
import torch.nn as nn
import torch.nn.functional as F

class ContBatchNorm3d(nn.modules.batchnorm.BatchNorm3d):
    def _check_input_dim(self, input):
        # BatchNorm3d already raises on non-5D input, so the explicit
        # check is redundant and left commented out:
        # if input.dim() != 5:
        #     raise ValueError('expected 5D input (got {}D input)'
        #                      .format(input.dim()))
        super()._check_input_dim(input)

    def forward(self, input):
        self._check_input_dim(input)
        # Always pass training=True: normalize with the current batch statistics.
        return F.batch_norm(
            input, self.running_mean, self.running_var, self.weight, self.bias,
            True, self.momentum, self.eps)
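A quick sanity check for this version (a sketch, assuming the default BatchNorm settings): since forward always passes training=True, the output should match a plain nn.BatchNorm3d kept in training mode.

import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(2, 8, 4, 4, 4)          # (N, C, D, H, W), illustrative sizes

cont_bn = ContBatchNorm3d(8)
plain_bn = nn.BatchNorm3d(8).train()

print(torch.allclose(cont_bn(x), plain_bn(x), atol=1e-6))  # expected: True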