
ContBatchNorm3d

Open EricKani opened this issue 6 years ago • 7 comments

Hi, I can't run your ContBatchNorm3d code. It fails with "AttributeError: 'super' object has no attribute '_check_input_dim'". I'd like to know why you wrote the code this way; what happens if I use the standard nn.BatchNorm3d directly? Thank you very much.

EricKani avatar Jun 06 '18 08:06 EricKani

Hi, I can't run your ContBatchNorm3d code. It fails with "AttributeError: 'super' object has no attribute '_check_input_dim'". I'd like to know why you wrote the code this way; what happens if I use the standard nn.BatchNorm3d directly? Thank you very much.

I also ran into this problem today. Have you solved it yet?

qianqianCDQ avatar Apr 03 '20 12:04 qianqianCDQ

You can substitute the following in:

import torch.nn as nn
import torch.nn.functional as F


class ContBatchNorm3d(nn.modules.batchnorm._BatchNorm):
    def _check_input_dim(self, input):
        # _BatchNorm has no usable _check_input_dim here, so do the 5D check
        # directly instead of calling super().
        if input.dim() != 5:
            raise ValueError('expected 5D input (got {}D input)'
                             .format(input.dim()))

    def forward(self, input):
        self._check_input_dim(input)
        # training flag is hard-coded to True: always normalize with batch statistics
        return F.batch_norm(
            input, self.running_mean, self.running_var, self.weight, self.bias,
            True, self.momentum, self.eps)

jphdotam avatar May 31 '20 13:05 jphdotam
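Editor's note: for reference, a minimal sanity check of the drop-in above (assuming PyTorch is installed; the channel count and tensor shape are arbitrary illustrations, not values from the repo):

import torch

layer = ContBatchNorm3d(8)          # 8 feature channels
x = torch.randn(2, 8, 4, 4, 4)      # (N, C, D, H, W) -- 5D input
y = layer(x)
print(y.shape)                      # torch.Size([2, 8, 4, 4, 4])

layer.eval()
y2 = layer(x)                       # still uses batch statistics, since training=True is hard-coded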

@jphdotam Thank you for the fix. I can confirm this works. It's odd that this issue has been open for 2.5 years and the fix still hasn't been committed to the repo.

rayryeng avatar Jan 28 '21 07:01 rayryeng

Dear @rayryeng, I went on to find that this network performed poorly and could barely fit the data. I ended up rolling my own 3D Unet, which worked MUCH better with fewer parameters: https://github.com/jphdotam/Unet3D

jphdotam avatar Jan 28 '21 14:01 jphdotam

@jphdotam thanks! I'm training something right now, but I'll take a look at yours once I finish with my current experiments.

Edit: oh wow fresh off the press! Thanks for sharing it with the community!

rayryeng avatar Jan 28 '21 14:01 rayryeng

@jphdotam Why does ContBatchNorm3d always call batch_norm in training mode?

chaoyan1037 avatar Jul 09 '21 00:07 chaoyan1037
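Editor's note on the question above: the sixth positional argument to F.batch_norm is the training flag, and hard-coding it to True makes the layer always normalize with batch statistics (and keep updating the running stats), even after calling .eval(). A standard BatchNorm layer passes self.training instead. A minimal sketch of that conventional variant (my illustration, including the hypothetical class name SwitchableBatchNorm3d, not code from the repo):

import torch.nn as nn
import torch.nn.functional as F


class SwitchableBatchNorm3d(nn.modules.batchnorm._BatchNorm):
    # Hypothetical variant that respects train()/eval(), unlike ContBatchNorm3d.
    def _check_input_dim(self, input):
        if input.dim() != 5:
            raise ValueError('expected 5D input (got {}D input)'
                             .format(input.dim()))

    def forward(self, input):
        self._check_input_dim(input)
        # self.training toggles between batch statistics (train) and running statistics (eval)
        return F.batch_norm(
            input, self.running_mean, self.running_var, self.weight, self.bias,
            self.training, self.momentum, self.eps)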

Hi,

I had the same issue. The problem is that the superclass of ContBatchNorm3d is _BatchNorm, which does not have a _check_input_dim method. I don't think commenting out the super call is the right fix. IMHO the inheritance should be changed to BatchNorm3d, and the now-duplicated dimension check commented out instead, like this:

import torch.nn as nn
import torch.nn.functional as F


class ContBatchNorm3d(nn.modules.batchnorm.BatchNorm3d):
    def _check_input_dim(self, input):
        # BatchNorm3d already performs the 5D check, so just delegate
        #if input.dim() != 5:
        #    raise ValueError('expected 5D input (got {}D input)'
        #                     .format(input.dim()))
        super()._check_input_dim(input)

    def forward(self, input):
        self._check_input_dim(input)
        # training flag is hard-coded to True: always normalize with batch statistics
        return F.batch_norm(
            input, self.running_mean, self.running_var, self.weight, self.bias,
            True, self.momentum, self.eps)

bsolano avatar Oct 11 '23 18:10 bsolano
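Editor's note: as a quick check of the redundancy claim above (assuming a reasonably recent PyTorch), nn.BatchNorm3d._check_input_dim already rejects anything that is not 5D, so delegating to super() keeps the same error message:

import torch

bn = ContBatchNorm3d(4)
bn(torch.randn(2, 4, 3, 3, 3))   # OK: 5D input
bn(torch.randn(2, 4, 3, 3))      # raises ValueError: expected 5D input (got 4D input)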