
Auto Flattening of features

Open · Geeks-Sid opened this issue 5 years ago · 1 comment

Hi,

I found an annoying feature/bug in the forward pass:

def forward(self, x):
    x = self.features(x)
    if self.pool is not None:
        x = self.pool(x)
    if self.dropout is not None:
        x = self.dropout(x)
    if self.flatten_features_output:
        x = x.view(x.size(0), -1)
    x = self.classifier(x)
    return x

Why do you have self.flatten_features_output if it is always going to be true? There is no way to set that variable from the outside. And once the pooling is set to None, the tensor is still automatically flattened instead of leaving that choice to the user. Any plans to change this, or to expose the flag?
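For context, here is a minimal sketch of what exposing the flag could look like. The class name `FinetuneModel` and its constructor arguments are hypothetical, not the library's actual API; the point is only that `flatten_features_output` becomes a constructor parameter instead of being effectively hard-coded to True:

```python
import torch
import torch.nn as nn

# Hypothetical sketch (not pytorch-cnn-finetune's real API): a wrapper
# whose flatten behavior is user-controllable via the constructor.
class FinetuneModel(nn.Module):
    def __init__(self, features, classifier, pool=None, dropout=None,
                 flatten_features_output=True):
        super().__init__()
        self.features = features
        self.pool = pool
        self.dropout = dropout
        self.classifier = classifier
        # Exposed to the caller instead of always being True.
        self.flatten_features_output = flatten_features_output

    def forward(self, x):
        x = self.features(x)
        if self.pool is not None:
            x = self.pool(x)
        if self.dropout is not None:
            x = self.dropout(x)
        if self.flatten_features_output:
            x = x.view(x.size(0), -1)
        x = self.classifier(x)
        return x

# With flattening disabled, the classifier receives the raw 4-D feature
# map, so e.g. a fully convolutional head works without reshaping.
features = nn.Conv2d(3, 8, kernel_size=3, padding=1)
classifier = nn.Conv2d(8, 4, kernel_size=1)  # 1x1 conv head, no flatten
model = FinetuneModel(features, classifier, flatten_features_output=False)
out = model(torch.randn(2, 3, 16, 16))
print(out.shape)  # torch.Size([2, 4, 16, 16])
```

With the default `flatten_features_output=True` the behavior stays backward compatible, so existing code with a Linear classifier is unaffected.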

Geeks-Sid avatar Jan 16 '20 17:01 Geeks-Sid

I just fixed it for myself and created a pull request.

Geeks-Sid avatar Jan 16 '20 17:01 Geeks-Sid