
Fails with nn.ModuleList

Open alex-golts opened this issue 4 years ago • 1 comments

I have a model defined with a sequence of layers stored in an nn.ModuleList. For example, I find this useful for defining a fully connected neural network (MLP) whose number of layers and neurons per layer are passed as arguments to the constructor, like this:

class Net(nn.Module):
    
    def __init__(self, inputSize, numLayers, nodesPerLayer, activationType):
        super(Net, self).__init__()
        if activationType.lower() == 'sigmoid':
            self.activation = nn.Sigmoid()
        elif activationType.lower() == 'tanh':
            self.activation = nn.Tanh()
        else:
            self.activation = nn.ReLU()
        self.hidden = nn.ModuleList()
        self.hidden.append(nn.Linear(inputSize, nodesPerLayer))
        for i in range(numLayers-1):
            self.hidden.append(nn.Linear(nodesPerLayer, nodesPerLayer))
        self.finalFC = nn.Linear(nodesPerLayer, 1)


    def forward(self, x):
        for layer in self.hidden:
            x = self.activation(layer(x))
        x = self.finalFC(x)
        return x

In this case, calling self.model.modules() returns the nn.Linear layers but also the nn.ModuleList itself as a separate module. The get_output_sizes function then tries to run an input through this "module" and fails.

Additionally, I think the get_parameter_sizes function calculates the parameter size incorrectly, because each layer inside the nn.ModuleList is counted twice: once as a child of the container and once on its own.

I think this issue can be solved by checking, in both get_parameter_sizes and get_output_sizes, whether a module is an nn.ModuleList (using isinstance(m, nn.ModuleList)) and, if so, skipping it.
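Roughly, the filtering I have in mind would look like the sketch below. This is only an illustration of the idea, not the actual pytorch_modelsize code; the name `leaf_modules` and the extra container types in the isinstance check are my own additions.

```python
import torch.nn as nn

def leaf_modules(model):
    """Yield only the layers that should be probed for sizes,
    skipping container modules and the root model itself."""
    for m in model.modules():
        # containers hold no parameters of their own; their children
        # are yielded separately by model.modules()
        if isinstance(m, (nn.ModuleList, nn.Sequential, nn.ModuleDict)):
            continue
        if m is model:  # skip the top-level wrapper module
            continue
        yield m

class Tiny(nn.Module):
    def __init__(self):
        super().__init__()
        self.hidden = nn.ModuleList([nn.Linear(4, 4), nn.Linear(4, 4)])
        self.out = nn.Linear(4, 1)

layers = list(leaf_modules(Tiny()))
# only the three nn.Linear layers remain; the nn.ModuleList is filtered out
```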

What do you think?

alex-golts avatar Nov 10 '20 07:11 alex-golts

Why not just iterate over model.parameters()?

khasmamad99 avatar Mar 30 '23 16:03 khasmamad99
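For the parameter-size half of the issue, this suggestion could look like the sketch below. The helper name `param_bytes` is illustrative, not part of pytorch_modelsize; the point is that model.parameters() visits each parameter tensor exactly once, no matter how the layers are stored (nn.ModuleList, nn.Sequential, plain attributes).

```python
import torch.nn as nn

def param_bytes(model):
    """Total parameter memory in bytes, counting each tensor once."""
    return sum(p.numel() * p.element_size() for p in model.parameters())

mlp = nn.Sequential(nn.Linear(10, 5), nn.ReLU(), nn.Linear(5, 1))
# Linear(10, 5): 50 weights + 5 biases; Linear(5, 1): 5 weights + 1 bias
# -> 61 float32 parameters = 244 bytes
size = param_bytes(mlp)
```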