
Bug if number of parameters is zero.

Open levifussell opened this issue 3 years ago • 0 comments

Hi,

If the model has zero parameters, then this call (line 102):

```python
total_params_size = abs(total_params.numpy() * 4. / (1024 ** 2.))
```

will fail, because `total_params` is then a plain Python integer rather than a torch tensor, and integers have no `numpy()` method. This may seem like a weird case (when would a model have no parameters?), but I ran into it after writing 'wrapper' Modules around modules that do have parameters. The wrappers themselves had no parameters.

The fix is simply to check whether the parameter count is non-zero and, if it is zero, skip the `numpy()` call.
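A minimal sketch of such a guard, written as a standalone helper so it runs without torch installed (the helper name `total_params_size_mb` is hypothetical, not part of torchsummary; it mirrors the line-102 computation and accepts either a 0-dim tensor or a plain int):

```python
def total_params_size_mb(total_params):
    """Estimate parameter memory in MB, assuming 4 bytes (float32) per parameter.

    `total_params` may be a 0-dim torch tensor (the usual case) or a plain
    Python int (the zero-parameter case described in this issue).
    """
    # Duck-type the check: tensors expose .item(), plain ints do not,
    # so this works whether or not the model had any parameters.
    count = total_params.item() if hasattr(total_params, "item") else total_params
    return abs(count * 4.0 / (1024 ** 2.0))
```

With this guard, `total_params_size_mb(0)` returns `0.0` instead of raising an `AttributeError`.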

levifussell avatar Mar 16 '22 02:03 levifussell