
Model summary in PyTorch similar to `model.summary()` in Keras

Results: 118 pytorch-summary issues, sorted by most recently updated.

Add support for a net that outputs a list. Say you are returning intermediate feature maps from a U-Net; these feature maps have different sizes and cannot be stacked into one...
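To illustrate the issue above, here is a minimal sketch (a hypothetical two-layer encoder, not code from the report) of a net whose forward pass returns a list of feature maps with different spatial sizes, which cannot be stacked into a single tensor:

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Toy U-Net-style encoder returning multi-scale feature maps."""
    def __init__(self):
        super().__init__()
        self.c1 = nn.Conv2d(3, 8, 3, padding=1)
        self.c2 = nn.Conv2d(8, 16, 3, padding=1)
        self.pool = nn.MaxPool2d(2)

    def forward(self, x):
        f1 = torch.relu(self.c1(x))              # (N, 8, H, W)
        f2 = torch.relu(self.c2(self.pool(f1)))  # (N, 16, H/2, W/2)
        return [f1, f2]  # different shapes: torch.stack would fail here

feats = Encoder()(torch.randn(1, 3, 32, 32))
print([tuple(f.shape) for f in feats])  # [(1, 8, 32, 32), (1, 16, 16, 16)]
```

A summary tool that assumes a single output tensor (or a stackable list) will choke on this return type.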

The parameters of layers with `requires_grad=False` are counted as trainable parameters.
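For reference, the expected split is easy to compute directly from `model.parameters()`; this sketch (hypothetical model) shows the counts a summary tool should report when one layer is frozen:

```python
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 20), nn.Linear(20, 5))
for p in model[0].parameters():
    p.requires_grad = False  # freeze the first layer

# Frozen params should count toward the total but not the trainable total.
total = sum(p.numel() for p in model.parameters())
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)

print(total)      # 10*20 + 20 + 20*5 + 5 = 325
print(trainable)  # 20*5 + 5 = 105
```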

Platform: Win10, version: 1.6.0

```python
# net.py
import time
import torch
import torch.nn as nn
import torchvision.models._utils as _utils
import torchvision.models as models
import torch.nn.functional as F
from torch.autograd import...
```

## Update report 1. Fixes the parameter-count bug when a model has more than one output variable, covering both the sequence and dict cases (mentioned in #162). 2....

Here is a simple example that gives a different parameter count between PyTorch and torchsummary. It seems that torchsummary does not count bare `torch.nn.Parameter` attributes.

```python
import torch
import torchsummary

class...
```
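A plausible explanation (an assumption, not confirmed from the truncated report): hook-based summaries register forward hooks on sub-modules, so a parameter attached directly to a module, rather than living inside a sub-module, never passes through a hook. Counting via `model.parameters()` always includes it. A minimal sketch with a hypothetical module:

```python
import torch
import torch.nn as nn

class Scale(nn.Module):
    def __init__(self):
        super().__init__()
        self.w = nn.Parameter(torch.ones(3, 3))  # bare parameter, no sub-module
        self.lin = nn.Linear(3, 3)

    def forward(self, x):
        return self.lin(x * self.w)

m = Scale()
# PyTorch's own count includes the bare parameter: 9 + (9 + 3) = 21.
total = sum(p.numel() for p in m.parameters())
print(total)  # 21
```

A hook-based summary would only see `self.lin` fire and report 12, missing the 9 weights in `self.w`.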

When I try to run the summary for a non-convolutional autoencoder architecture:

```python
import torch.nn as nn
import torch
from torch.autograd import Variable
import sys
from torchsummary import summary

class...
```

An overflow occurred when I ran the following code, which is why the model size estimation (including the batch-size estimate) fails.

```python
import torch
from torchvision import models
from...
```

Hi, how can I save the model summary output to a variable, or print it to a text file?
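One workaround, assuming `summary()` writes its table to stdout (which the original torchsummary does): redirect stdout into a buffer and save the captured string. The `print` stand-in below replaces the real `summary(model, input_size=...)` call so the sketch runs without a model:

```python
import io
from contextlib import redirect_stdout

buf = io.StringIO()
with redirect_stdout(buf):
    # summary(model, input_size=(3, 224, 224))  # the real torchsummary call
    print("----- model summary table -----")    # stand-in for this sketch

text = buf.getvalue()          # summary as a plain string (a variable)
with open("summary.txt", "w") as f:
    f.write(text)              # ...or saved to a text file
print(text.strip())
```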

If a ReLU layer is implemented as `torch.nn.ReLU(inplace=True)`, it costs no additional memory during the forward pass, but it is still counted toward the memory estimate in the summary.
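The point is easy to verify: an in-place ReLU overwrites its input buffer rather than allocating a new activation tensor, so its output shares storage with its input. A minimal check:

```python
import torch
import torch.nn as nn

relu = nn.ReLU(inplace=True)
x = torch.randn(4)
y = relu(x)
# Same underlying storage: the in-place op allocated no new memory.
print(y.data_ptr() == x.data_ptr())  # True
```

A summary that adds the ReLU's output size to its memory estimate therefore double-counts this buffer.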

Dear torchsummary developer(s), I really appreciate your tool, but today I found a fairly significant bug (or surprising behavior) that cost me a lot of development and computation time....