
Softmax and log-softmax no longer applied in models.

Britefury opened this issue on Jul 21, 2020 · 0 comments

Softmax and log-softmax are no longer applied in the models; they are now applied in the evaluation and training scripts. This was done by switching from nn.NLLLoss (which expects log-probabilities) to nn.CrossEntropyLoss (which expects raw logits).
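A minimal sketch of the equivalence (shapes are illustrative, not taken from the repo): nn.CrossEntropyLoss on raw logits computes the same loss as nn.NLLLoss on log-softmax output.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn(2, 19, 8, 8)          # (batch, classes, H, W) raw decoder output
target = torch.randint(0, 19, (2, 8, 8))   # per-pixel integer class labels

ce = nn.CrossEntropyLoss()(logits, target)                # new: loss applied to logits in the script
nll = nn.NLLLoss()(F.log_softmax(logits, dim=1), target)  # old: model applied log-softmax itself
assert torch.allclose(ce, nll)
```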

Class predictions are unchanged when taking the max/argmax of logits rather than probabilities, since softmax is monotonic and preserves the ordering of class scores, so we can use logits directly when computing evaluation accuracy and IoU.
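For example, a quick check of that invariance (shapes again illustrative):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(2, 19, 8, 8)
pred_from_logits = logits.argmax(dim=1)                   # argmax over raw logits
pred_from_probs = F.softmax(logits, dim=1).argmax(dim=1)  # argmax over probabilities
assert torch.equal(pred_from_logits, pred_from_probs)     # identical class maps
```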

Furthermore, I've changed the decoders so that, rather than using the use_softmax flag to determine whether we are in inference mode, they apply the interpolation whenever the segSize parameter is provided; in your code, segSize is only passed during inference. The decoders now also return a dict, with the 'logits' key giving the predicted logits and, when using deep supervision decoders, the 'deepsup_logits' key giving the logits for deep supervision.
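A minimal sketch of that decoder contract, under assumed names (SketchDecoder, conv_last, conv_deepsup are hypothetical, not the repo's actual classes or attributes):

```python
import torch.nn as nn
import torch.nn.functional as F

class SketchDecoder(nn.Module):
    def __init__(self, in_ch, num_classes, deep_sup=False):
        super().__init__()
        self.conv_last = nn.Conv2d(in_ch, num_classes, 1)
        self.deep_sup = deep_sup
        if deep_sup:
            self.conv_deepsup = nn.Conv2d(in_ch, num_classes, 1)

    def forward(self, feats, segSize=None):
        x = self.conv_last(feats)
        if segSize is not None:
            # segSize is only passed at inference time, so its presence
            # replaces the old use_softmax flag as the mode switch
            x = F.interpolate(x, size=segSize, mode='bilinear', align_corners=False)
        out = {'logits': x}
        if self.deep_sup and segSize is None:
            # deep supervision head is a training-time auxiliary output
            out['deepsup_logits'] = self.conv_deepsup(feats)
        return out
```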

The motivation for this is that some uses of semantic segmentation models require losses other than the softmax/log-softmax used in supervised training. Moving this out of the model classes makes them useful in a wider variety of circumstances. Specifically, I want to test a PSPNet in my semi-supervised work here: https://github.com/Britefury/cutmix-semisup-seg. I use a variety of unsupervised loss functions, hence my preference for models that output logits, which can then be processed in a variety of ways.
