
Support sharing labels for loss layers

Open · kparichay opened this issue · 5 comments

Sharing a given label with the loss layers of the model is not fully supported.

  • when only a single label is given, it is shared with all the loss layers
  • when multiple labels and losses exist, the mapping of labels to losses is missing (see the sketch below)
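
For the second case, a minimal sketch of the ambiguity in nntrainer-style ini (the layer names and loss types below are illustrative, not taken from this issue): with two outputs and two loss layers, the dataset provides two label tensors, but nothing in the description states which label feeds which loss.

[out_a]
type=fully_connected
unit=10

[out_b]
type=fully_connected
unit=5

# two loss layers, two label tensors from the dataset,
# but no way to state the label-to-loss mapping
[loss_a]
type=mse
input_layers=out_a

[loss_b]
type=cross_sigmoid
input_layers=out_b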

kparichay avatar Dec 06 '21 06:12 kparichay

:octocat: cibot: Thank you for posting issue #1762. The person in charge will reply soon.

taos-ci avatar Dec 06 '21 06:12 taos-ci

when multiple labels and losses exist, the mapping of labels to losses is missing

Could you check if label_layers= would do the job? Although it is layer based (not connection based), that support has already been implemented.

Or is that the case where the label is to be consumed more than once?

[model] #describe input and label inside the section
input_layers=Layer1,Layer2,Layer3
label_layers=Layer3,Layer4,Layer5

zhoonit avatar Dec 09 '21 12:12 zhoonit

Or is that the case where the label is to be consumed more than once?

This is the case. Given 2 label tensors, they must be shared among a few given output layers.
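
To make that concrete, a sketch reusing the label_layers= style from the earlier comment (layer names are illustrative): the dataset provides two label tensors, but three layers consume labels, so at least one label must be shared by more than one loss, and a flat list cannot express which losses share it.

[model]
input_layers=input0
# three label-consuming layers but only two label tensors;
# the intended sharing (e.g. first label for loss1 and loss2,
# second label for loss3) cannot be expressed here
label_layers=loss1,loss2,loss3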

kparichay avatar Dec 09 '21 13:12 kparichay

If this is the case, what we can do is specify one label_layer and have the two loss layers refer to that layer. Just throwing an idea out for now.
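
A rough sketch of that idea, assuming the loss layers could simply name the shared label source through their connections (the layer names and the use of input_layers for the label connection are assumptions; as the next comment points out, this wiring is not supported today):

[shared_label]
type=input

[loss_a]
type=mse
# hypothetical: the second input acts as the label
input_layers=out_a,shared_label

[loss_b]
type=cross_sigmoid
# hypothetical: the second input acts as the label
input_layers=out_b,shared_label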

zhoonit avatar Dec 10 '21 02:12 zhoonit

If this is the case, what we can do is specify one label_layer and have the two loss layers refer to that layer. Just throwing an idea out for now.

The interface for the loss layer to refer to the label_layer is missing. Either input_layers must be extended so that each layer can connect to labels as well (which would also require labels to be treated as inputs in the layer context, which they currently are not), or yet another property like label_layers must be created for this connection.
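
For comparison, a sketch of the second option, reusing the hypothetical shared_label layer from the sketch above (the per-layer property name and its semantics are assumptions, nothing like this exists in the current ini format):

[loss_a]
type=mse
input_layers=out_a
# hypothetical per-layer property connecting this loss to its label source
label_layers=shared_label

[loss_b]
type=cross_sigmoid
input_layers=out_b
# hypothetical per-layer property
label_layers=shared_label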

kparichay avatar Dec 10 '21 03:12 kparichay