Support sharing labels for loss layers
Sharing a given label with the loss layers of the model is not fully supported.
- when only a single label is given, it is shared with all the losses
- when multiple labels and losses exist, the mapping of labels to losses is missing (see the sketch below)
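For illustration, here is a minimal, purely hypothetical model description (layer names, shapes, and loss types are invented) with two loss layers while the data pipeline supplies two label tensors; nothing in the description states which label should feed which loss:

```ini
# hypothetical sketch: two-branch model with two loss layers
# layer names and types are illustrative only
[input0]
type=input
input_shape=1:1:128

[branch_a]
type=fully_connected
unit=10
input_layers=input0

[loss_a]
type=mse
input_layers=branch_a   # consumes one of the two labels, but which one is unspecified

[branch_b]
type=fully_connected
unit=2
input_layers=input0

[loss_b]
type=mse
input_layers=branch_b   # same problem: no property maps a specific label to this loss
```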
> when multiple labels and losses exist, the mapping of labels to losses is missing
Could you check whether `label_layers=` would do the job? Support for it has been implemented, although it is layer-based (not connection-based).
Or is this the case where a label is to be consumed more than once?
```ini
[model] # describe input and label inside this section
input_layers=Layer1,Layer2,Layer3
label_layers=Layer3,Layer4,Layer5
```
> Or is this the case where a label is to be consumed more than once?
This is the case. Given two label tensors, they must be shared among a few given output layers.
If this is the case, what we can do is specify one label_layer and have two loss layers refer to that layer. Just throwing out an idea for now.
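A rough sketch of what that idea could look like in the ini description, assuming a per-loss `label_layers` property that does not exist today and using invented layer names:

```ini
# hypothetical sketch: one label entry shared by two loss layers
[model]
type=NeuralNetwork
input_layers=input0
label_layers=label0          # a single label is declared once

[loss_a]
type=mse
input_layers=branch_a
label_layers=label0          # hypothetical per-layer property, not supported today

[loss_b]
type=mse
input_layers=branch_b
label_layers=label0          # the same label is consumed a second time
```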
> If this is the case, what we can do is specify one label_layer and have two loss layers refer to that layer. Just throwing out an idea for now.
The interface for the loss layer to refer to the label_layer is missing. Either `input_layers` must be extended so that each layer can also connect to labels (which would require labels to be treated as inputs in the layer context, which they currently aren't), or yet another property like a per-layer `label_layers` must be created for this connection.
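For comparison, the first alternative (extending `input_layers` so that a loss layer lists the label as one of its inputs) might look like the following, again purely as a sketch with invented names; the per-layer `label_layers` alternative is sketched in the previous comment:

```ini
# hypothetical sketch: label exposed as an ordinary input connection
[loss_a]
type=mse
input_layers=branch_a,label0   # label0 treated as an input, which the layer context does not support today

[loss_b]
type=mse
input_layers=branch_b,label0   # the shared label appears as an input of the second loss as well
```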