Refactor losses into layers
Multiple things:
- `Loss` becomes a subclass of `LayerBase`. `Loss` instances will be treated as normal layers, and the naming logic for moving them out of a rec loop etc. will apply.
This should greatly clean up the complexity we currently have with `LayerBase.get_losses` and `LossHolder`.
This should also fix some bugs along the way, e.g. #556.
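To make the idea concrete, here is a minimal sketch of what the subclass relationship could look like. These are simplified stand-ins, not the actual RETURNN classes; all names and signatures below are illustrative assumptions.

```python
# Minimal sketch of the proposed direction. These are simplified
# stand-ins, NOT the real RETURNN LayerBase/Loss APIs; names and
# signatures here are illustrative assumptions only.


class LayerBase:
    """Base class for all layers. Naming, rec-loop move-out logic, etc.
    would apply uniformly to anything deriving from this."""

    def __init__(self, name: str, sources: list):
        self.name = name
        self.sources = sources

    def get_output(self):
        raise NotImplementedError


class Loss(LayerBase):
    """Loss as a layer: constructed, named, and moved out of rec loops
    by the same machinery as any other layer, so special-case paths
    like LayerBase.get_losses / LossHolder can go away."""

    def __init__(self, name: str, sources: list, scale: float = 1.0):
        super().__init__(name, sources)
        self.scale = scale  # loss scaling factor, applied to the output

    def get_output(self):
        # Would compute and return the (scaled) loss value as a
        # normal layer output.
        raise NotImplementedError
```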
Note that with RETURNN-common, this is not so much an issue anymore, as RETURNN-common only uses `AsIsLoss`, and all losses are already defined via normal layers.