nntrainer
Handling addReferences & input realizers and recurrent end_layers
The current `neuralnet::addWithReferenceLayers` has become so complicated that it is nearly impossible to understand. This thread discusses how to untangle it a bit and make it more intuitive, plus some missing functionality that is needed:
- Start-layer and input-layer resolution
  - Start layers, input layers, and end layers all represent connections.
  - Start layers <-> input layers map 1:1 by connection (no orphan substitution, etc.); see the first sketch after this list.
- Recurrent end layers can expose part of their outputs as a sequence and the rest as-is.
  - Make a sequence out of the intermediate outputs and alias it to the end layer name (#1793).
  - An `A(2:)`-style connection syntax to hook up the remaining outputs; see the second sketch after this list.
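
A minimal sketch of the 1:1 start-to-input pairing described above, assuming a connection is modeled as a layer name plus an output index. The `Connection` struct and `mapStartToInput` helper are illustrative only, not nntrainer's actual types:

```cpp
// Hypothetical sketch (not nntrainer's real API) of the proposed rule:
// start layers and input layers are paired 1:1 purely by connection,
// with no orphan substitution or other implicit fix-ups.
#include <cstddef>
#include <stdexcept>
#include <string>
#include <utility>
#include <vector>

// A "connection" here is just a layer name plus an output index,
// e.g. {"A", 2} standing for A(2).
struct Connection {
  std::string name;
  std::size_t index = 0;
};

// Pair each start connection with the input connection at the same position.
// Sizes must match exactly; nothing is inferred or substituted.
std::vector<std::pair<Connection, Connection>>
mapStartToInput(const std::vector<Connection> &start_conns,
                const std::vector<Connection> &input_conns) {
  if (start_conns.size() != input_conns.size()) {
    throw std::invalid_argument("start and input connections must map 1:1");
  }
  std::vector<std::pair<Connection, Connection>> mapping;
  mapping.reserve(start_conns.size());
  for (std::size_t i = 0; i < start_conns.size(); ++i) {
    mapping.emplace_back(start_conns[i], input_conns[i]);
  }
  return mapping;
}
```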
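Likewise, a hedged sketch of how an `A(2:)`-style spec could be read as a slice over a layer's outputs, so that for a recurrent end layer some outputs stay plain connections while the rest are gathered into a sequence. The grammar, `ConnectionSlice` type, and `parseSlice` helper are assumptions for illustration, not an existing nntrainer API:

```cpp
// Hypothetical parser for "A", "A(2)", and "A(2:)" style connection specs.
// A trailing ':' means "this output index and everything after it".
#include <cstddef>
#include <optional>
#include <stdexcept>
#include <string>

struct ConnectionSlice {
  std::string layer;              // e.g. "A"
  std::size_t begin = 0;          // first output index covered
  std::optional<std::size_t> end; // one past the last index; nullopt = to the last output
};

ConnectionSlice parseSlice(const std::string &spec) {
  auto open = spec.find('(');
  if (open == std::string::npos) {
    return {spec, 0, 1};          // "A"     -> output 0 only
  }
  if (spec.back() != ')') {
    throw std::invalid_argument("unterminated connection spec: " + spec);
  }
  std::string layer = spec.substr(0, open);
  std::string inner = spec.substr(open + 1, spec.size() - open - 2); // "2" or "2:"
  auto colon = inner.find(':');
  std::size_t begin = std::stoul(inner.substr(0, colon));
  if (colon == std::string::npos) {
    return {layer, begin, begin + 1}; // "A(2)"  -> output 2 only
  }
  return {layer, begin, std::nullopt}; // "A(2:)" -> outputs 2..last
}
```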
:octocat: cibot: Thank you for posting issue #1796. The person in charge will reply soon.