dhzyingz

Results: 1 issue by dhzyingz

In the backpropagation part, the first line of code reads: dtanh = softmaxOutput.diff(forward[len(forward)-1][2], y). So the output is activated first and then passed to the softmax? I guess for the last layer there is no...
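
As a point of reference for the question, here is a minimal, hypothetical sketch (not the repo's actual `softmaxOutput.diff` implementation) of how backpropagation usually starts at a softmax output layer trained with cross-entropy: the gradient with respect to the pre-softmax scores is simply `probs - one_hot(y)`, so no separate activation derivative (e.g. tanh) is applied at that final layer. The function and variable names below are assumptions for illustration only.

```python
import numpy as np

def softmax(z):
    # Shift by the max for numerical stability before exponentiating
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def softmax_xent_backward(z, y_index):
    """Gradient of cross-entropy loss w.r.t. the pre-softmax scores z.

    Equivalent to probs - one_hot(y); this is typically the first
    gradient computed in the backward pass.
    """
    grad = softmax(z)
    grad[y_index] -= 1.0
    return grad

# Example usage with hypothetical scores coming out of the last hidden layer
z = np.array([2.0, -1.0, 0.5])
print(softmax_xent_backward(z, y_index=0))
```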