magic_init
Loss or `force_backward: true` required
When the `-d` flag is not used, it is necessary to define a loss or to include `force_backward: true` in the net definition; otherwise all gradients will be zero. It might be nice to check whether the diff is all zero and suggest these remedies.
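For reference, a minimal sketch of a net definition illustrating both remedies. The layer names, shapes, and loss layer below are made up for the example and are not part of magic_init; either the `force_backward` line or the loss layer alone is sufficient.

```protobuf
# Remedy 1: force gradient computation even when no loss layer is present.
force_backward: true

layer {
  name: "data"
  type: "Input"
  top: "data"
  top: "label"
  input_param {
    shape { dim: 1 dim: 3 dim: 32 dim: 32 }  # illustrative input shape
    shape { dim: 1 }                          # illustrative label shape
  }
}
layer {
  name: "fc"
  type: "InnerProduct"
  bottom: "data"
  top: "fc"
  inner_product_param { num_output: 10 }
}

# Remedy 2 (alternative): end the net with a loss layer so Caffe
# propagates gradients without needing force_backward.
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "fc"
  bottom: "label"
  top: "loss"
}
```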