Add support for SelectiveBackprop
SelectiveBackprop description: Acceleration is achieved by prioritizing examples with high loss at each iteration. This means using the output of a training example's forward pass to decide whether to use that example to compute gradients and update parameters, or to skip immediately to the next example. By reducing the number of computationally expensive back-propagation steps performed, nebulgym accelerates training. Further acceleration can be achieved by using stale forward-pass results for selection, thus also skipping the forward passes of low-priority examples.
We should add SelectiveBackprop to the backprop patching technologies. Further information about it can be found in the original paper.
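For reference, below is a minimal sketch of the technique in plain PyTorch. This is not nebulgym's actual API: the `SelectiveBackpropSampler` class, its parameters (`beta`, `history_size`), and the `train_step` helper are illustrative names chosen here. The selection rule (keep an example with probability `CDF(loss) ** beta` over a sliding window of recent losses) follows the original SelectiveBackprop paper.

```python
# Minimal SelectiveBackprop sketch in plain PyTorch.
# All names here are illustrative assumptions, not nebulgym's API.
import bisect
import collections

import torch
import torch.nn.functional as F


class SelectiveBackpropSampler:
    """Keeps a sliding window of recent losses and selects each example
    with probability CDF(loss) ** beta, so high-loss examples are
    back-propagated more often."""

    def __init__(self, beta: float = 1.0, history_size: int = 1024):
        self.beta = beta
        self.history = collections.deque(maxlen=history_size)

    def select(self, losses: torch.Tensor) -> torch.Tensor:
        probs = []
        for loss in losses.tolist():
            self.history.append(loss)
            sorted_hist = sorted(self.history)
            # Empirical CDF: fraction of recent losses below this one.
            cdf = bisect.bisect_left(sorted_hist, loss) / len(sorted_hist)
            probs.append(cdf ** self.beta)
        # Bernoulli draw per example: high-loss examples survive more often.
        return torch.rand(len(probs)) < torch.tensor(probs)


def train_step(model, optimizer, inputs, targets, sampler):
    # Forward pass on the full batch, keeping per-example losses.
    outputs = model(inputs)
    losses = F.cross_entropy(outputs, targets, reduction="none")
    # Use the forward-pass losses to decide which examples get a backward pass.
    mask = sampler.select(losses.detach()).to(losses.device)
    if mask.any():
        optimizer.zero_grad()
        losses[mask].mean().backward()
        optimizer.step()
```

The sketch only skips backward passes; the stale-forward-pass optimization mentioned above (reusing earlier losses to also skip forward passes of low-priority examples) would sit on top of this selection logic.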