
[layer] Memory optimization for backwarding of embedding layer

Open skykongkong8 opened this issue 11 months ago • 1 comments

Unlike other layers, the embedding layer can use a specific sparse form of Tensor, often called IndexedSlices, which stores only the specific indices of the Tensor we are interested in. Thus, during the backwarding process, we do not have to allocate a gradient of the full weight-tensor shape in the VarGrad; we can optimize it by building the gradient only for the rows that were actually looked up. The current NNTrainer code has no such consideration: it uses a same-shaped but zero-filled Tensor for the untouched indices of the weight (a redundantly sized Tensor allocation). As far as I am concerned, we should work on this part in the near future for memory optimization.

skykongkong8 avatar Aug 29 '23 02:08 skykongkong8