transformer
Shallow copy
This is a shallow copy — does it make `_x` and `x` exactly the same object? https://github.com/hyunwoongko/transformer/blob/0e5ce57589d7307cf76b53241cc523841ff67655/models/blocks/encoder_layer.py#L27
Yes, `_x` and `x` refer to the same object, but it does not matter, since the code never modifies `x` or `_x` in-place. On the next line, `x` is rebound to the output of the attention, so `_x` still holds the original input for the residual connection.
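The distinction between aliasing and rebinding can be sketched with a plain Python list standing in for the tensor (a minimal illustration, not the repo's actual code):

```python
# `_x = x` is not a copy at all: both names reference the same object.
x = [0, 0, 0]
_x = x
assert _x is x          # same object ("shallow copy" in the question's sense)

# The forward pass does not mutate x in-place; it *rebinds* the name,
# analogous to `x = self.attention(q=x, k=x, v=x, ...)`:
x = [v + 1 for v in x]  # a new object is created and assigned to x
assert _x is not x      # the names have diverged
assert _x == [0, 0, 0]  # the saved residual input is untouched
assert x == [1, 1, 1]
```

An in-place mutation such as `x += other` on a tensor would be a different story, because it would silently change `_x` as well.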