pytorch-llama

No need to use forward method?

Open nkkbr opened this issue 1 year ago • 0 comments

https://github.com/hkproj/pytorch-llama/blob/067f8a37fe36ac8b52dca9cc6f2a2e8d6aa372d6/model.py#L230-L235

Is there any need to call the forward method explicitly? I mean, we could call the nn.Module instance directly:

```python
h = x + self.attention(self.attention_norm(x), start_pos, freqs_complex)
out = h + self.feed_forward(self.ffn_norm(h))
```
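For illustration, here is a minimal sketch (not from the repo, using a standalone `nn.LayerNorm` as a stand-in for `attention_norm` / `ffn_norm`) showing that calling the module instance and calling `.forward()` return the same result, but the direct call goes through `nn.Module.__call__`, which also runs any registered hooks:

```python
import torch
import torch.nn as nn

norm = nn.LayerNorm(16)          # stand-in for attention_norm / ffn_norm
x = torch.randn(2, 16)

out_call = norm(x)               # preferred: dispatches to forward() via __call__ and runs hooks
out_forward = norm.forward(x)    # works, but bypasses hooks

assert torch.allclose(out_call, out_forward)
```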

nkkbr · Jun 18 '24