pytorch-llama
No need to use forward method?
https://github.com/hkproj/pytorch-llama/blob/067f8a37fe36ac8b52dca9cc6f2a2e8d6aa372d6/model.py#L230-L235
Is there a need to call the forward method explicitly here? I mean, we could call the nn.Module instance directly instead, like this:
h = x + self.attention(self.attention_norm(x), start_pos, freqs_complex)
out = h + self.feed_forward(self.ffn_norm(h))
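To make the difference concrete, here is a minimal standalone sketch (not the repo's code, just a plain `nn.Linear` as a stand-in): both call styles compute the same result, but only the instance call `layer(x)` goes through `nn.Module.__call__`, which is where registered hooks run. Calling `.forward(x)` directly bypasses them.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in module for illustration only.
layer = nn.Linear(8, 8)

def log_hook(module, inputs, output):
    # Fires only when the module is invoked through __call__.
    print("forward hook fired for", module.__class__.__name__)

layer.register_forward_hook(log_hook)

x = torch.randn(2, 8)

y1 = layer(x)          # routes through __call__ -> hooks fire, then forward runs
y2 = layer.forward(x)  # calls forward directly -> the hook above is skipped

assert torch.equal(y1, y2)  # same numerical result either way
```

So the usual convention is to write `self.attention(...)` rather than `self.attention.forward(...)`, as the snippet above does; the output is identical, but the instance call keeps hooks and other `nn.Module` machinery working.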