minllama-assignment: 2 issues

In `structure.md`: `llama.Attention.forward` -> `llama.Attention.compute_query_key_value_scores`, and `llama.Llama.forward` -> `llama.LlamaLayer.forward`; rectified some functions in the `To be implemented` section. In `rope_test.py`: use `Tuple` from `typing` instead of `tuple`, as it is not subscriptable...
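
For context on the `rope_test.py` point, here is a minimal sketch of the typing issue, assuming Python < 3.9, where the builtin `tuple` cannot be subscripted in annotations (the `split_halves` helper is hypothetical, not from the assignment code):

```python
from typing import Tuple

import torch

# On Python < 3.9, an annotation like
#   def split_halves(x: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
# raises "TypeError: 'type' object is not subscriptable" when the module is imported.
# typing.Tuple works across versions.
def split_halves(x: torch.Tensor) -> Tuple[torch.Tensor, torch.Tensor]:
    """Split a tensor into two halves along its last dimension."""
    half = x.shape[-1] // 2
    return x[..., :half], x[..., half:]
```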

There is no causal mask in the attention layer. Is this because the model is designed for classification rather than generation?
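
For reference, a minimal sketch of what a causal mask in scaled dot-product attention usually looks like; this is a standalone illustration, not the assignment's implementation:

```python
import math

import torch
import torch.nn.functional as F

def causal_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """Scaled dot-product attention with a causal (lower-triangular) mask.

    q, k, v: (batch, heads, seq_len, head_dim)
    """
    seq_len = q.shape[-2]
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.shape[-1])
    # Mask positions j > i so token i cannot attend to future tokens.
    mask = torch.triu(
        torch.ones(seq_len, seq_len, dtype=torch.bool, device=q.device), diagonal=1
    )
    scores = scores.masked_fill(mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v
```

For a classification-style use of the model, dropping this mask lets every token attend to the full sequence; for autoregressive generation, the mask is what prevents attending to future positions.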