nhatkhtn
### System Info In the current version (4.52.4) of the `LlamaAttention` class, the return type hint on the `forward` function https://github.com/huggingface/transformers/blob/aa798b7ac9ff5018b3578eb927dc438671ab6a3e/src/transformers/models/llama/modeling_llama.py#L231 and what the function actually returns https://github.com/huggingface/transformers/blob/aa798b7ac9ff5018b3578eb927dc438671ab6a3e/src/transformers/models/llama/modeling_llama.py#L264 do not match. Looking at the...
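A minimal way to demonstrate this class of bug, without depending on transformers itself, is a toy function whose return annotation promises a 2-tuple while the body returns a 3-tuple; `inspect` can then surface the disagreement. `forward_stub` below is purely hypothetical and only mirrors the reported shape of the mismatch:

```python
import inspect
from typing import Optional, Tuple, get_args

# Hypothetical stub mirroring the reported mismatch: the annotation
# declares a 2-tuple, but the body returns three values.
def forward_stub() -> Tuple[str, Optional[str]]:
    # Returns a 3-tuple, contradicting the return annotation above.
    return ("attn_output", None, "past_key_value")

annotation = inspect.signature(forward_stub).return_annotation
declared_arity = len(get_args(annotation))   # 2, per the type hint
actual_arity = len(forward_stub())           # 3, per the real return
print(declared_arity, actual_arity)
```

A static checker such as mypy would flag the stub directly; the runtime check above is just the quickest way to show the two arities disagree.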
**Describe the bug** Using LLaVA with TransformerLens, at least in the manner shown in the [official demo](https://github.com/TransformerLensOrg/TransformerLens/blob/main/demos/LLaVA.ipynb), leads to errors and mismatched outputs. **Code example** Run the official demo notebook. **System Info**...