
Feature Request: Enable output_attentions=True for LLaDA Models

Open · mli746 opened this issue 2 months ago · 1 comment

Hi LLaDA Team,

First off, thanks for developing and sharing the LLaDA models!

We were wondering if it would be possible to support the standard Hugging Face `output_attentions=True` flag? We noticed it currently raises a `ValueError`.

Access to attention scores is very useful for interpretability research and debugging, and enabling this standard feature would benefit the broader community.

From the code, it looks like this might involve more than simply removing the `ValueError` check. We understand that extracting the scores may require specific handling, especially given the use of optimized kernels like FlashAttention, but we hope it's feasible to expose them.
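For context, the "specific handling" likely amounts to falling back to an eager attention path whenever attentions are requested, since fused kernels such as FlashAttention never materialize the full (seq_len × seq_len) weight matrix. Below is a minimal sketch of such an eager path; the function name and NumPy usage are purely illustrative, not taken from the LLaDA codebase:

```python
import numpy as np

def eager_attention(q, k, v, output_attentions=False):
    """Scaled dot-product attention that can return the full weight matrix.

    Unlike fused kernels (e.g. FlashAttention), this eager path materializes
    the (seq_len x seq_len) attention matrix, so the scores can be exposed
    when output_attentions=True.
    """
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)
    # numerically stable softmax over the key dimension
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    out = weights @ v
    return (out, weights) if output_attentions else (out, None)

# toy example: batch of 1, 4 tokens, head dim 8
rng = np.random.default_rng(0)
q = rng.standard_normal((1, 4, 8))
k = rng.standard_normal((1, 4, 8))
v = rng.standard_normal((1, 4, 8))
out, attn = eager_attention(q, k, v, output_attentions=True)
print(out.shape, attn.shape)  # (1, 4, 8) (1, 4, 4)
```

Routing through a path like this only when the flag is set would preserve the fast kernel for the common case while still letting users inspect the scores.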

Thank you for your great work on LLaDA and for considering this feature request!

mli746 · Oct 24 '25 04:10

See implementation here

Kamichanw · Oct 28 '25 03:10