
Question about blocking an attention head

cnlnpjhsy opened this issue 1 year ago • 1 comment

https://github.com/nightdessert/Retrieval_Head/blob/3ac171a6f71ce7ef1cda57d4215c390fb6ab51f2/faiss_attn/source/modeling_llama.py#L685-L689

Hi there, I'm working on the interpretability of attention heads and your work is really inspiring. I'm not an expert at modifying attention heads, so I'm a little confused about this part of the code.

If I want to block an attention head, I would expect one to set that head's attention weights to -inf, just as the attention mask does, but here the attention weights are set to zero. Since the weights may contain negative values, can this zero-weight fully block the attention of that head? Hopefully I haven't missed something. Thank you!

cnlnpjhsy avatar Aug 05 '24 08:08 cnlnpjhsy

I experimented with -inf, and this is the correct way to mask out an attention head. I see the retrieval score associated with the head is 0, so I guess -inf works!
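One point worth noting: the code linked above zeroes the weights *after* the softmax, where they are already non-negative (each row sums to 1), so setting them to zero makes the head's output exactly zero. Negative values only appear in the *pre-softmax* scores, and naively setting every score of a head to -inf there makes the softmax divide 0 by 0 and produce NaNs. A minimal numpy sketch of both cases (toy shapes, not the repo's actual code):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Toy single-head attention: 4 query/key positions, value dim 3.
rng = np.random.default_rng(0)
scores = rng.normal(size=(4, 4))   # pre-softmax scores, may be negative
values = rng.normal(size=(4, 3))

# Post-softmax weights are non-negative and each row sums to 1.
weights = softmax(scores)

# Blocking the head by zeroing its post-softmax weights:
blocked = np.zeros_like(weights)
out_blocked = blocked @ values
print(np.allclose(out_blocked, 0.0))   # True: the head contributes nothing

# Setting every pre-softmax score of the head to -inf instead
# leaves softmax with an all-zero row to normalize, yielding NaNs.
with np.errstate(invalid="ignore"):
    nan_weights = softmax(np.full_like(scores, -np.inf))
print(np.isnan(nan_weights).all())     # True
```

So zeroing after the softmax is a safe way to ablate a whole head; -inf is the right tool when masking *individual positions* before the softmax, where other positions still receive finite scores.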

shaswatpatel123 avatar Apr 28 '25 18:04 shaswatpatel123