
code problem

Open wml666666 opened this issue 1 year ago • 5 comments

Excuse me, while reading your ffa.py file I noticed the line `attn.div_(0.5)` in the forward function of the PrototypesAssignment class, which is not mentioned in your paper. Could you please explain it? Thank you.

wml666666 avatar Mar 16 '24 07:03 wml666666

Hello, this operation scales up the attention values (dividing by 0.5 doubles them). Combined with the subsequent softmax function, it leads to a sharper focus on the key elements that the attention mechanism is trying to highlight. However, in our experiments this operation didn't make a significant difference.

wangchen1801 avatar Mar 16 '24 13:03 wangchen1801
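The sharpening effect described above can be sketched in plain Python (this is an illustration of temperature scaling, not the authors' ffa.py code; `attn.div_(0.5)` in PyTorch is equivalent to multiplying the logits by 2 before softmax):

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of logits
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

logits = [1.0, 2.0, 3.0]
plain = softmax(logits)
sharp = softmax([x / 0.5 for x in logits])  # same effect as attn.div_(0.5)

# Dividing by a temperature < 1 concentrates probability mass on the
# largest logit, i.e. the softmax output becomes "sharper".
assert max(sharp) > max(plain)
```

With temperature 0.5 the top entry's probability rises (here from roughly 0.67 to 0.87), which is the "sharper focus" mentioned in the reply.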

Okay, I understand. Thank you for your reply!

wml666666 avatar Mar 17 '24 02:03 wml666666

Hello, sorry to bother you again. Formula (2) in your paper mentions Ecls (the class embedding), but when I studied your code I could not find this variable in ffa.py.

wml666666 avatar Mar 21 '24 11:03 wml666666

In our experiments, feature queries are initially class-agnostic, so it is reasonable to assign a class embedding to each category for discrimination. In the final experiments, however, each category has its own exclusive feature queries, which makes the class embedding redundant. We therefore removed it to simplify the code. Sorry for the confusion, and I hope this clarifies the issue.

wangchen1801 avatar Mar 22 '24 02:03 wangchen1801
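The distinction in the reply above can be sketched as follows. All names here are hypothetical, for illustration only, and do not come from ffa.py: the point is that a shared query set plus an additive per-class embedding, and a per-class query set, both end up giving every class its own distinct queries of the same shape.

```python
import random

random.seed(0)
num_classes, num_queries, dim = 3, 4, 8

def randvec():
    return [random.gauss(0, 1) for _ in range(dim)]

# Option A (as in the paper's Formula (2), per the reply): shared
# class-agnostic queries, discriminated by a per-class embedding E_cls
# added to each query.
shared_queries = [randvec() for _ in range(num_queries)]
class_embed = [randvec() for _ in range(num_classes)]

def queries_for_class(c):
    # each class sees the shared queries shifted by its own embedding
    return [[q + e for q, e in zip(row, class_embed[c])]
            for row in shared_queries]

# Option B (as in the released code, per the reply): each category owns
# exclusive queries outright, so the additive embedding is redundant.
per_class_queries = [[randvec() for _ in range(num_queries)]
                     for _ in range(num_classes)]

# Both designs yield a distinct (num_queries, dim) query set per class.
assert len(queries_for_class(0)) == len(per_class_queries[0]) == num_queries
assert queries_for_class(0) != queries_for_class(1)
```

Since Option B already makes the queries class-specific by construction, dropping Ecls from the code changes nothing functionally, which matches the author's explanation.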

Okay, thank you for your reply.

wml666666 avatar Mar 22 '24 02:03 wml666666