
how to implement target attention in this framework

Open LeiShenVictoria opened this issue 1 year ago • 3 comments

In Deep Interest Network (DIN), there is a target attention between a candidate feature (one column) and a sequence feature. How can this target attention be implemented in this repo? I guess it can be considered as an attention between a column in the deep part (the candidate) and the text part (the sequence)... Thanks a lot
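For context, the mechanism being asked about can be sketched outside any framework: given the embedding of the candidate item and the embeddings of the behaviour sequence, score each sequence item against the candidate, softmax the scores, and pool. This is a minimal numpy sketch (DIN itself scores with a small MLP over `[cand, item, cand * item]`; a dot product is used here as the simplest stand-in), not code from this repo:

```python
import numpy as np

def target_attention(candidate, sequence, mask=None):
    """Simplified DIN-style target attention.

    candidate: (batch, dim) embedding of the candidate item
    sequence:  (batch, seq_len, dim) embeddings of the behaviour sequence
    mask:      optional (batch, seq_len) boolean, True for real (non-pad) items
    Returns a (batch, dim) attention-pooled summary of the sequence.
    """
    # Relevance of each sequence item to the candidate (dot-product proxy
    # for DIN's activation-unit MLP).
    scores = np.einsum("bd,bsd->bs", candidate, sequence)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # suppress padding positions
    # Softmax over the sequence dimension.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    # Weighted sum of sequence embeddings -> one vector per example.
    return np.einsum("bs,bsd->bd", weights, sequence)

rng = np.random.default_rng(0)
cand = rng.normal(size=(2, 8))       # two candidates, dim 8
seq = rng.normal(size=(2, 5, 8))     # behaviour sequences of length 5
pooled = target_attention(cand, seq)
print(pooled.shape)  # (2, 8)
```

The pooled vector can then be concatenated with the other deep-part features, which is how DIN feeds the interest summary into the final MLP.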

LeiShenVictoria avatar Apr 25 '24 15:04 LeiShenVictoria

Hey @LeiShenVictoria

I have not read the Deep Interest Network paper yet, but I will; maybe I can incorporate some of its ideas into the library.

As of right now, the only "kind-of" similar thing you have here are the attention weights of the models.

All model components that are based on attention mechanisms have an attribute called attention_weights: see here
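To illustrate the pattern (this toy class is purely illustrative and is NOT one of the library's actual components): an attention-based module computes softmax weights internally and caches them so they can be inspected after a forward pass.

```python
import numpy as np

class ToyAttention:
    """Toy self-attention that caches its last softmax weights in an
    attention_weights attribute, mimicking the inspection pattern
    described above. Illustrative only, not pytorch-widedeep code."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w_q = rng.normal(size=(dim, dim))  # query projection
        self.w_k = rng.normal(size=(dim, dim))  # key projection
        self.attention_weights = None

    def __call__(self, x):  # x: (batch, seq_len, dim)
        q, k = x @ self.w_q, x @ self.w_k
        scores = q @ k.transpose(0, 2, 1) / np.sqrt(x.shape[-1])
        e = np.exp(scores - scores.max(axis=-1, keepdims=True))
        # Cache the normalized weights for later inspection.
        self.attention_weights = e / e.sum(axis=-1, keepdims=True)
        return self.attention_weights @ x

attn = ToyAttention(dim=4)
out = attn(np.ones((1, 3, 4)))
print(attn.attention_weights.shape)  # (1, 3, 3): one weight per query/key pair
```

After a forward pass you can read `attention_weights` to see which positions each position attended to.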

I will have a look at the Deep Interest Network paper ASAP and see if I can come up with a quick answer that is more helpful :)

jrzaurin avatar Apr 26 '24 09:04 jrzaurin

Hi, thanks for your reply. One more question: how can the embedding-sharing operation between a candidate feature and a sequence feature be implemented?
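Conceptually, embedding sharing just means the candidate id and the ids in the behaviour sequence index the same embedding table, so an item gets the same vector whether it appears as the candidate or in the history. A minimal sketch (numpy lookup standing in for an embedding layer; not code from this repo):

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, dim = 100, 8
item_table = rng.normal(size=(n_items, dim))  # ONE table for all item ids

candidate_id = np.array([3, 42])           # (batch,)
sequence_ids = np.array([[3, 7, 9],        # (batch, seq_len)
                         [1, 42, 5]])

# Both lookups hit the same table, so embeddings are shared.
cand_emb = item_table[candidate_id]        # (batch, dim)
seq_emb = item_table[sequence_ids]         # (batch, seq_len, dim)

# Item 3 is both the first candidate and the first history item of
# example 0, so the two vectors are identical.
print(np.allclose(cand_emb[0], seq_emb[0, 0]))  # True
```

In a PyTorch model the same effect is achieved by creating a single `nn.Embedding` and calling it on both the candidate ids and the sequence ids.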

LeiShenVictoria avatar May 06 '24 09:05 LeiShenVictoria

Hey @LeiShenVictoria

I would have to read the paper :)

I am busy at work right now, but I'll see what I can do ASAP.

jrzaurin avatar May 09 '24 21:05 jrzaurin

Hi @LeiShenVictoria

There is now a branch called ffm where DIN is implemented.

In the examples folder there is a script called movielens_din.py with an example 🙂.

I know this issue was opened a while ago, but it took me a while to find the time.

Let me know if you have any questions

jrzaurin avatar Sep 19 '24 19:09 jrzaurin

Covered in PR #234

jrzaurin avatar Sep 24 '24 20:09 jrzaurin