
Concatenating Additional Features

Open danielressi opened this issue 6 years ago • 6 comments

Hi,

Your paper is really interesting, and thank you for providing your code. I am currently trying to add additional features to the RNN input. I don't get any errors, but GPU utilisation drops massively and the computation becomes very slow. All I do is concatenate a batch_size x n_features matrix (a T.matrix) to the item embedding (SE_item) and feed the concatenated input accordingly.

My questions are: Have you tried something similar? Am I missing something in the concatenation? Do I need to adapt "Sin" as well?

Code snippet of the concatenation:

```python
SE_item = self.E_item[X]  # sampled item embedding
input_vec = T.concatenate([SE_item, X_additional], axis=1)  # X_additional -> T.matrix
vec = T.dot(input_vec, self.Ws_in[0]) + self.Bs_h[0]
Sin = SE_item
```
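For reference, here is a minimal standalone sketch of that concatenation (my reading of the snippet above, not the repository's code; all names and sizes below are made up). The main requirement is that the input weight matrix has embedding_dim + n_features rows once the extra features are concatenated:

```python
import numpy as np
import theano
import theano.tensor as T

n_items, embedding_dim, n_features, n_hidden = 1000, 50, 10, 100
rng = np.random.RandomState(42)
floatX = theano.config.floatX

# item embedding matrix and the input weights of the first session layer
E_item = theano.shared(rng.normal(0, 0.01, (n_items, embedding_dim)).astype(floatX))
W_in = theano.shared(rng.normal(0, 0.01, (embedding_dim + n_features, n_hidden)).astype(floatX))
b_in = theano.shared(np.zeros(n_hidden, dtype=floatX))

X = T.ivector('X')                       # item ids in the current batch
X_additional = T.matrix('X_additional')  # (batch_size, n_features) extra features

SE_item = E_item[X]                                         # (batch_size, embedding_dim)
input_vec = T.concatenate([SE_item, X_additional], axis=1)  # (batch_size, embedding_dim + n_features)
vec = T.dot(input_vec, W_in) + b_in                         # (batch_size, n_hidden)

f = theano.function([X, X_additional], vec)
out = f(np.array([1, 2, 3], dtype='int32'),
        rng.rand(3, n_features).astype(floatX))
```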

Thank you so much for your help

danielressi avatar Feb 01 '19 10:02 danielressi

Hi @danielressi ,

We tried something similar in the past in our RecSys 2016 paper on Parallel Recurrent Neural Networks. Simple input concatenation may not be the best option, both in terms of speed and recommendation performance. A better option is to have "feature-specific" RNNs and use late feature fusion. The paper also shows that alternating training can be beneficial.
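To illustrate the idea, here is a hedged sketch of late fusion (not the code from the paper or from this repository; all names and sizes are assumptions): each feature-specific RNN produces its own hidden state, and the per-subnetwork item scores are fused at the output layer instead of concatenating features at the input.

```python
import numpy as np
import theano
import theano.tensor as T

floatX = theano.config.floatX
n_hidden_item, n_hidden_feat, n_items = 100, 50, 10000
rng = np.random.RandomState(42)

# output weights of the item-ID subnetwork and of a hypothetical feature subnetwork
Wy_item = theano.shared(rng.normal(0, 0.01, (n_hidden_item, n_items)).astype(floatX))
Wy_feat = theano.shared(rng.normal(0, 0.01, (n_hidden_feat, n_items)).astype(floatX))
by = theano.shared(np.zeros(n_items, dtype=floatX))

# hidden states produced by the two RNNs (computed elsewhere)
h_item = T.matrix('h_item')  # (batch_size, n_hidden_item)
h_feat = T.matrix('h_feat')  # (batch_size, n_hidden_feat)

# late fusion: sum the per-subnetwork scores before the output non-linearity
scores = T.dot(h_item, Wy_item) + T.dot(h_feat, Wy_feat) + by
y_hat = T.nnet.softmax(scores)

predict = theano.function([h_item, h_feat], y_hat)
```

The subnetworks can then be trained jointly, or with the alternating scheme mentioned above where one subnetwork is kept fixed while the other is updated.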

Hope it helps

mquad avatar Feb 06 '19 10:02 mquad

@mquad thank you so much for your reply. I will have a look at the paper and try out late fusion.

danielressi avatar Feb 06 '19 10:02 danielressi

@mquad Is there a repository with an implementation of "Parallel Recurrent Neural Networks"?

mmaher22 avatar Oct 16 '19 13:10 mmaher22

I'm sorry but there's no public implementation of that paper AFAIK

mquad avatar Oct 24 '19 07:10 mquad

Another relevant question: what are the "item_embedding" and "init_item_embeddings" options used for? Can we use them to feed additional item features into the model?

Thanks!

item_embedding: int
        size of the item embedding vector (default: None)
init_item_embeddings: 2D array or dict
        array with the initial values of the embeddings vector of every item,
        or dict that maps each item id to its embedding vector (default: None)
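For illustration, a hypothetical usage sketch of the two formats described in that docstring (only the two parameter names come from the docstring; everything else below is an assumption):

```python
import numpy as np

n_items, n_features = 1000, 64
rng = np.random.RandomState(0)

# Option 1: 2D array, where row i holds the initial embedding / feature vector of item i
item_vectors = rng.normal(0, 0.1, (n_items, n_features)).astype('float32')

# Option 2: dict mapping each item id to its vector
item_vector_dict = {item_id: item_vectors[item_id] for item_id in range(n_items)}

# Hypothetical constructor call -- check hgru4rec.py for the exact signature:
# model = HGRU4Rec(..., item_embedding=n_features, init_item_embeddings=item_vector_dict)
```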

qingpeng avatar Jan 23 '20 23:01 qingpeng

Now I have figured out that I can use the "init_item_embeddings" option to load the item feature vectors. But is there any way to use user features/vectors in the package? Thanks!

qingpeng avatar Jan 24 '20 07:01 qingpeng