
Is KGCN for Recommender Systems Inductive in Nature?

sachinsharma9780 opened this issue 2 years ago • 12 comments

Hi,

I am going through the paper, and one thing I find missing is information about the inductiveness of the proposed algorithm.

So my question is: is the proposed architecture inductive in nature, i.e., can it generalise to new users without retraining?

Thanks,
Sachin

sachinsharma9780 avatar Jan 24 '22 15:01 sachinsharma9780

Hi Sachin,

Thanks for your interest in our work! Our method is item-inductive but not user-inductive, because we have an item KG that can help us calculate the representation of a new item, but we do not have such a KG on the user side.

hwwang55 avatar Jan 25 '22 17:01 hwwang55
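(Editor's note on the item-inductive claim: KGCN builds an item's representation by aggregating its KG neighbors with user-relation attention, so an item linked into the KG can be represented from its neighborhood. Below is a minimal PyTorch-style sketch of one aggregation hop; all names are illustrative, not the repository's actual API.)

```python
import torch
import torch.nn.functional as F

def kgcn_item_embedding(user_emb, item_id, kg_neighbors, kg_relations,
                        entity_emb, relation_emb, agg_weight):
    """One KGCN-style aggregation hop for an item (hypothetical helper).

    kg_neighbors[item_id] / kg_relations[item_id] hold a fixed-size
    sample of neighbor entity ids and the relation ids linking them.
    """
    nbr_vecs = entity_emb[kg_neighbors[item_id]]     # (n_neighbors, d)
    rel_vecs = relation_emb[kg_relations[item_id]]   # (n_neighbors, d)

    # User-specific attention over relations: softmax of <user, relation>.
    scores = F.softmax(rel_vecs @ user_emb, dim=0)           # (n_neighbors,)
    nbr_agg = (scores.unsqueeze(1) * nbr_vecs).sum(dim=0)    # (d,)

    # "Sum"-style aggregator: combine self and neighborhood representations.
    return torch.tanh(agg_weight @ (entity_emb[item_id] + nbr_agg))
```

Note that nothing here is user-side inductive: `user_emb` still has to come from somewhere, which is the crux of the discussion that follows.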

Thank you for the response @hwwang55 .

So suppose I add a new user (u1) to the interaction matrix (Y), say with some engagements, and we now want to find out whether u1 will engage with a movie, say "Titanic", that is already in the KG. In this case, can't we generate user-specific (u1) embeddings for the movie (Titanic)?

I am thinking of this from the perspective of building a movie recommendation application on top of your proposed algorithm, where we somehow recommend movies to new users without retraining KGCN.

sachinsharma9780 avatar Jan 25 '22 18:01 sachinsharma9780

It depends on how you design the user embeddings. If user embeddings are randomly initialized embedding vectors, you cannot deal with the cold-start problem. If user embeddings are based on user features, e.g., output by an MLP that takes a user's initial features as input, then you can run inference without retraining the model. Thanks!

hwwang55 avatar Jan 25 '22 22:01 hwwang55
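(A minimal sketch of the feature-based alternative described above, assuming PyTorch and made-up dimensions; the published KGCN code instead looks user embeddings up from a randomly initialized table.)

```python
import torch
import torch.nn as nn

class UserEncoder(nn.Module):
    """Hypothetical MLP that maps raw user features to a user embedding.

    Because the embedding is a function of features rather than a
    per-user learned vector, an unseen user can be embedded at
    inference time without retraining.
    """
    def __init__(self, n_features: int, dim: int):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(n_features, dim),
            nn.ReLU(),
            nn.Linear(dim, dim),
        )

    def forward(self, user_features: torch.Tensor) -> torch.Tensor:
        return self.mlp(user_features)

# A brand-new user only needs a feature vector (e.g., demographics):
encoder = UserEncoder(n_features=16, dim=32)
u1_emb = encoder(torch.randn(16))   # embedding for the new user u1
```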

I am going through your other paper, "Knowledge-aware Graph Neural Networks with Label Smoothness Regularization for Recommender Systems", which claims that label smoothness adds an inductive bias to the algorithm.

Can this algorithm generalise to new users without retraining?

sachinsharma9780 avatar Jan 26 '22 16:01 sachinsharma9780

You still need an MLP to calculate user embeddings in KGNN-LS.

hwwang55 avatar Jan 26 '22 19:01 hwwang55

Thanks for clarifying. So in the paper, user embeddings are randomly initialized embedding vectors, aren't they?

sachinsharma9780 avatar Jan 26 '22 22:01 sachinsharma9780

Correct. You can of course calculate user embeddings using their initial features if available.

hwwang55 avatar Jan 26 '22 23:01 hwwang55

Just a question out of curiosity:

So if user features are available (e.g., demographics, sex, etc.), we can create user embeddings via an MLP. Afterwards, how can we use these embeddings to generate recommendations for a new user?

sachinsharma9780 avatar Jan 26 '22 23:01 sachinsharma9780

Another main difficulty is finding a standard dataset that provides user features such as demographics. However, I don't think any standard dataset provides such user information.

sachinsharma9780 avatar Jan 26 '22 23:01 sachinsharma9780

Once you have the user embedding, you can use it to calculate the user-specific adjacency matrix, then run a GCN on this adjacency matrix. Item embeddings are contained in the output of the GCN. Finally, you can predict user engagement labels using the user and item embeddings.

hwwang55 avatar Jan 26 '22 23:01 hwwang55
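(A sketch of this pipeline in the spirit of KGNN-LS, under simplifying assumptions: a dense adjacency, sigmoid-squashed edge scores, row normalization instead of the paper's symmetric normalization, and a single GCN layer. All names are illustrative.)

```python
import torch
import torch.nn.functional as F

def user_specific_gcn(user_emb, edge_index, edge_rel, relation_emb,
                      entity_emb, weight, n_entities):
    """Build a user-specific adjacency from the KG, then run one GCN layer.

    edge_index: (2, n_edges) head/tail entity ids of KG triples.
    edge_rel:   (n_edges,) relation id of each triple.
    Each user "sees" a differently weighted version of the same KG,
    because the edge weights depend on the user embedding.
    """
    heads, tails = edge_index
    # Entry A_u[i, j] = score of <user, relation(i, j)>; sigmoid keeps
    # the weights positive (a simplification for this sketch).
    edge_scores = torch.sigmoid(relation_emb[edge_rel] @ user_emb)
    A_u = torch.zeros(n_entities, n_entities)
    A_u[heads, tails] = edge_scores
    A_u = A_u + torch.eye(n_entities)          # add self-loops

    # Row-normalize and apply one GCN layer: H = ReLU(D^-1 A_u E W).
    D_inv = torch.diag(1.0 / A_u.sum(dim=1))
    return F.relu(D_inv @ A_u @ entity_emb @ weight)   # (n_entities, d)

# Row v of the output is the user-specific embedding of entity/item v;
# scoring then reduces to a dot product with the user embedding:
# item_embs = user_specific_gcn(...)
# y_hat = torch.sigmoid(item_embs[item_id] @ user_emb)
```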

> Once you have the user embedding, you can use it to calculate the user-specific adjacency matrix, then run a GCN on this adjacency matrix. Item embeddings are contained in the output of the GCN. Finally, you can predict user engagement labels using the user and item embeddings.

In this case, does our model need to be trained on user-specific side information?

sachinsharma9780 avatar Feb 08 '22 15:02 sachinsharma9780

@sachinsharma9780 were you able to create the KG for a new dataset? Could you list the steps, if you don't mind?

rituk avatar Apr 27 '22 00:04 rituk