neural_collaborative_filtering
The number of negative instances?
Hi, can anybody tell me why you select only 4 negative instances? I think that is too small a proportion. Does it have any influence on training? Thanks!!! (`'--num_neg', type=int, default=4`)
In the 'Neural Collaborative Filtering' paper (https://www.comp.nus.edu.sg/~xiangnan/papers/ncf.pdf) there is a plot over the number of negatives, and 4 negative instances showed good results. In my experience, selecting 10 or 20 negatives requires too much computation (I'm using a Tesla V100 GPU). On my dataset, 6 negatives gave the best performance, so I think you should tune it on your own dataset. :)
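For what it's worth, here is a minimal sketch of what `--num_neg` controls during training-instance generation. This is hypothetical code, not the repo's exact implementation; `get_train_instances` and its arguments are made-up names for illustration:

```python
import numpy as np

def get_train_instances(positives, num_items, num_neg=4, seed=0):
    """For each observed (user, item) pair, add `num_neg` randomly sampled
    unobserved items with label 0. Hypothetical sketch of the sampling that
    the --num_neg flag controls."""
    rng = np.random.default_rng(seed)
    observed = set(positives)
    users, items, labels = [], [], []
    for u, i in positives:
        users.append(u); items.append(i); labels.append(1)  # the positive instance
        for _ in range(num_neg):                            # num_neg negatives per positive
            j = int(rng.integers(num_items))
            while (u, j) in observed:                       # resample if user interacted with j
                j = int(rng.integers(num_items))
            users.append(u); items.append(j); labels.append(0)
    return np.array(users), np.array(items), np.array(labels)

# Toy check: 3 positives with num_neg=4 yield 15 instances, 3 of them positive.
u, i, y = get_train_instances([(0, 1), (0, 3), (1, 2)], num_items=50)
print(len(y), int(y.sum()))  # 15 3
```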
Oh sorry, I made a mistake: the code means one positive instance per 4 negative instances. I had thought it was all positives versus 4 negatives in total. Thanks a lot!!!
@fjz15056311771 good luck :)
Hi kyung-wook! Sorry to disturb you again; I have a small question about how the negative samples are obtained. In matrix factorization, unobserved samples are not used to update the matrices; only confirmed user-item relationships are used. So I am confused about whether it is reasonable to pick negative samples at random, since "unobserved" only means we don't know whether a relation exists. Can you explain it, or point me to a paper on this? Thanks!!!
@fjz15056311771 Sorry for the late reply. Right, classic matrix factorization uses only confirmed relationships between users and items, but that is also one of its limitations. With implicit data we only have positive signals, because the data comes not from user ratings but only from interactions. This means we can't tell whether a user dislikes an item or simply doesn't know about it. So in this model a negative sample is not literally NEGATIVE; it has a broader meaning (dislike, unknown, or something else). This is just my personal view; you can find the details in Section 2.1, 'Learning from Implicit Data', of the 'Neural Collaborative Filtering' paper (https://www.comp.nus.edu.sg/~xiangnan/papers/ncf.pdf).
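One more way to see it: the pointwise objective in the paper (eq. 7) simply treats sampled negatives as 0-labels in a binary cross-entropy, so a "negative" only asserts "no observed interaction", never "dislike". A minimal sketch of that loss, using a mean where the paper uses a sum:

```python
import numpy as np

def pointwise_log_loss(labels, preds, eps=1e-8):
    """Binary cross-entropy over observed positives (label 1) and sampled
    negatives (label 0); `preds` are interaction probabilities in (0, 1)."""
    preds = np.clip(preds, eps, 1 - eps)
    return -np.mean(labels * np.log(preds) + (1 - labels) * np.log(1 - preds))

# A sampled "negative" contributes -log(1 - y_hat): it only pushes the
# predicted interaction probability down, it never asserts dislike.
print(pointwise_log_loss(np.array([1, 0, 0]), np.array([0.9, 0.2, 0.4])))
```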
@kyung-wook Thanks!! I think the BPR loss resolves my confusion.
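For anyone landing here later: BPR (Rendle et al., 2009, 'BPR: Bayesian Personalized Ranking from Implicit Feedback') avoids hard 0-labels altogether by only assuming a user prefers an observed item over a randomly sampled unobserved one. A minimal sketch of the loss, assuming raw (pre-sigmoid) scores:

```python
import numpy as np

def bpr_loss(pos_scores, neg_scores):
    """Pairwise BPR loss: -log sigmoid(s_pos - s_neg), averaged over pairs.
    np.logaddexp(0, -x) computes log(1 + exp(-x)) stably."""
    return np.mean(np.logaddexp(0.0, -(pos_scores - neg_scores)))

# The larger the margin by which the observed item outranks the sampled
# unobserved one, the smaller the loss.
print(bpr_loss(np.array([2.0, 0.5]), np.array([0.0, 1.0])))
```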